WhatsApp has a severe fake news problem, so the company will be paying researchers to study it in the hope of finding solutions.
It’s not only Facebook and Twitter that have a problem with fake news and misinformation. WhatsApp is proving to have a particularly severe problem in India, where it is costing people their lives. The situation is so out of control that a “rumour-busting announcer” appointed by the Indian government was killed last week, and the government reports that “instances of lynching of innocent people have been noticed recently because of a large number of irresponsible and explosive messages filled with rumours and provocation are being circulated on WhatsApp.” The Indian government has asked WhatsApp to take immediate action to put a stop to this.
So, under a new initiative, WhatsApp will pay researchers up to $50,000 to study aspects of misinformation, election-related issues, and viral content. Researchers can submit proposals to apply for the grants. In a statement, the company writes:
“We will seriously consider proposals from any social science and technological perspective that propose projects that enrich our understanding of the problem of misinformation on WhatsApp.”
The researchers won’t receive access to WhatsApp data, but will instead receive “guidance.” WhatsApp explains that “high priority areas” are the following:
Information processing of problematic content
Proposals that explore the social, cognitive and information processing variables involved in the consumption of content received on WhatsApp, its relation to the content’s credibility, and the decision to share that content with others. This includes social cues and relationships, personal value systems, features of the content, content source, etc.
WhatsApp is interested in understanding what aspects of the experience might help individuals engage more critically with potentially problematic content.
Election-related information
Proposals that examine how political actors are leveraging WhatsApp to organize and potentially influence elections in their constituencies. WhatsApp is a powerful medium for political actors to connect and communicate with their constituents. However, it can also be misused to share inaccurate or inflammatory political content.
WhatsApp is interested in understanding this space both from the perspective of political actors and the voter base. This includes understanding the unique characteristics of WhatsApp for political activity and its place in the ecosystem of social media and messaging platforms, distribution channels for political content, targeting strategies, etc.
Network effects and virality
Proposals that explore the characteristics of networks and content. WhatsApp is designed to be a private, personal communication space and is not designed to facilitate trends or virality through algorithms or feedback. However, these behaviours do occur organically along social dimensions.
WhatsApp is interested in projects that inform its understanding of the spread of information through WhatsApp networks.
Digital literacy and misinformation
Proposals that explore the relation between digital literacy and vulnerability to misinformation on WhatsApp. WhatsApp is very popular in some emerging markets, and especially so among those new to the internet and populations with lower exposure to technology.
WhatsApp is interested in research that informs its efforts to bring technology safely and effectively into underserved geographical regions. This includes studies of individuals, families and communities, but also wider inquiries into factors that shape the context for the user experience online.
Detection of problematic behaviour within encrypted systems
Proposals that examine technical solutions to detecting problematic behaviour within the restrictions of, and in keeping with the principles of, encryption. WhatsApp’s end-to-end encrypted system facilitates privacy and security for all WhatsApp users, including people who might be using the platform for illegal activities. How might illegal activity be detected without monitoring the content of all users?
WhatsApp is particularly interested in understanding and deterring activities that facilitate the distribution of verifiably false information.
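To make that constraint concrete, one purely hypothetical approach researchers might explore is flagging accounts whose forwarding behaviour looks abnormally viral using only metadata, without ever decrypting message content. The sketch below is an illustration under that assumption, not anything WhatsApp has described: the ForwardEvent record, the thresholds, and the flagging logic are all invented for demonstration.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical metadata record: no message text, only routing information.
@dataclass
class ForwardEvent:
    sender_id: str        # pseudonymous account identifier
    message_hash: str     # opaque identifier for the still-encrypted payload
    recipient_count: int  # how many chats the message was forwarded to at once
    timestamp: float      # Unix time of the forward

def flag_suspicious_forwarders(events, window_seconds=3600,
                               max_forwards=50, max_avg_fanout=20):
    """Flag accounts whose forwarding volume or average fan-out within a
    sliding time window exceeds illustrative thresholds.
    Message content is never inspected, only forwarding metadata."""
    by_sender = defaultdict(list)
    for e in events:
        by_sender[e.sender_id].append(e)

    flagged = set()
    for sender, evs in by_sender.items():
        evs.sort(key=lambda e: e.timestamp)
        start = 0
        for end in range(len(evs)):
            # Shrink the window so it spans at most window_seconds.
            while evs[end].timestamp - evs[start].timestamp > window_seconds:
                start += 1
            window = evs[start:end + 1]
            total_fanout = sum(e.recipient_count for e in window)
            if (len(window) > max_forwards
                    or total_fanout > max_avg_fanout * len(window)):
                flagged.add(sender)
                break
    return flagged
```

The point of the sketch is only that the signal is behavioural (volume and fan-out of forwards) rather than content-based, which is the kind of trade-off the programme invites researchers to evaluate against the guarantees of end-to-end encryption.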
Applications are now open and will close on August 12, 2018, at 11:59 pm PST. Award recipients will be notified by email by September 14, 2018. Interested parties can find further information about the program here.