WhatsApp is proving to have a severe problem with fake news, so the company will pay researchers to study it in the hope of finding solutions.
It’s not only Facebook and Twitter that have a problem with fake news and misinformation. WhatsApp’s problem is proving particularly severe in India, where it is costing people their lives. The situation is so out of control that a “rumour-busting” announcer appointed by the Indian government was killed last week, and, in the government’s words, “instances of lynching of innocent people have been noticed recently because of a large number of irresponsible and explosive messages filled with rumours and provocation are being circulated on WhatsApp.” The Indian government has asked WhatsApp to take immediate action to stop this.
So, under a new initiative, WhatsApp will pay researchers up to $50,000 to study aspects of misinformation, election-related issues, and viral content. Researchers can submit proposals to obtain the grants. In a statement, the company writes,
“We will seriously consider proposals from any social science and technological perspective that propose projects that enrich our understanding of the problem of misinformation on WhatsApp.”
The researchers won’t receive access to WhatsApp data, but will instead receive “guidance.” WhatsApp explains that “high priority areas” are the following:
Information processing of problematic content
Proposals that explore the social, cognitive and information processing variables involved in the consumption of content received on WhatsApp, its relation to the content’s credibility, and the decision to promote that content with others. This includes social cues and relationships, personal value systems, features of the content, content source, etc.
WhatsApp is interested in understanding what aspects of the experience might help individuals engage more critically with potentially problematic content.
Election-related information
Proposals that examine how political actors are leveraging WhatsApp to organize and potentially influence elections in their constituencies. WhatsApp is a powerful medium for political actors to connect and communicate with their constituents. However, it can also be misused to share inaccurate or inflammatory political content.
WhatsApp is interested in understanding this space both from the perspective of political actors and the voter base. This includes understanding the unique characteristics of WhatsApp for political activity and its place in the ecosystem of social media and messaging platforms, distribution channels for political content, targeting strategies, etc.
Network effects and virality
Proposals that explore the characteristics of networks and content. WhatsApp is designed to be a private, personal communication space and is not designed to facilitate trends or virality through algorithms or feedback. However, these behaviours do organically occur along social dimensions.
WhatsApp is interested in projects that inform its understanding of the spread of information through WhatsApp networks.
Digital literacy and misinformation
Proposals that explore the relation between digital literacy and vulnerability to misinformation on WhatsApp. WhatsApp is very popular in some emerging markets, and especially so among those new to the internet and populations with lower exposure to technology.
WhatsApp is interested in research that informs its efforts to bring technology safely and effectively into underserved geographical regions. This includes studies of individuals, families and communities, but also wider inquiries into factors that shape the context for the user experience online.
Detection of problematic behaviour within encrypted systems
Proposals that examine technical solutions to detecting problematic behaviour within the restrictions of and in keeping with the principles of encryption. WhatsApp’s end-to-end encrypted system facilitates privacy and security for all WhatsApp users, including people who might be using the platform for illegal activities. How might illegal activity be detected without monitoring the content of all users?
WhatsApp is particularly interested in understanding and deterring activities that facilitate the distribution of verifiably false information.
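The question posed above, detecting misuse without monitoring message content, is usually approached through metadata rather than text. As a purely illustrative sketch (all names, fields, and thresholds here are assumptions for the example, not WhatsApp’s actual system), a service could flag messages that spread unusually fast based only on forwarding metadata:

```python
# Hypothetical sketch: flagging potentially viral content using only
# message *metadata* (forward counts, group fan-out), never the text
# itself. Thresholds and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MessageMetadata:
    message_id: str
    forward_count: int    # how many times this message has been forwarded
    distinct_groups: int  # number of distinct groups it has reached

def is_suspicious(meta: MessageMetadata,
                  forward_threshold: int = 25,
                  group_threshold: int = 10) -> bool:
    """Flag messages spreading unusually fast, based purely on metadata."""
    return (meta.forward_count >= forward_threshold
            or meta.distinct_groups >= group_threshold)

# A heavily forwarded message can then be rate-limited or labelled
# "forwarded many times" without anyone reading its content.
hot = MessageMetadata("m1", forward_count=120, distinct_groups=3)
cold = MessageMetadata("m2", forward_count=2, distinct_groups=1)
print(is_suspicious(hot))   # True
print(is_suspicious(cold))  # False
```

The point of the sketch is that end-to-end encryption is preserved: the heuristic sees only delivery statistics, which the server already handles to route messages, and never the plaintext.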
Applications are now open and will close August 12, 2018, 11:59 pm PST. Award recipients will be notified by email by September 14, 2018. Interested parties can find further information about the program here.