---
layout: default
title: "WhatsApp Needs to Do More to Curb Fake News and Rumours, Say Experts"
description: "A Hindustan Times report compiling expert assessments of WhatsApp's response to government concerns about misinformation following mob lynchings, with divergent views on the adequacy of platform measures and government responsibility."
categories: [Media mentions]
date: 2018-07-06
source: "Hindustan Times"
permalink: /media/whatsapp-fake-news-expert-assessments-hindustan-times/
created: 2025-12-30
---

**WhatsApp Needs to Do More to Curb Fake News and Rumours, Say Experts** is a *Hindustan Times* report published on 6 July 2018. The article presents assessments from policy and technology experts evaluating WhatsApp's proposed measures to address misinformation following government criticism, after mob lynchings in at least eight states were triggered by fake videos and child-lifting rumours circulated on the platform.

## Contents

1. [Article Details](#article-details)
2. [Full Text](#full-text)
3. [Context and Background](#context-and-background)
4. [External Link](#external-link)

## Article Details
📰 Published in: Hindustan Times

📅 Date: 6 July 2018

📄 Type: News Report

📰 Newspaper Link: Read Online
## Full Text



Instant messaging service WhatsApp has listed measures to prevent the spread of false information in India even as the government underlined the need for the Facebook-owned firm to do more while calling its response "prompt".

The measures were listed in a letter written in response to a government missive expressing "deep disapproval" about WhatsApp's inability to prevent the spread of "irresponsible and explosive material".

Fake videos and rumours of child-lifting circulated via WhatsApp have triggered lynchings in at least eight states.

Hindustan Times asked policy and tech experts to read the letter sent by WhatsApp to the government on Wednesday. They were asked — is Facebook-owned WhatsApp doing enough to curb the spread of misinformation on its platform?

Here is what they said:

**Ananth Padmanabhan, fellow, Centre for Policy Research**

Technology can never be perfectly regulated using legal instruments and coercion. It requires a combination of instruments including behavioural norms and technology features. To the extent technology features can be used to address the issue of misinformation, WhatsApp appears to be on the right track.

Reliance on deep learning tools is no different from how PayPal successfully addressed fraudulent transfers using pattern analysis. Digital literacy, addressed towards better behavioural norms, is again a long-term goal. A major gap in this letter (sent by WhatsApp to government) though is on the issue of suspending user accounts based on assessment of their behaviour. Any such policy needs careful thought as free speech concerns are implicated.

**Apar Gupta, lawyer**

No (they are not enough), misinformation and propaganda cannot be stopped at the platform level within WhatsApp without compromising privacy. It is important to consider this as it's an instant messaging platform and not a social media network with public posts. It seems as if the government is putting the onus of maintaining law and order on WhatsApp. We're noticing the failings of modern public broadcasting, which is inadequately funded and has witnessed political interference.

**Sunil Abraham, founder, Centre for Internet and Society**

The steps that WhatsApp has listed are not enough in terms of scale to address this problem in India. If it took partnerships with 24 media firms in Brazil, which has a population of slightly over 200 million, 97% of whom speak one language, then in India, given the number of languages and the size of our population, it will take at least 1,000 such partnerships to bring about any meaningful change here.

**Pratik Sinha, co-founder, Alt News**

Ideally, WhatsApp should have taken these steps a while back. Since this is an election year, we see a lot of platforms checking boxes and making changes to their privacy policies. WhatsApp is a global company, but it must understand each country in which it is present on a case-by-case basis and implement changes accordingly. India is low on digital literacy and has a history of mob justice.

The company needs to use tools that can detect the origin of the original video or photograph. They have not laid enough emphasis on decision-making. Content going viral and the subsequent surge in traffic is good for business. WhatsApp needs to do more to help people find out whether a message is true or false, thereby eliminating a large majority of misinformation. Having said that, the government is shirking its responsibility by throwing WhatsApp under the bus. Yes, encryption makes it difficult for law enforcement agencies to find the bad actors, but lynching is not a new issue. The government has not taken steps to run awareness programs to tackle the problem.

{% include back-to-top.html %}

## Context and Background

This article appeared during a period of acute crisis for WhatsApp in India. Between May and July 2018, at least 30 people had been killed in mob lynchings across eight states, with many incidents triggered by fabricated videos and rumours—particularly claims about child kidnappers—that spread rapidly through WhatsApp groups. The violence prompted unprecedented government pressure on the platform, culminating in a letter expressing "deep disapproval" and demanding action.

WhatsApp's response, delivered in a letter to the government on 4 July 2018, outlined measures including introducing labels for forwarded messages, limiting forward capabilities, and deploying machine learning to detect abuse. The government acknowledged the response as "prompt" but maintained that more action was necessary. This article compiled expert reactions to assess whether WhatsApp's proposed interventions addressed the scale and nature of the misinformation crisis.

The expert commentary revealed fundamental disagreements about where responsibility lay and what solutions were feasible. Ananth Padmanabhan's perspective emphasised multi-stakeholder approaches combining technology features, legal frameworks and digital literacy, whilst cautioning that account suspension policies could implicate free speech rights. This reflected debates about whether platforms should act as arbiters of content truthfulness and what due process protections users deserved.

Apar Gupta's assessment highlighted a structural tension: WhatsApp's architecture as an encrypted instant messaging platform fundamentally differed from public social networks like Facebook or Twitter. Content moderation techniques applicable to platforms with publicly visible posts could not be implemented without compromising the end-to-end encryption that protected user privacy. His observation about "failings of modern public broadcasting" redirected attention to state capacity—suggesting that underfunded and politically compromised public media had created information voids that WhatsApp misinformation filled.

Sunil Abraham's comment about scale illustrated the magnitude of India's linguistic and demographic complexity. WhatsApp's partnerships with 24 media organisations in Brazil—a country with one-sixth India's population and near-linguistic homogeneity—might require multiplication by a factor of 40 or more to achieve comparable coverage across India's 22 scheduled languages and numerous dialects. This mathematical reality suggested that fact-checking partnerships, whilst valuable, could not alone solve the problem at India's scale.

Pratik Sinha's critique from Alt News, India's prominent fact-checking organisation, combined platform accountability with government responsibility. His observation that 2018 was an election year pointed to platforms' tendency to implement cosmetic changes during periods of regulatory scrutiny. The suggestion that WhatsApp needed tools to trace original sources of viral content directly contradicted the platform's encryption model, illustrating the technical impossibility of certain government demands. Simultaneously, Sinha argued that the government was deflecting from its own failure to conduct public awareness campaigns, noting that mob violence predated WhatsApp and required state intervention beyond platform regulation.

The diverse expert perspectives captured in this article reflected broader global debates about platform responsibility for offline harms, the limits of content moderation in encrypted environments, and the respective roles of states, platforms and civil society in addressing misinformation. WhatsApp would subsequently implement several measures—including forward limits reduced to five chats in India, removal of quick forward buttons for media, and labels identifying frequently forwarded messages—but lynchings linked to misinformation would continue to occur, demonstrating the complexity of addressing information disorders through technological interventions alone.

## External Link

- Read on Hindustan Times