Social Media Forwards and India’s Misinformation Burden
From rural villages to migration hubs, misinformation adapts to local vulnerabilities and calls for responses grounded in culture and trust.
SDG 4: Quality Education | SDG 16: Peace, Justice and Strong Institutions
Institutions: Ministry of Education | Ministry of Electronics and Information Technology
For Ami Kumar, misinformation did not first appear in headlines or on television. It arrived as a WhatsApp forward in his neighbourhood in northern India, promising benefits from a new government scheme.
“People believed it because it came from family members and neighbours,” Ami recalled. “In rural India, a forward from someone close felt more real than an official notice from Delhi.”
Surveys show that in India, social media and messaging apps are now more common sources of news than print or television. Ami’s first encounter reflected that larger shift: trust flowed more through personal networks than through institutions.
Ami is co-founder of Contrails.ai, a start-up focused on building safer digital spaces, and co-founder of Social & Media Matters, an organisation known for awareness campaigns and research on online harms in India. He has served on Facebook’s Safety Advisory Board, collaborated with the Google News Initiative on fact-checking, and led digital literacy training in schools and colleges. This combination of grassroots engagement and technical development gives him a vantage point on misinformation that few others share.
Classrooms and WhatsApp University
When Ami began working with schools and colleges, the pattern looked different, but the dynamics were similar. Young people were immersed in WhatsApp groups and short video reels.
“We called it ‘WhatsApp University,’ but the reality was not funny,” he said. “During training sessions I ran with the Google News Initiative, students admitted they trusted family forwards more than what teachers explained in class.”
Teachers often felt outmatched. A short video could spread overnight, while their lessons remained trapped within the classroom. International surveys in 2023 also found that large sections of the public admitted to sharing content without fact-checking – a reminder of the widening gap between the speed of digital misinformation and the slower pace of classroom teaching.
Rumours of Migration
In Kerala, misinformation grew from a different source: aspiration. Families often saved for years to send one member abroad, usually to the Gulf, and that hope created fertile ground for scams.
Messages promised quick visas and overseas placements, packaged with familiar tricks – official logos, urgent timelines, and just enough detail to seem credible.
“The financial loss was serious, but many also spoke of the shame of being misled,” Ami recalled.
Studies of recruitment in India have noted how aspiring migrants, especially unskilled workers, are often exploited by informal agents charging high fees. Recent reports show how sub-agents in Indian towns and villages continue to drive fraudulent recruitment and visa scams. For Ami, those patterns were not academic abstractions but daily realities in migration hubs.
Limits of Fact-Checking
Across settings, Ami found the same gap: the people most exposed to falsehoods were the least likely to see fact-checks.
“Most fact-checking content is in English, heavy with statistics, and circulated on Twitter, WhatsApp and other social media,” he said. “The people who need it most will never encounter it.”
Research on Indian news use echoes this point: English-language content and elite platforms reach only a narrow segment of the audience, while rural populations rely heavily on WhatsApp and community groups. This helps explain why corrections often fail to travel as far as rumours, and why local language and familiar idioms matter so much.
Different Places, Different Tricks
The content of misinformation varied by context. In rural areas, it centred on pensions, subsidies or voting dates. In migration hubs, it was visas. In cities, it took the form of bank frauds or manipulated videos.
“Misinformation adapts to its environment,” Ami observed. “It flows into whatever cracks exist – poverty, aspiration or fear.”
Resilience of Leaders, Cost to Citizens
During elections, the spread intensified. Doctored clips and false promises circulated faster than official manifestos.
Ami saw the global dimensions of the problem first-hand while serving on Facebook’s Safety Advisory Board during the Cambridge Analytica scandal. There, he watched politicians treat misinformation as business as usual.
“Leaders develop thick skins,” he said. “For them, it’s part of life. But citizens are far more vulnerable. A false message about subsidy eligibility or a change in polling dates can mean exclusion from a basic right.”
Political communication research makes a similar point: elites can absorb or deflect attacks on their public image, but misinformation aimed at voters can carry immediate consequences – from missed benefits to disenfranchisement.
Lessons from the Ground
Reflecting on these experiences, Ami draws three lessons.
“First, misinformation adapts to local realities, whether it is pensions in villages or visas in migration hubs. Second, facts rarely travel as far as feelings; accessibility matters as much as accuracy. And third, trust in India flows through relationships. A forward from a neighbour carries more weight than a press release from Delhi.”
Policy literature on India’s state capacity echoes this: top-down interventions often falter unless they are translated into local idioms and delivered through trusted intermediaries.
From the Village to Global Platforms
The patterns Ami observed in Indian villages echoed what he later saw on large platforms. While global debates focused on data misuse and algorithms, his work in schools and communities showed a simpler truth: technology doesn’t create new problems; it makes old ones worse.
“The global and the local are connected,” he said. “Whether it is a fake visa message in Kerala or targeted ads in the UK, the techniques are similar – they exploit trust and emotion.”
Scholars of digital governance argue the same: mechanisms of virality – emotion, repetition, targeting – operate across scales, from small WhatsApp groups to global campaigns influencing elections.
Building Practical Responses
Today, Ami is working to turn those lessons into scalable tools by building safety frameworks that integrate technology with cultural understanding. Even so, his own orientation remains grounded in practice rather than industry positioning.
“My approach has always been shaped by the non-profit world,” he said. “Fighting misinformation is not about writing reports. It is about what can be done tomorrow morning – in schools, in communities, in conversations.”
Through Social & Media Matters and Contrails.ai, he has contributed to projects such as a study on caste-based hate speech in India, where Contrails served as the technical partner, and the development of “Itisaar”, a deepfake analysis tool demonstrated with academic researchers. Together, these projects illustrate an approach that pairs grassroots awareness with technical tool-building.
An Uneven Struggle and Next Steps
The effort, Ami admits, is rarely linear.
“Some days we make progress, others we fall behind,” he said. “But the work has to continue, because misinformation never sleeps – and neither can those of us who are trying to counter it.”
For policymakers, Ami stresses that the challenge is to act without overreach. In India, past legislative efforts have often slipped into censorship, risking freedom of expression. He sees more promise in a two-track approach: stronger regulation of high-risk areas such as financial scams and health misinformation, and long-term investment in digital literacy and education through schools and state agencies. In his view, durable solutions require institutions, platforms, and communities to work together.
Ami Kumar is co-founder of Contrails.ai. All the details are based on his account and have been approved for publication. This piece was prepared with assistance from Ms. Sapna Singh, a member of the editorial team at The Policy Edge.