In early October, whistleblower Frances Haugen filed a complaint against Facebook with the US Securities and Exchange Commission (SEC). The Facebook Papers, as Haugen’s files are now known, reveal the worrisome scope of misinformation on the platform in India.
With 434 million active users, India is Facebook’s largest market. The USA, with around 200 million people, makes up less than 10% of Facebook’s daily active users. And yet a hugely disproportionate 87% of Facebook’s 2020 counter-misinformation budget was allocated to America.
The whistleblower files reveal that Facebook has insufficient capacity to flag misinformation in Indian languages. This is especially damaging given the increasingly violent sectarian conflict gripping the country.
Of India’s twenty-two officially recognised languages, Facebook has trained its AI systems in only five. Even in Hindi and Bengali, the most widely spoken Indian languages, it still lacks sufficient data to adequately police violent content, with aggressive Islamophobia going unreported and unaddressed.
Haugen’s complaint stated that Facebook takes action against just 3-5% of hate speech content, and against a mere 0.6% of violent and inciting content.
In early 2019, Indian Facebook staff constructed a dummy Facebook profile to test misinformation flows in the state of Kerala. The result was an “integrity nightmare”: a torrent of anti-Muslim hate speech and violent content clogging the dummy user’s feed.
It’s just one of many studies and memos on the Indian Facebook experience. They all highlight that Facebook neither fully understands its impact on local culture and politics, nor makes sufficient effort to counter the harms that do occur.
Haugen’s letter to the SEC said Facebook had found 40% of sampled views of civic posters in West Bengal to be fake. In one case, an inauthentic post was viewed more than 30 million times. In the region, 35% of users are recommended groups that share false content to propagate political narratives.
Religiously charged politics is the arena of choice for Indian Facebook misinformation. A 2021 study showed the persistence of politically motivated Islamophobia. The report abounded with posts comparing Muslims to ‘pigs’ and ‘dogs’, and claiming the Quran calls for men to rape their female family members.
Much of this material circulated in groups promoting an Indian nationalist organisation with strong ties to India’s ruling party, the BJP.
This incredibly harmful political misinformation extends beyond the world’s largest democracy. Facebook, and especially its subsidiary WhatsApp, was used to fuel the ethnic cleansing in Myanmar and anti-Muslim violence in Sri Lanka, and to spread massive misinformation during the 2018 Brazilian presidential election.
And unlike most people in the Western world, those living in less-developed countries find it extremely difficult to simply ‘opt out’, because there Facebook is largely synonymous with the Internet.
Its role extends far beyond the ‘social’ aspect, into business networking and even medical communication. In countries where democracy exists in name only and local social media is controlled by the state, Facebook increasingly becomes a critical tool for political discourse and mobilisation.
As Nina Jankowicz says in her essay on Facebook’s centrality, “Despite Facebook and other social media platforms’ public remonstrations that they are changing their behavior, we could not be farther from that online democratic utopia.”
Cover photo by Alessandro Biascioli on iStock.
Follow Maddie’s journalism journey on Twitter.