Mozilla Urges WhatsApp to Combat Misinformation Ahead of Global Elections

In 2024, about half of the world’s population will vote in elections across 64 countries, including major democracies such as the United States and India. Major social media companies have promised to protect the integrity of these elections, at least when it comes to the discourse and factual claims on their platforms. Missing from that discussion, however, is the closed messaging app WhatsApp, which now rivals public social media platforms in scope and reach. That absence worries researchers at the nonprofit Mozilla.

“Nearly 90% of the safety measures promised by Meta in the run-up to these elections are focused on Facebook and Instagram,” Odanga Madung, a senior researcher at Mozilla who focuses on elections and platform integrity, told Engadget. “Why hasn’t Meta publicly committed to a roadmap of how exactly it will protect elections within [WhatsApp]?”

Over the past decade, WhatsApp, which Meta (then Facebook) bought for $19 billion in 2014, has become the default means of communication for much of the world outside the United States. In 2020, WhatsApp announced that it had more than two billion users worldwide – a scale that dwarfs every other social or messaging app except Facebook itself.

Despite this scale, Meta’s election-related safety efforts have largely focused on Facebook. Mozilla found that Facebook has made 95 election-related policy announcements since 2016, the year the social network came under scrutiny for enabling and amplifying extreme political sentiment, while WhatsApp has made only 14. For comparison, Google and YouTube made 35 and 27 announcements, respectively, while X and TikTok made 34 and 21. “From what we can tell from its public announcements, Meta appears to be overwhelmingly prioritizing Facebook in its election efforts,” Madung wrote in the report.

Mozilla is calling on Meta to make significant changes to how WhatsApp works on election days and in the months before and after a country’s elections. These include adding disinformation labels to viral content (“Heavily forwarded: Please review” instead of the current “Forwarded many times”), restricting the broadcast and Communities features that let users send messages to hundreds of people at once, and nudging people to “pause and reflect” before forwarding anything. More than 16,000 people have signed Mozilla’s pledge calling on WhatsApp to slow the spread of political disinformation, a company spokesperson told Engadget.

WhatsApp first made changes to its service after dozens of people in India, the company’s largest market, were killed in violence sparked by misinformation that went viral on the platform. Those changes included limiting the number of people and groups users could forward a piece of content to, and marking forwarded messages with “forwarded” labels. The label was intended to curb misinformation – the idea being that people might view forwarded content with greater skepticism.

“Someone in Kenya, Nigeria or India using WhatsApp for the first time will not think about the meaning of the term ‘forwarded’ in the context of misinformation,” Madung said. “In fact, it could have the opposite effect – that something has been widely disseminated and therefore must be credible. For many communities, social proof is an important factor in determining the credibility of something.”

The idea of asking people to pause and reflect came from a feature Twitter introduced that prompted people to actually read an article before retweeting it if they hadn’t opened the link first. Twitter reported that the prompt led to a 40 percent increase in the number of people opening articles before retweeting them.

And the call for WhatsApp to temporarily disable its broadcast and Communities features stems from concerns about the ability to send messages, forwarded or not, to thousands of people at once. “They’re trying to make this the next big social media platform,” Madung said. “But without thinking through the safety features that requires.”

“WhatsApp is one of the few technology companies that intentionally limits sharing by implementing forwarding limits and flagging messages that have already been forwarded multiple times,” a WhatsApp spokesperson told Engadget. “We have developed new tools to enable users to find accurate information while protecting them from unwanted contact, which we will discuss in more detail.”

Mozilla’s demands stem from research and surveys the organization conducted around elections in Brazil, India and Liberia. The first two are among WhatsApp’s largest markets, while the majority of Liberia’s population lives in rural areas with low internet penetration, making traditional online fact-checking nearly impossible. In all three countries, Mozilla found that political parties made heavy use of WhatsApp’s broadcast feature to “micro-target” voters with propaganda and, in some cases, hate speech.

WhatsApp’s encryption also makes it impossible for researchers to monitor what circulates in the platform’s ecosystem – a limitation that hasn’t stopped some of them from trying. In 2022, two Rutgers professors, Kiran Garimella and Simon Chandrachud, visited the offices of political parties in India and convinced officials to add them to 500 WhatsApp groups the parties ran. The data they collected formed the basis of a report titled “What’s Going Around on Partisan WhatsApp in India?” Although the results were surprising – Garimella and Chandrachud found that misinformation and hate speech did not, in fact, make up the majority of the content in those groups – the authors made clear that their sample size was small and that they may have been deliberately excluded from groups where hate speech and political misinformation circulated freely.

“Encryption is a diversionary tactic to prevent accountability on the platform,” Madung said. “In the electoral context, the problems are not necessarily just the content. It’s about the fact that a small group of people can easily end up having a huge influence on other groups of people. These apps have removed the friction from society’s transmission of information.”

