To Stop Spread Of Fake News, Facebook Drastically Limits WhatsApp

Written by Janéy Tate

The popular Facebook-owned messaging app, WhatsApp, has imposed restrictions after the spread of fake news was blamed for causing mob killings of more than 20 people in India over the past two months.

WhatsApp, which is widely used outside the U.S., will limit the number of chats to which a user can forward a message at once.

When Facebook announced in February 2014 that it would pay $19 billion to acquire WhatsApp, most Americans had never used or heard of the popular messaging app.

“Today, we’re launching a test to limit forwarding that will apply to everyone using WhatsApp,” the company said in a July 19 WhatsApp blog post. “In India—where people forward more messages, photos, and videos than any other country in the world—we’ll also test a lower limit of 5 chats at once.”

WhatsApp users outside India will not be able to forward a message to more than 20 chats at one time. This is the first time WhatsApp has restricted forwarding on its platform, in India or globally, according to The Wall Street Journal.

To limit how users in India can send information, WhatsApp has removed the “quick forward” button in the app.

Before the changes took effect, users could easily send a message to 250 recipients at one time.

Facebook wants to prevent the viral spread of fake news over its WhatsApp platform and believes the new changes will aid it in that effort.

“We’re horrified by the violence in India, and we’ve announced a number of different product changes to help address these issues,” a company spokesperson told the news site Recode. “It’s a challenge which requires an action by civil society, government and tech companies.”

However, the Indian government doesn’t think the restrictions are enough. There’s talk of suing the messaging app, according to a statement posted on TheNews.Com.

The spread of rumors and false messages about child abductors on WhatsApp led to mass beatings and killings this year in India, sparking calls for action from authorities, Reuters reported.

“Rampant circulation of irresponsible messages in large volumes on their platform have not been addressed adequately by WhatsApp,” authorities said. “When rumors and fake news get propagated by mischief-mongers, the medium used for such propagation cannot evade responsibility and accountability.

“If (WhatsApp) remain mute spectators, they are liable to be treated as abettors and thereafter face consequent legal action.”

Although the restrictions aren’t foolproof, Facebook hopes they will help end the spread of misinformation. Many WhatsApp users use the app to communicate daily with their international family and friends.

Controlling the spread of fake news won’t stop at WhatsApp, though. Facebook says it also plans to restrict content on the world’s most popular social media site that it believes contributes to acts of violence.

“There are certain forms of misinformation that contribute to physical harm, and we are making a policy change which will enable us to take that type of content down,” a Facebook spokesperson told Digital News Daily.

This is not the first time Facebook has come under fire for the way content is used and dispersed. Facebook CEO Mark Zuckerberg testified earlier this year before Congress about political, digital, and privacy abuses enabled by Facebook. Much of the controversy centers on the social media giant’s dealings with Cambridge Analytica, a data and political consulting firm that worked extensively on the president’s 2016 campaign. Cambridge Analytica improperly accessed the private information of some 87 million Facebook users.


On July 20, Facebook tweeted that limiting forwarded messages will allow the app to be what it was intended to be—a private messaging app.

The new WhatsApp limits prompted people on social media to voice both their support for and concerns about Facebook and WhatsApp.