What Does Civil Unrest in Myanmar, Ethiopia, India and the US Have in Common? Facebook.
As I write this, political polarisation has, for the most part, become something of a buzzword. Finding common ground between two schools of thought, particularly online, is an increasingly uphill task, and this has bred an overwhelming sense of civic mistrust across all social strata. While India’s inherent diversity has always left the path to polarisation wide open, things have grown remarkably worse over the last decade. Cue 2021: economic stagnation, mounting job losses and rampant healthcare concerns form a cauldron of resentment, and the perfect fuel to stoke the flames is Facebook.
After Facebook’s internal studies were leaked to The Wall Street Journal, the platform’s complicity in harming teenage girls’ mental health came to light. That, it turns out, was just the tip of the iceberg. Frances Haugen, now referred to as the Facebook Whistleblower, appeared before the US Senate on Tuesday to discuss what she learned as an employee at the company. Her findings were not circumstantial; they were based on Facebook’s own internal studies.
Facebook Fuels Global Political Polarisation
One of the most explosive revelations to come out of the Senate hearing was Facebook’s role in fueling riots and hate across the globe. By putting “profits before people”, the tech giant elected to turn a blind eye to hate speech, as well as to the way the platform is used to recruit willing participants for civil movements with varying levels of violence.
Facebook’s own data indicated that content which fuels hate and anger receives the highest engagement numbers. In other words, you are more likely to stay on Facebook while reading and “researching” hate-enticing content than while scrolling through other kinds of content. Longer hours spent on Facebook translate into a higher profit margin for the trillion-dollar company, and vice versa.
According to Haugen, it is this type of content, which often slips through the cracks in Facebook’s moderation, that has led to some of the most notable acts of widespread violence across the globe.
Rohingya Crisis in Myanmar
Facebook’s role in the Rohingya crisis in Myanmar has been documented by experts, leaving little room for doubt that the content pushed by the platform’s algorithm contributed to the violence. Alan Davis, an analyst at the Institute for War and Peace Reporting, observed that hate speech on Facebook in Myanmar was becoming more organised and militarised in the months before the violence erupted.
Fake news spread rapidly during this period, with fabricated stories designed to inspire action doing the rounds. It was not uncommon to come across stories claiming that “mosques in Yangon are stockpiling weapons in an attempt to blow up various Buddhist pagodas and Shwedagon pagoda”. In a country where the majority of internet users rely on Facebook for news, this proved catastrophic. Militant pages systematically spread what is now recognised as misinformation, and some 650,000 refugees were ultimately forced to flee to Bangladesh.
Gaining access to hateful content on Facebook is not hard, as demonstrated by a team of researchers who were trying to understand exactly what happened in Myanmar. The team started off by liking a Myanmar military fan page that did not violate any of Facebook’s rules. Based on this, the algorithm created a content consumption funnel for the researchers, exposing them to content that glorified violence.
“We didn’t have to look hard to find this content; FB’s algorithm led us to it,” said Rosie Sharpe, a digital researcher who worked on the report. “Of the first five pages they recommended, three of them contained content that broke FB’s rules by, for example, inciting or glorifying violence.”
Armed Conflict in Ethiopia
Myanmar is not the only country bearing the brunt of Facebook’s pursuit of profit. Recent reports on the armed conflict between Ethiopia’s federal government and the Tigray People’s Liberation Front (TPLF) indicate that much of the conflict was fueled by content shared on Facebook.
In 2019, after fake news was shared on Facebook, violence erupted in the Oromia region, leaving 89 dead. While Facebook admitted that it could have done more to prevent the violence in Myanmar, nothing is being done to control the misinformation being spread online in Ethiopia.
When asked whether Facebook is aware of terrorist groups using the platform to spread messages all over the world, Haugen said that the company is highly aware of such pages and does nothing to stop them.
Delhi Riots in 2020
During the Senate hearing on Tuesday, Frances Haugen also spoke about Facebook’s role in inciting violence in India. She said that Facebook is aware of the hate-mongering content linked to RSS pages, shared widely in India to promote mistrust and acts of violence. However, as is already evident, the platform chooses to turn a blind eye because this content keeps people online longer. She also revealed that Facebook has classified India as a tier-0 country, and that the company pays more attention to it during election cycles, when its profits skyrocket.
Additionally, to understand the role of politically charged content in shaping social behaviours in India, Facebook conducted an internal study titled ‘Effects of Politician Shared Misinformation’ and learned that “out-of-context” videos shared by politicians in India led to widespread anti-Muslim and anti-Pakistan sentiment across the country. Let’s not forget that Facebook has over 410 million users in India, many of whom, like users across the world, rely on the platform for access to news. When that news is skewed, public sentiment becomes easier to mould, and this eventually shapes the actions citizens take.
Sound familiar?
The 2020 Delhi riots in the north-eastern part of the city left 53 dead at the end of three days, with a majority of the victims being Muslim. While examining the aftermath of the riots earlier this year, the Supreme Court of India said that Facebook’s role in inciting the violence cannot be ignored.
“Facebook today has influence over 1/3rd of the population of this planet! In India, Facebook claims to be the most popular social media with 270 million registered users. The width of such access cannot be without responsibility as these platforms have become power centres themselves, having the ability to influence vast sections of opinions,” the court said.
Similarly, pages deemed “anti-India” and “anti-Hindu” are also left fully operational on Facebook, widening the gap between Hindus and Muslims and fomenting further social unrest across the political spectrum.
What needs to change?
As demonstrated time and time again, deep political polarisation causes civil unrest that often bleeds into riots and acts of violence against specific minority groups. With social media platforms able to sway mass perception and shape human behaviour to such a degree, it is now imperative for governments across the world to work on legislation that limits the power of these platforms.
Stricter Laws
Facebook often oscillates between shrugging off responsibility for the content on its platform and banning certain pages that violate its safety norms (while leaving their content online, ready for consumption anyway, as in the case of the Sanatan Sanstha). With an eye fixed firmly on the bottom line, Facebook has proven to be an unreliable, and often reluctant, moderator. It therefore falls to governments to determine how these platforms work and the degree to which the algorithm shapes one’s social feed. Simply liking a politically adjacent page should not lead one down a rabbit hole of hate and propaganda via suggested content.
Restricting Children on Social Media
In an earlier piece, we discussed how Instagram impacts teen health, leading to widespread depression, anxiety and eating disorders. Ensuring the safety of children online must now fall to governments, rather than to the platforms themselves, through laws that restrict minors from accessing harmful content on social media, and from posting content until they reach a certain age.
Does this impact free speech? Yes and no. Children posting suggestive images that lead to objectification should certainly not fall under the “we have the right to free speech” argument, as such content has voyeuristic notes that put the child in danger. At the same time, it is also important to recognise that social sharing, when it happens within a safer context, can have positive effects on a child’s self-esteem and development. However, since Facebook has not elected to create that safe space so far, it seems unlikely to happen anytime soon.
Lower Reliance on the Algorithm
One of the biggest perils of the content algorithm is that it creates echo chambers. Ten different iterations of the news may be available to you, yet, based on your political leanings and engagement patterns, Facebook will still only show you content that falls within your worldview. The result is an echo chamber in which your own beliefs are constantly echoed back to you, with no room for exposure to other points of view.
When you are continually exposed to the same school of thought (with increasing tones of violence over time), it firmly places you on one end of the political spectrum, with your actions in the future determined largely by the algorithm itself. We are what we consume, after all.
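To see how such a feedback loop can form, consider a minimal sketch of an engagement-weighted feed ranker. This is purely illustrative and not Facebook’s actual system; the scoring rule, the “outrage” signal and the affinity update are all assumptions made for the example.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    topic: str
    outrage: float  # 0..1, how inflammatory the post is (assumed signal)

@dataclass
class User:
    # Learned per-topic affinity; grows every time the user engages.
    affinity: dict = field(default_factory=dict)

def score(user: User, post: Post) -> float:
    # Toy ranking rule: prior engagement with the topic, plus a bonus
    # for inflammatory content, which historically draws more clicks.
    return user.affinity.get(post.topic, 0.0) + 2.0 * post.outrage

def build_feed(user: User, posts: list, k: int = 3) -> list:
    return sorted(posts, key=lambda p: score(user, p), reverse=True)[:k]

def engage(user: User, post: Post) -> None:
    # Each click strengthens affinity, which feeds back into ranking.
    user.affinity[post.topic] = user.affinity.get(post.topic, 0.0) + 1.0

posts = [Post("politics-a", 0.9), Post("politics-b", 0.2),
         Post("sports", 0.1), Post("cooking", 0.0)]
user = User()
for day in range(5):
    feed = build_feed(user, posts)
    engage(user, feed[0])  # the user clicks whatever ranks first
    print(f"day {day}: {[p.topic for p in feed]}")
```

Within a few iterations the feed converges on whichever inflammatory topic the user first clicked, without any explicit editorial decision: the loop itself does the narrowing.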
Decreasing reliance on the algorithm and turning social media feeds back into a place where one can simply see updates from one’s peer group, rather than a space where news is shared, could prove a solid step towards curbing the spread of fake news and public incitement.
Unfortunately, tackling social media is not as easy as it seems, because we are still learning about its impacts. As Frances Haugen stated, no one outside of Facebook actually knows how Facebook works. By demanding that these companies release accurate data and studies, governments can begin to chart the way ahead to build nations that are more united and stronger.
Also Read: Instagram Ready To Do The Bare Minimum to Protect Teen Health
Kajoli Anand Puri
Kajoli is a tech-enthusiast with a soft spot for smart kitchen and home appliances. She loves exploring gadgets and gizmos that are designed to make life simpler, but also secretly fears a world run by AI. Oh wait, we’re already there.