“There are three sides to every story: your side, my side, and the truth. And no one is lying. Memories shared serve each differently.” This line from The Kid Stays in the Picture mirrors today’s political landscape, where social media platforms shape how we think and trends colour our take on culture itself. Take Elon Musk’s polarising takeover of Twitter: you’ll find those who champion his free-speech ideology and others urging the world to get off the platform for good.
In light of the ongoing discourse about Twitter’s new policies, Musk teased and then shared what he called the ‘Twitter Files’, exposing the way the social media platform was run and how the ideologies of those within the company influenced the content that was shared. Unlike Instagram and Facebook, Twitter is widely regarded as a news platform, with journalists, celebrities, and media personalities sharing their views on current events. Thus, the idea of heavy content moderation shaped by political stances is just as horrifying (though not entirely surprising, given what we learned about Facebook last year) as, say, leaks regarding Instagram’s inability to protect teens from objectionable content.
While many are dismissing the significance of the Twitter Files, the leaks point to the need to democratise the platform and offer evidence in support of Musk’s stance.
The Twitter Files are a series of leaks drawn from countless internal e-mails and Slack chats between Twitter employees, shedding light on the content moderation policies exercised at the social media platform. On obtaining these files, Musk commissioned three journalists – Matt Taibbi, Bari Weiss, and Michael Shellenberger – to go through the data and report on their findings. They were also encouraged to post about it on Twitter itself in an effort to get more eyeballs on the issue.
One of the most significant findings was Twitter’s decision to block a New York Post story about Hunter Biden’s dealings in Ukraine. The story alleged that Hunter and Joe Biden used their political influence to coerce businessmen in Ukraine, prevent an investigation into their own business, and liaise with influential people. On trying to post a link to this story, users would be shown a pop-up stating that the content they were posting was “potentially harmful”. Of course, the Hunter Biden laptop story had been doing the rounds for over two years – this was just a confirmation of what people had already been talking about.
While the events detailed in the article took place in 2016, they came to light around the 2020 election and could have influenced voting patterns in the United States. Many believe that the suppression of this information contributed to President Biden’s victory, though it’s hard to gauge, in retrospect, just how much impact it actually had.
"I continue to believe there was no ill intent or hidden agendas, and everyone acted according to the best information we had at the time," said Jack Dorsey when asked to comment on Twitter’s suppression of this news. "Of course mistakes were made."
Similarly, the Twitter Files revealed that certain people were placed on unofficial blacklists that limited both the reach of their content and their searchability. Users were added to ‘trends blacklists’, ‘search blacklists’, and ‘do not amplify’ lists based on their political leanings and the content they posted. This in itself shows the platform’s willingness to stray far from neutrality and to expose users to very specific lines of thinking – a recipe for echo chambers and groupthink.
The idea of content moderation is highly debated. On one hand, people believe that social media platforms do bear certain responsibilities and must exercise their power to protect their users. On the other hand, the very act of moderation introduces biases – political, racial, or sociological – giving platforms widespread influence with immeasurable consequences.
Content moderation itself is applied rather inconsistently. While every social media platform has its own set of community guidelines, the truth is that for every banned page, you’ll find countless others that violate the guidelines and are thriving. Take YouTube, for instance. The platform demonetises videos that contain sexually explicit (or even suggestive) content, and for good reason. Yet it also allows genuinely creepy channels in India that post what can only be described as soft porn to rack up millions of views and remain monetised. Similarly, Instagram does not allow nudity on its platform, but we’ve all seen images that leave nothing to the imagination (along with accounts that appear to be run by porn stars) appearing in our feeds and Stories. Twitter is also famously home to such accounts. So, what happened to content moderation here?
Ideally, if a platform allows children to create accounts and browse posts, it should be more stringent with its moderation policies regarding sexually explicit content. However, when it comes to content regarding the news, shouldn’t these platforms be neutral? While users will eventually end up in their own little echo chambers, shouldn’t it be up to them which echo chamber they want to be sucked into? And shouldn’t there be amplification of content from all sides of the political spectrum to help break these echo chambers and lead to more inclusive discourse?
Moderating news is a slippery slope, one that Twitter’s employees tried to navigate in pursuit of their own version of the ‘greater good’ – failing, in the process, both those who expect freedom of speech and those who genuinely want to read varied takes on current events. Musk’s takeover of Twitter has already brought a few major upheavals, and will hopefully lead the platform down the path to greater political neutrality.