Facebook faces flak for experimenting with users’ emotions

HIGHLIGHTS

Reports suggest that the social network conducted a psychology experiment on almost 700,000 users without their knowledge or consent.

The world's most popular social network, Facebook, is facing strong criticism for treating its users like lab rats after it was revealed that the company had conducted a psychological experiment on hundreds of thousands of its users, who had no knowledge of it. Reports say that Facebook manipulated these users' news feeds to control which emotional expressions they were exposed to.

Facebook conducted the study in collaboration with Cornell University and the University of California, San Francisco. Adam D.I. Kramer, the Facebook researcher who led the study, has posted a public apology on his Facebook page. He says, "Having written and designed this experiment myself, I can tell you that our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."

Even though Kramer says that the goal of the research was to learn how to provide a better service, many analysts find it hard to understand how such a study could improve social networking services. "Facebook didn't do anything illegal, but they didn't do right by their customers," said Brian Blau, a technology analyst with Gartner, a research firm. "Doing psychological testing on people crosses the line." 

Technology systems and policy analyst Lauren Weinstein tweeted, "Facebook secretly experiments on users to try make them sad. What could go wrong?" Many people now believe that Facebook manipulates material from users' personal lives to improve its business model, though similar practices have long been suspected at other top internet companies such as Google and Yahoo. In another tweet, Weinstein says, "I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible."

Facebook, on the other hand, justifies the experiment by saying that users consent to this kind of manipulation when they agree to its terms of service. Even though Facebook doesn't appear to have broken any laws, many believe that such experiments are unethical, especially when conducted without users' informed consent. Academic research protocols generally stress obtaining people's consent before subjecting them to any psychological study.

Even though the experiment has created an uproar against Facebook, it should be noted that it was conducted two years ago, according to the BBC; the news came out only after the researchers published their paper recently. The paper states that users who had fewer negative stories in their news feed were less likely to write a negative post, and vice versa. Facebook continually tunes its news feed algorithm to keep users engaged, and the more engaged users are, the more exposed they become to the ads that drive the company's revenue.

Jacob Silverman, author of the book Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, told The Wire magazine, "What's disturbing about how Facebook went about this, though, is that they essentially manipulated the sentiments of hundreds of thousands of users without asking permission. Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there's little reason to think that they won't do just that. As long as the platform remains such an important gatekeeper – and their algorithms utterly opaque – we should be wary about the amount of power and trust we delegate to it."

Facebook ran the experiment for one week in January 2012. The data scientists involved gathered data from almost 700,000 Facebook users after they logged in. Some of these users were shown feeds with more happy and positive words, while the rest were shown sadder-than-average content. The posts these users wrote after seeing the manipulated feeds were logged and analyzed for a study on "emotional contagion", which was recently published as a paper.
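For readers curious about the mechanics, the reported design boils down to two steps: probabilistically omit posts of one emotional valence from a user's feed, then measure the valence of what that user writes afterwards. Below is a minimal, hypothetical sketch of that logic in Python; the word lists and function names are illustrative assumptions, not Facebook's actual code (the published paper counted emotional words with the LIWC tool, not a toy scorer like this one).

```python
import random

# Hypothetical word lists standing in for the LIWC dictionaries the paper used.
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate", "lonely"}


def sentiment(post: str) -> int:
    """Crude word-count sentiment: +1 per positive word, -1 per negative word."""
    words = post.lower().split()
    return sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)


def filter_feed(feed: list[str], condition: str, omit_prob: float = 0.5) -> list[str]:
    """Probabilistically drop posts of one valence, mimicking the reported design.

    condition: "reduce_negative" yields a happier-than-average feed;
               "reduce_positive" yields a sadder-than-average feed.
    """
    kept = []
    for post in feed:
        score = sentiment(post)
        if condition == "reduce_negative" and score < 0 and random.random() < omit_prob:
            continue  # omit a negative post from this user's feed
        if condition == "reduce_positive" and score > 0 and random.random() < omit_prob:
            continue  # omit a positive post from this user's feed
        kept.append(post)
    return kept


def mean_post_sentiment(user_posts: list[str]) -> float:
    """Outcome measure: average sentiment of what the user writes afterwards."""
    return sum(sentiment(p) for p in user_posts) / max(len(user_posts), 1)


if __name__ == "__main__":
    feed = ["I love this wonderful day", "I hate feeling lonely", "lunch was fine"]
    # With omit_prob=1.0 every negative post is dropped from the feed.
    print(filter_feed(feed, "reduce_negative", omit_prob=1.0))
    # -> ['I love this wonderful day', 'lunch was fine']
```

Comparing mean_post_sentiment between the two conditions is, in spirit, the "emotional contagion" test the paper describes: if users shown happier feeds go on to write more positive posts, the emotion has spread.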

Mir Ubaid
Digit.in