Facebook’s Emotion Study Causes Outrage and Raises Ethics Questions

Manipulating and analyzing consumer data is the backbone of many web marketing companies, including Google, Yahoo, and Facebook. But the latter may have crossed what some see as an ethical line with a recent emotion study.

This month, Facebook revealed that it had manipulated the news feeds of over half a million randomly selected users during January 2012. These users saw either more positive or more negative posts than usual, and the company then tracked the spread of that information. The manipulated news feeds were part of a psychological study examining the impact of emotions on social media sharing.

But you agreed to it in the Terms of Service. Facebook was quick to note that users had agreed to this type of manipulation through its standard terms of service, but many are crying foul. Academic protocols generally dictate that psychological test subjects consent to testing before it is conducted. Facebook did not ask for explicit permission for the test, arguing instead that the blanket consent in its terms of service covered the intrusion. Some Facebook users and critics saw it as crossing an ethical boundary.

“It’s completely unacceptable for the terms of service to force everybody on Facebook to participate in experiments,” said Kate Crawford, visiting professor at MIT’s Center for Civic Media and principal researcher at Microsoft Research. In an article in the Wall Street Journal, Crawford said the experiment points to a broader problem in data science: the absence of ethics in the education of data scientists. In other words, they are rarely taught that ethics plays an important role in their field, nor do they always know how to integrate an ethical approach into their studies.

Research was published. The research was published in a June issue of the Proceedings of the National Academy of Sciences. Facebook’s Data Science team set out to test the oft-noted effect of feeling sadder after seeing friends’ happy news on the social network. The researchers found that people’s moods were contagious, but only barely so. People who were served more positive posts typically responded by posting positive messages of their own, and those shown more negative posts produced more negativity.

Facebook has a great deal of influence over what people see, even when no psychological test is underway. When a user logs in to the service, there are about 1,500 different items that could be displayed, based on the user’s friends and the pages they follow. Users see only about 300 of them in the feed at a time.

A secret formula. A secret algorithm determines exactly which 300 items appear. It takes into account how often a user comments on a given type of post, how much other people in the user’s circle of friends are talking about a post, and whom the user is most likely to interact with. Facebook also gathers direct feedback from users: on the desktop version of the platform, users can click an arrow in the top-right corner of a post to rate their satisfaction with that type of content. The changes to the news feed are an attempt by Facebook to give users more of what they want so they ultimately spend more time on the service.
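Facebook has never disclosed how that algorithm works. Purely as an illustration, a toy ranking model built from the signals described above might look like the following sketch; every weight, field name, and function here is hypothetical, not Facebook’s actual method.

    # Toy news-feed ranking sketch. All weights and signal names are
    # hypothetical stand-ins for the factors described above; the real
    # algorithm is secret and far more complex.
    from dataclasses import dataclass

    @dataclass
    class Candidate:
        post_id: str
        post_type: str            # e.g. "photo", "link", "status"
        friend_engagement: float  # how much the user's friends discuss the post
        author_affinity: float    # how often the user interacts with its author
        feedback_score: float     # direct satisfaction ratings for this post type

    # Hypothetical per-type weights standing in for "how often a user
    # comments on a type of post".
    TYPE_AFFINITY = {"photo": 1.2, "status": 1.0, "link": 0.8}
    W_ENGAGEMENT, W_AFFINITY, W_FEEDBACK = 0.5, 0.3, 0.2

    def score(c: Candidate) -> float:
        """Blend the engagement signals into one ranking score."""
        return TYPE_AFFINITY.get(c.post_type, 1.0) * (
            W_ENGAGEMENT * c.friend_engagement
            + W_AFFINITY * c.author_affinity
            + W_FEEDBACK * c.feedback_score
        )

    def build_feed(candidates: list[Candidate], limit: int = 300) -> list[Candidate]:
        """Keep the top `limit` of the ~1,500 candidate items per login."""
        return sorted(candidates, key=score, reverse=True)[:limit]

The point of the sketch is the shape of the computation – many candidate items scored on a handful of engagement signals and cut down to a few hundred – rather than any specific weighting.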

Public apology downplays what was done. In response to the uproar, Facebook data scientist Adam Kramer, who led the study, publicly apologized on his personal Facebook page for the way the paper described the research. He said the phrasing in the study made the manipulation sound far more sinister than it was, and that, all in all, it had a very small impact on the lives of users. The measured effect on subjects was barely large enough to register statistically.

“In hindsight,” he noted, “the research benefits of the paper may not have justified all of this anxiety.” Users who were exposed to either the positive or the negative feeds produced an average of one fewer emotional word per thousand words over the following week. Still, Kramer maintains that the study debunked the notion that seeing friends’ positive posts makes people feel worse.
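For a rough sense of scale, consider a back-of-the-envelope calculation. The weekly posting volume below is a made-up assumption for illustration; only the one-word-per-thousand shift comes from the study.

    # Back-of-the-envelope scale check for the reported effect.
    words_per_week = 2_000          # assumed posting volume; hypothetical
    shift = 1 / 1_000               # one fewer emotional word per thousand words
    print(words_per_week * shift)   # 2.0 – about two fewer emotional words a week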

More questions, damage. However, the damage to Facebook’s brand, and the questions raised about its ethics, may be permanent. Comments from Facebook users in response to Kramer’s statement made it clear that, whatever the degree of emotional manipulation used, manipulation is manipulation. Some called for Facebook to make donations to mental health organizations in order to “make up” for the violation.
