Interesting Technology

Facebook manipulated feeds of users to show only positive or negative posts to study emotional responses

FACEBOOK deliberately manipulated the feeds of almost 700,000 users to see how negative or positive posts affected their moods.

In a move that’s raised some ethical questions, Facebook tweaked the algorithm that delivers news into users’ feeds using a program to analyse whether posts contained positive or negative words.

Some users were delivered only positive posts in their news feeds while others saw overwhelmingly negative posts.
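The sorting described above rests on classifying posts as positive or negative by the words they contain. The study itself reportedly used the LIWC word-count dictionary; the tiny word lists below are illustrative stand-ins, not the actual lists used.

```python
# Toy word-count sentiment check, loosely modelled on the word-list
# approach the article describes. The real study used the LIWC
# dictionary; these small sets are hypothetical examples only.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "hate", "terrible", "upset"}

def classify(post: str) -> str:
    """Label a post 'positive', 'negative', or 'neutral' by word counts."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify("I love this wonderful day!"))   # positive
print(classify("Feeling sad and upset today"))  # negative
```

A real system would use a much larger validated dictionary, but the principle, counting emotion-laden words per post, is the same.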

Researchers Adam Kramer, of Facebook; Jamie Guillory, of the University of California, San Francisco; and Jeffrey Hancock, of Cornell University, then set out to study “emotional contagion through social networks”, to see if positive feeds led to positive posts from users and vice versa.


The result: yes, it did.

“When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred,” the researchers wrote in their paper for the Proceedings of the National Academy of Sciences.

“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.”

The research was carried out during the week of January 11-18, 2012, with 689,003 users unwittingly participating in the experiment, in which positive or negative posts had between a 10 per cent and 90 per cent chance of being removed from their news feeds.

“It is important to note that this content was always available by viewing a friend’s content directly by going to that friend’s ‘wall’ or ‘timeline’, rather than via the News Feed,” the study authors wrote.

“Further, the omitted content may have appeared on prior or subsequent views of the News Feed. Finally, the experiment did not affect any direct messages sent from one user to another.”
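The per-view omission the authors describe can be sketched as a probabilistic filter: each targeted post is dropped from a given News Feed render with some probability, but is never deleted and may reappear on a later view. Everything below (the function names, the word list, the probabilities) is a hypothetical illustration, not Facebook's actual code.

```python
import random

# Hypothetical stand-in classifier; the study used the LIWC dictionary.
POSITIVE = {"happy", "great", "love", "wonderful"}

def classify(post):
    words = {w.strip(".,!?").lower() for w in post.split()}
    return "positive" if words & POSITIVE else "other"

def filter_feed(posts, omit_prob=0.5, condition="positive"):
    """Drop each post of the targeted emotional type with probability
    omit_prob (reported as between 0.10 and 0.90 per user); all other
    posts always pass through. Omission applies per render, so a
    dropped post can still appear on a subsequent view."""
    return [p for p in posts
            if classify(p) != condition or random.random() >= omit_prob]
```

Because the filter is re-applied on every render, running `filter_feed` twice on the same feed can return different results, matching the authors' note that omitted content could appear on prior or subsequent views.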

But the experiment is causing a stir among some ethicists.

“If you are exposing people to something that causes changes in psychological status, that’s experimentation,” James Grimmelmann, a professor of technology and the law at the University of Maryland, told Slate.

“This is the kind of thing that would require informed consent.”

But Facebook insists it had the consent of users as the research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

Facebook’s Data Use Policy states that the company “may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

Source: Herald Sun