Facebook’s ‘Social Experiment’ Triggers Public Outcry

How are you feeling? Facebook wants to know. Wait, scratch that -- Facebook may already know. A just-published report about a 2012 study suddenly has the Internet buzzing with concern, once again, over Facebook policies.

This week's debate relates to the secretive social experiment Facebook conducted on a random selection of 689,003 of its one billion-plus users. According to an article published June 17, 2014, in the Proceedings of the National Academy of Sciences, researchers from Facebook and Cornell University were testing whether certain emotions could be manipulated and would then spread among people without face-to-face contact.

As part of the experiment, the number of positive and negative comments that Facebook users saw on their feeds of articles and photos was artificially altered without their knowledge in January 2012. In the end, the researchers found that users who were shown fewer positive words were found to write more negative posts, and those who were exposed to fewer negative terms ultimately shared more positive posts.

You Said We Could

The authors of the study were able to conduct the research because, they said, automated testing "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."

But many are saying that the problem with gaining user consent this way is in the cursory look users give the privacy policies of Web sites, if they look at all. Even users who take the time to read Facebook's user agreement might not understand what they're signing up for, according to Susan Etlinger, an industry analyst with the Silicon Valley-based Altimeter Group.

"Facebook has been making changes to its timeline algorithm since it began, and will continue to do so," says Etlinger. "What makes this different from other areas that are covered in Facebook's terms and...
