Ithaca, N.Y. – Researchers affiliated with Cornell University have participated in a controversial study that intentionally manipulated the news feeds of almost 700,000 Facebook users, according to Slate.com.
The study, which was published in the Proceedings of the National Academy of Sciences, was coauthored by Dr. Jeffrey T. Hancock, professor of communication and information science at Cornell.
The study tested whether changing the number of positive or negative posts a person sees in their Facebook news feed affects how much positive or negative content that person then posts.
To conduct the study, the researchers tweaked Facebook’s news feed algorithm so that certain users would see more positive posts, while others would see more negative ones. The researchers did this without contacting the users or informing them that they were being subjected to an experiment, according to Slate.com.
Ah, informed consent. Here is the only mention of “informed consent” in the paper: The research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”
That is not how most social scientists define informed consent.
Here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you … for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”
So there is a vague mention of “research” in the fine print that one agrees to by signing up for Facebook. As bioethicist Arthur Caplan told me, however, it is worth asking whether this lawyerly disclosure is really sufficient to warn people that “their Facebook accounts may be fair game for every social scientist on the planet.”