Dear Author

The Facebook Emotions Study Controversy

The emotion-manipulation study conducted by Facebook has drawn a lot of attention in the last week, much of it negative. Many DA readers are on Facebook (like so much of the online world), and many DA readers have children with Facebook accounts. Everyone is free to make up her own mind about how much it matters, but I’m a big believer in people making informed decisions, so I thought it might be helpful to have a more comprehensive post and discussion.

I’ll start by summarizing the main points. On June 2, 2014, two academics and a Facebook data scientist published a paper in the Proceedings of the National Academy of Sciences, the flagship journal of the most prestigious scholarly society in the United States. In this article, they reported the results of a study that attempted to manipulate the emotional content of Facebook News Feeds. More than 600,000 users were included in the study, some in treatment groups (their feeds were manipulated either positively or negatively, according to the study protocol) and some in control groups (their feeds were not).

The two requirements for inclusion were (1) posting in English and (2) having posted in the previous week. The study took place over seven days in January 2012.

US government-funded research that involves human subjects is required to conform to the Common Rule, which protects subjects from mistreatment and stipulates that they give informed consent before they participate. Informed consent is a technical requirement that places certain burdens of explanation and disclosure on the researcher. Not all studies can provide complete information to subjects before the experiment takes place, because that knowledge may alter behavior in ways that invalidate the study. In these cases, sometimes termed deceptive research, investigators are permitted to mislead subjects about the true aims of the study both while preparing them for participation and while collecting the data, but the subject must be debriefed afterward, ideally immediately after participation but in all cases before the data are analyzed. After this debriefing, the subject has the right to have her data removed from the study.

Studies that do not receive federal funding are not required to conform to the Common Rule, but many private and public institutions (and some companies) voluntarily choose to meet these requirements.  Versions of the Common Rule are in force in many other countries but by no means all.

The data collection and analysis for this emotion study were conducted by the Facebook data scientist. The academic authors, according to the notes provided by PNAS, limited their participation to designing the research and writing up the paper (together with the Facebook data scientist). This means that the academics were effectively siloed from the human-subjects part of the research, i.e., they were separated from the sources of the data and the subsequent collection and analysis. This is important because had the academics interacted directly with the subjects or their data, they would have had to submit their proposed research and their methods of obtaining informed consent to their institutions’ IRBs (Institutional Review Boards). Not all academic research involving human subjects requires informed consent (there are various conditions under which consent rules can be waived), but that is the IRB’s decision to make, not the researcher’s.

Facebook, whose data scientists gathered and analyzed the data, asserts that it received informed consent from its human subjects. Its justification is that all Facebook users agree to Facebook’s terms of service when they set up their accounts. Those terms currently include assent to research for internal operations purposes, but they did not include such provisions in January 2012, when the experiment was conducted. Nor is there any equivalent to an IRB within the company, even though it conducts considerable social science research, with and without academics’ collaboration.

The bottom line: three Ph.D. social scientists conducted experimental research on Facebook users. Facebook did not obtain informed consent from the users before the research was conducted. The academic researchers designed the study (together with the Facebook data scientist) but then absented themselves until it was time to write up the paper. They submitted the project to their own IRB at Cornell University, and the IRB deemed informed consent unnecessary because the academics were not involved in data collection or analysis. Cornell University released a statement asserting:

Because the research was conducted independently by Facebook and Professor Hancock had access only to results – and not to any data at any time – Cornell University’s Institutional Review Board concluded that he was not directly engaged in human research and that no review by the Cornell Human Research Protection Program was required.

This is not the first time Facebook and academics have joined forces to manipulate user behavior for non-commercial (or not solely commercial) purposes.  In 2010 they conducted a political experiment to increase voter turnout and claimed that the manipulation resulted in up to 340,000 more votes cast. That research was approved by the academic co-author’s IRB on the grounds that “minimal risk” was involved and therefore informed consent could be waived. Minimal risk is another technical category.

If this study is another “minimal risk” study, why are so many people, including many psychologists and social scientists, so outraged at what the three researchers have published? I can’t answer for them, but here are the reasons I find the research so problematic.

  1. We can’t know whether the risk was minimal without knowing more about the population. We know that the median substantive effects were small, but we don’t know what the distribution of effects looked like, or what the outliers look like. Suppose 1% of the distribution of emotional effects lies at the upper end, with more than minuscule effects: that’s more than 6,000 people (see the sketch after this list). In addition, the total population of Facebook is not necessarily a “normally distributed” population, emotionally speaking. Its users certainly comprise a skewed sample of the world’s population (and of its English-speaking population), and we don’t know whether or how it is skewed on this particular trait.
  2. There were no pretests or pilot studies reported. If there was an effort made to see what the overall effect on more than 600,000 people might be before carrying out the main experiment, it wasn’t reported. Failing to pretest is analogous to ignoring the hair-coloring manufacturer’s advice to test on a patch of skin before dumping it all on. Most people don’t pretest, and most people don’t have a reaction. But for the ones who do, the usefulness of the test cannot be overstated. Where the analogy breaks down, of course, is that when I color my hair, I’m responsible for following or not following the advice; the study subjects had their choice made for them.
  3. The results are not about emotion directly; they are about changes in behavior, from which the investigators infer changes in the emotions of interest. There are any number of unrelated-to-the-study reasons (omitted variable bias, for you stat types) why users might have posted more or less. To assume that changes in posting behavior reflect changes in users’ emotional states is a leap that cannot be credibly sustained without more data. Therefore, we don’t actually know how much or how little the experimental manipulation affected emotions.
  4. Facebook and other online companies run experiments all the time, but most of these experiments are not about manipulating unmediated, general states of mind. A study like the 2010 get-out-the-vote study is about politics. Other studies are about advertising. In these cases, users enter the study with prior expectations about being manipulated by politicians and companies, and they subconsciously adjust for that manipulation when they engage with the material. But we don’t get on Facebook thinking “hey, sometimes Facebook just fucks with my feelings for science,” so we don’t protect ourselves against that. We may well be more vulnerable in a general manipulation than in a specific one.
  5. Academics do research on emotions all the time. It’s an important topic for us to understand. Frequently, emotions research is deceptive research, so we can’t tell subjects exactly what we’re doing before they participate in the experiment. But we make sure they know they’re being experimented upon, and we debrief them honestly afterward. There are protocols that cover just about every kind of research social scientists conduct. Those protocols can be used for 6 people or 600,000 people. Use them.
  6. If Facebook is going to fuck with our feelings for science, especially if they’re doing it using the imprimatur of academic legitimacy, they need to follow academic rules, especially if they’re going to reap the benefits of that legitimacy (and offering academic research and publishing opportunities has been part of their pitch to social scientists). Academics work with IRBs. IRBs require informed consent. Get informed consent or don’t do the research. Even now, Facebook’s TOS specifies that their research is for “internal operations.” There is nothing internal about an open-access article in a prestigious journal.
  7. This study’s sample selection failed to exclude minors. No one should be able to experiment on minors in any way without the explicit, informed consent of their parents, guardians, or other legal agents.
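
To make the arithmetic in point 1 concrete, here is a minimal simulation sketch with entirely made-up numbers (the paper did not publish the full distribution of effects): even when the median effect is tiny, a 1% tail of larger effects in a sample of 600,000 still amounts to thousands of people.

```python
# A toy simulation with invented numbers (not the study's data): a tiny median
# effect can coexist with a tail of thousands of more strongly affected users.
import random

random.seed(1)
N = 600_000  # roughly the size of the study's sample

# Suppose 99% of users see a negligible shift in emotional expression and
# 1% see a noticeably larger one. Both effect sizes are purely illustrative.
effects = [
    random.gauss(0.001, 0.005) if random.random() < 0.99
    else random.gauss(0.05, 0.02)
    for _ in range(N)
]

effects.sort()
median_effect = effects[N // 2]
cutoff = 0.03  # an arbitrary threshold for "more than minuscule"
strongly_affected = sum(1 for e in effects if e > cutoff)

print(f"median effect:          {median_effect:.4f}")   # small, like the reported medians
print(f"users above the cutoff: {strongly_affected}")   # still several thousand people
```

The specific numbers are invented; the point is only that a small median effect tells us nothing about how many people sit in the tail.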

I’ll close by reiterating what I said on Twitter and at my personal blog when I first heard about this study, before we had the whole story; it still sums up my feelings as an academic and as someone who spends a lot of time and emotion online:

The history of human subject research is full of horrible examples of abuse and exploitation. As an empirical social scientist, I know how lucky I am that people are willing to spend their time, effort, and emotions, and even risk their reputations (however hard we try to anonymize their identities) to increase human knowledge. Their generosity is a gift that should be honored and respected.

Facebook just told us how badly they fucked with that gift.

I’m happy to answer specific questions about the study in the comments, so feel free to ask. And I apologize for my language. I guess I’m still angry.