In 2014, Facebook came out as a player in social-science research, and a controversial one at that. Its “emotional contagion” experiment, in which the company tweaked the feeds of 700,000 users and studied how the changes affected their moods, drew harsh criticism and put pressure on the company to apply more scrutiny to its research projects.
Now, after nearly two years of soul-searching, Facebook has revealed how it reviews and approves the experiments the company runs on users without them knowing about it.
In a new paper, called “Evolving the IRB: Building Robust Review for Industry Research,” company officials describe a process that loosely imitates the system used at universities, which convene institutional review boards, or IRBs, to evaluate research projects on their scientific and ethical merits.
At Facebook, which is constantly experimenting on its users, “expert” managers have to approve all research projects, according to the paper.
“Most of the research Facebook conducts relates to small product tests — for example, evaluating whether the size or placement of a comment box affects people’s engagement,” write the authors, Molly Jackman and Lauri Kanerva, officials in the company’s research division.
If a manager feels a proposal needs more scrutiny, that official can refer it to a five-person review panel, which considers the “potential ethical, policy, and legal implications.” All five members then have to agree in order for the research project to move forward.
The panel might seek outside experts to weigh in on certain research topics. (When Facebook researchers wanted to study how many people were announcing same-sex preferences on the site, for example, the company sought the counsel of officials at LGBT groups.) The panel keeps records of its deliberations for reference, though it does not make those records public.
Michelle N. Meyer, a bioethicist who defended the “emotional contagion” experiment and has advised Facebook on its research practices, says those who want to know how company officials arrived at their decisions in any particular case may find the paper frustratingly general.
“If I was peer-reviewing this paper,” says Ms. Meyer, an assistant professor and director of bioethics policy at Clarkson University and the Icahn School of Medicine at Mount Sinai, “and they were arguing that this process is a good process, I would reject it. I would say, Well, you have to tell me more before I can decide whether it’s a good process or not.”
But that is an unfair standard, she says. The deliberations of review boards at universities are similarly opaque, says Ms. Meyer, who compared them to juries.
“You shouldn’t hold private corporations like Facebook to a higher standard than academic IRBs, for sure,” she says. “And that standard which has been set by the academic IRB system, in terms of transparency, is not high.”