Academics Question The Value Of Facebook's Controversial Research

"I don't know that the benefits we're gaining from this research are all that significant."

Facebook and its data science team came under fire this weekend after a recently published study revealed that the company had altered the content of more than 600,000 users' News Feeds for a psychological experiment focused on those users' emotional states.

But while much of the focus has been on the ethics of the study, some psychologists are also taking issue with the experiment's methodology and results.

The study found that by altering the News Feeds of 689,003 Facebook users, researchers could change the emotional tone of the status updates those users posted on the network. Feeds with more positive content led to more positive status updates; feeds with more negative content led to more negative posts.

Dr. John Grohol, founder of the psychology site Psych Central, said he sees two major flaws in the study, starting with its use of the sentiment analysis tool Linguistic Inquiry and Word Count (LIWC 2007). It's a software program that linguists and psychologists commonly use in their research, and it's a well-understood, widely used tool, but it was never designed to be used on small bits of text.

"Even in talking to the company that makes the tool, they acknowledge it cannot really differentiate 140 characters or a few short sentences from a larger body and it can't do a good job determining tone and content of the message," Grohol says. "Essentially what LIWC 2007 is doing is giving researchers data to analyze and that data is biased in a certain direction and we don't' know how it's biased because the authors don't look at LIWC 2007 as though it has any limitations. And we know that's just not the way the tool works."

Furthermore, Grohol said, the study, while focused on exploring emotional contagion, doesn't actually measure the moods it's trying to capture. "They never went to Facebook users and had them fill out a mood questionnaire. Instead the authors were making strange judgment calls based on the content of status updates to predict a user's mood," he said, noting that the authors would likely need some other tool or survey to accurately gauge something as complex as emotional state.

Tal Yarkoni, a psychology research associate at the University of Texas at Austin, raised a similar concern in a blog post largely defending the Facebook study:

The fact that users in the experimental conditions produced content with very slightly more positive or negative emotional content doesn't mean that those users actually felt any differently. It's entirely possible–and I would argue, even probable–that much of the effect was driven by changes in the expression of ideas or feelings that were already on users' minds.

Lastly, both Grohol and Yarkoni argue that, even if the methodology produced accurate results, the findings have very little actual value for users. According to Grohol, the study found only a 0.07% (1/15th of one percent) decrease in negative words in people's status updates when Facebook decreased the number of negative posts in those users' News Feeds. That works out to roughly one word in every 1,500, hardly enough to move the needle.

"I don't know that the benefits we're gaining from this research are all that significant," Grohol told BuzzFeed. "The correlations were so tiny that they're meaningless on an individual level. It's not really the kind of research that's significant in the way they'd have us to believe. For example, there's no real proof that if a bunch of your friends post something negative, you'll then post something negative. At least not at this small scale. Perhaps, if you read thousands and thousands of posts, you might."
