Facebook's Bad Idea: Crowdsourced Ratings Work For Toasters, But Not News

Opinion: User reviews might work when you’re shopping for a phone charger on Amazon, but crowdsourcing “news” was what got us into this mess in the first place.

Mark Zuckerberg recently announced that Facebook will ask its users to rate the credibility of news publishers, as a way to reduce the amount of fake news circulating on its platform. That’s a bad idea, because while crowdsourced user reviews might work when you’re shopping for a phone charger on Amazon, crowdsourcing “news” is what got us into this mess in the first place.

We’ve done extensive work on the fake news problem and on the question of rating news sources on Facebook. Our research includes a paper titled “Behind the Stars: The Effects of News Source Ratings on Fake News in Social Media,” which addresses this exact issue.

There are two big problems with Facebook’s plan.

First, our research shows that users don’t trust other users. Populism and cynicism about experts may be in vogue, but in the real world, people trust expert ratings of news sources more than they trust ratings from other users. We studied 590 Facebook users in the United States, across a wide range of ages and education levels, and they believed expert ratings would be more credible than user ratings because experts are more likely to be objective and to check the facts.

Even when we assured participants that the system would prevent ratings from being manipulated by bad actors, they didn’t budge. Reviews by Facebook users, we found, are more likely to be driven by emotion and by the extent to which a source’s opinions match their own.

Consumer reviews of products like toasters work because we have direct experience using them. Consumer reviews of news sources don’t work because we can’t personally verify the facts from direct experience; instead, our opinions of news are driven by strong emotional attachments to underlying sociopolitical issues. Put simply, our research shows that we’ll trust anyone to be objective about their kitchen appliances, but when it comes to news, we want experts who can verify the facts.

Second, user ratings are easily manipulated. We all rely on online reviews, but research shows that 15 to 20 percent of them are fake. Fake reviews are more common on websites that don’t verify whether the reviewer has actually used the product or service. Zuckerberg said that Facebook would accept ratings only from users who say they are familiar with the news sources they are judging, but the honor system, however logical, won’t stop fake reviews.

Would you rate the credibility of RT.com for Facebook? Unless you’re a news junkie, you probably haven’t heard of RT, the Russian propaganda site that posted multiple pro-Trump and anti-Clinton “news” stories in 2016. So, you wouldn’t rate it. But avid fans of RT will rate it enthusiastically, which means it will likely get high ratings, as will many arcane alt-right or alt-left sites known only to users who voraciously consume their fringe content.

In contrast, the New York Times, NBC, and CBS are well-known media brands that strive to adhere to the best in journalistic ethics. It’s still not clear exactly how Facebook’s new system will work, but from what we can tell, people will be asked to rate news sources they are “familiar” with. Almost everyone is familiar with these brands, yet surveys show that roughly half the public distrusts them. Their Facebook ratings could end up lower than those for RT.com and fringe alt-right and alt-left sites, which are “familiar” to far fewer people, many of whom may find them trustworthy because they match their own opinions.

Throw in MSNBC, Fox News, Breitbart, and Occupy Democrats, and things get more interesting. Do you have an opinion about them, even if you’ve never studied their stories?

The impact of fake reviews, whether the product of malfeasance or of simple misinformation, is strongest when a source has only a small number of ratings; a few enthusiastic fakes can dominate the average. But dedicated actors with specific agendas can manipulate ratings even when reviews number in the thousands. Mix in ratings from users who have never read the news source but have an opinion anyway, and you have a recipe for disaster.

Our research suggests that there is a better path forward: expert ratings of news sources. We can compile a historical rating by asking a wide range of experts from across the political spectrum to rate past stories produced by a news source. This expert rating can be attached to all news stories produced by the source the instant the stories appear. These ratings will be less biased than user ratings.
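To make the idea concrete, here is a minimal sketch of how such a system might work. The domains, the 1-to-5 scale, and the function names are our own illustrative assumptions, not a description of any deployed system; the key design choice is that experts rate the source’s track record, so the rating can be attached to a new story the instant it appears.

```python
from statistics import mean

# Hypothetical expert scores (1-5 scale) for past stories from each source.
# In practice these would come from a politically balanced panel of experts.
expert_scores = {
    "example-news.com": [4, 5, 4, 4, 5],   # scores for past stories
    "fringe-site.example": [1, 2, 1, 1],
}

def source_rating(domain):
    """Average the expert scores for a source's past stories.

    Returns None for unrated sources, which a feed could flag as
    "unrated" rather than defaulting to a neutral score.
    """
    scores = expert_scores.get(domain)
    return round(mean(scores), 2) if scores else None

def label_story(domain, headline):
    """Attach the source-level rating to a new story as soon as it appears."""
    rating = source_rating(domain)
    badge = f"expert rating {rating}/5" if rating is not None else "unrated source"
    return f"{headline} [{badge}]"

print(label_story("example-news.com", "City council passes budget"))
# City council passes budget [expert rating 4.4/5]
```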

And users are more likely to trust expert ratings. When source ratings shape what users see, stories from long-trusted sources will attract attention, while stories from untrusted or unrated sources will be viewed with suspicion.

Facebook thought about this approach but rejected it for exactly the wrong reasons. “We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem,” Zuckerberg said in his announcement. That suggests a strange understanding of “the objectivity problem” within Facebook, or even of objectivity itself.

Our studies point to a different way to see the world objectively: Don’t give users a bunch of “he said, she said.” Tell them what the experts said.

The authors are affiliated with the Kelley School of Business, Indiana University. Alan Dennis is a Professor and holds the John T. Chambers Chair of Internet Systems, Antino Kim is an Assistant Professor, and Patricia Moravec is a Ph.D. student.
