Fake News Will Go Viral Even If People Don't Mean To Share It, Says This Study

    We simply don't have time to distinguish between the good stuff and the bad, according to a new model of how we share things online. (Update: This study has been retracted.)

    Update, 1/9/19: This study has been retracted.

    Even if people want to read and share real news, the sheer flood of information coming from social media means they will struggle to do so, according to a new study.

    The paper, published in the journal Nature Human Behaviour, looked at how people use social media – how many friends they have, how many items they share, how much time they spend reading – and used those figures to build a simplified model of a social network.

    The authors assumed that, all else being equal, people would prefer to read real news (what they called "high-quality information") over fake news ("low-quality information"). But their model found that even if that's true – and even assuming that people can instantly tell what's higher-quality and what's lower-quality – there is very little link between the quality of a news item and how viral it goes.

    "We were trying to look at the factors that make it difficult for good-quality information, such as real news, to surface and attract attention – as opposed to misinformation and fake news," Filippo Menczer, a professor of informatics and computer science at Indiana University and one of the authors of the study, told BuzzFeed News. "We assumed people could tell the difference, and would prefer to share good quality. Is it the case that high quality will actually win?"

    The answer is that it won't, or at least not by much. The study found that "quality and popularity of information are weakly correlated" – that is, how true something is has only a slight effect on how viral it goes.

    That's because people only have so much attention. "On average people have 200 friends on Facebook," says Menczer. "If everyone only shares a few things, unless you're on social media all day, you'll only see a bit of what they share." So while you can tell which of the things you see are the most trustworthy, you simply don't see the vast majority of stuff.

    "Maybe you only see five things," says Menczer. "And you share one because it's interesting, but not the others. But if you'd looked at 20, maybe you'd have seen something that was more trustworthy, or something that told you that story was fake." This combination of limited attention span and huge volumes of information makes it far harder to tell fake from real.

    And quality is not the only thing that affects whether you'll share something – you're also more likely to share something that's popular (and therefore overrepresented in your feed) or eye-catching. So the effects of quality on how likely something is to be shared were pretty weak, says Menczer.

    And in real life it would be even worse. "Our model made a lot of simplifying assumptions," Menczer says. "[We didn't take into account] people's biases, people who have financial and political incentives to share fake things. We don't take into account the Facebook algorithm. It's a very complex problem; we're not saying we have the solution, but we're saying here's one part of the problem."
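    The dynamics Menczer describes – users who can tell quality apart but only see a small, popularity-weighted slice of what's out there – can be illustrated with a toy simulation. This is a minimal sketch, not the paper's actual model; the item count, feed size, and number of sharing events are all arbitrary illustrative parameters.

    ```python
    import random

    def pearson(xs, ys):
        """Sample Pearson correlation, computed without external libraries."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    def simulate(n_items=500, feed_size=5, steps=20000, seed=1):
        """Toy model: users always share the best item they see, but they
        only ever see a few items, weighted by current popularity."""
        rng = random.Random(seed)
        # Each item gets a fixed "quality" score in [0, 1].
        quality = [rng.random() for _ in range(n_items)]
        shares = [1] * n_items  # seed every item with one share so all start visible

        for _ in range(steps):
            # Limited attention: a user sees only a small feed, and what lands
            # in the feed is drawn in proportion to current popularity.
            feed = rng.choices(range(n_items), weights=shares, k=feed_size)
            # The user can tell quality apart and shares the best item seen.
            best = max(feed, key=lambda i: quality[i])
            shares[best] += 1

        return pearson(quality, shares)
    ```

    Because popularity feeds back into visibility and each user sees only a handful of items, luck plays a large role in which items snowball: even with every simulated user always preferring quality, the quality–popularity correlation this sketch returns sits well short of a perfect 1.
    
    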

    "I think this is a very good report," Charlie Beckett, a professor of media and communications at LSE who was not involved in the study, told BuzzFeed News. "The way that social networks work, so-called quality information doesn't have some sort of innate superiority at getting people's attention."

    Beckett says it's easy to think of this as a disaster for journalism, but he sees it as the opposite: "Fake news is a problem, but the bigger problem is the overabundance of information in people's lives. This is a good thing for journalism, because journalists are meant to be good at connecting people to the information they want or need."

    At the moment, he says, there's a lot of talk about "news literacy", about teaching people how to spot fake news, and personalising their news intake with apps and emails. "But you're asking the user to make decisions," he said. "And as this study shows, people are human. You can't expect them to treat information as a scholarly thing. It's not supposed to be homework. It's supposed to be part of their daily lives." We're at the beginning of a redefinition of journalism, he says, where the job of the journalist is becoming more about managing the flood of information and filleting the important things for their readers.

    Facebook is already making efforts to combat fake news, he says: "We worry about it, rightly, because it's a nontransparent algorithm. But it does seem Facebook is quite good at giving people a diverse set of sources, and they're coming up with a thing where, if you click on a story and they know it's dubious, they provide you with five other stories on the topic."

    But the media will be key to the fight, he says: "News organisations can help people understand the memes. Where they started, what the significance is to my audience, how we translate them. That's journalism, to me. In the past it was different; we were gatekeepers."