How People Inside Facebook Are Reacting To The Company’s Election Crisis

Many employees feel like they’re part of an unjust narrative that’s spiraled out of control.

In the summer of 2015, a Facebook engineer was combing through the company's internal data when he noticed something unusual. He was trying to determine which websites received the most referral traffic from Facebook's billion-plus users. The top 25 included the usual suspects — YouTube and the Huffington Post, along with a few obscure hyperpartisan sites he didn't recognize. With names like Conservative Tribune and Western Journalism, these publications seemed to be little more than aggregation content mills blaring divisive political headlines, yet they consistently ranked among the most widely read websites on Facebook.

"Conservative Tribune, Western Journalism, and Breitbart were regularly in the top 10 of news and media websites," the engineer told BuzzFeed News. "They often ranked higher than established brands like the New York Times and got far more traffic from Facebook than CNN. It was wild."

Troubled by the trend, the engineer posted a list of these sites and associated URLs to one of Facebook's internal employee forums. The discussion was brief — and uneventful. "There was this general sense of, 'Yeah, this is pretty crazy, but what do you want us to do about it?'" the engineer explained.

To explain how Facebook is responding to its role in the election and the ensuing morass, numerous sources inside and close to the company pointed to its unemotional, engineering-driven culture, which they argue is largely guided by a quantitative approach to problems. It's a culture that views nearly all content as agnostic, and nearly everything else as a math problem to be solved. As that viewpoint has run headfirst into the wall of political reality, complete with congressional inquiries and multiple public mea culpas from its boy king CEO, a crisis of perception now brews.

Inside Facebook, many in the company’s rank and file are frustrated. They view the events of the last month and those that preceded it as part of an unjust narrative that’s spiraled out of control, unchecked. Five sources familiar with the thinking inside the company told BuzzFeed News that many employees feel Facebook is being used as a scapegoat for the myriad complex factors that led to 2016's unexpected election result. What the public sees as Facebook’s failure to recognize the extent to which it could be manipulated for untoward ends, employees view as a flawed hindsight justification for circumstances that mostly fell well beyond their control. And as the drumbeat of damning reports continues, the frustration and fundamental disconnect between Facebook's stewards and those wary of its growing influence grow larger still.

Today, the engineer's anecdote reads as a missed opportunity — a warning of an impending storm of misinformation blithely dismissed. But inside Facebook in July 2015, it seemed a rational response. At the time, the platform was facing criticism for what many believed to be overly censorious content policies, most notably a ban on breastfeeding photos, which had only recently been reversed. A move to reduce the reach of nontraditional publications seemed certain to trigger a PR disaster at a time when Facebook was consumed by a troubling downturn in its core business metric — person-to-person sharing — and battling Snapchat for new users.

“Things are organized quantitatively at Facebook,” the engineer said, noting that the company was far more concerned with how many links were shared than with what was being shared. “There wasn't a team dedicated to what news outlets [were using the platform] and what news was propagating (though there was a sales-oriented media partnerships team). And why would they have had one? It simply wasn't one of their business objectives.”

Yet the fallout from that failure to fully recognize a looming problem has engulfed the company in the aftermath of the 2016 US presidential election. In the past month alone, Facebook has disclosed to Congress 3,000 ads linked to Kremlin election manipulation, its CEO has publicly apologized for dismissing Facebook's role in swinging the election as “a crazy idea,” and it has been attacked by President Trump on Twitter. It has also been criticized for surfacing fake news on its Las Vegas massacre “safety check” page, published full-page apology ads in major newspapers, and been forced to update lengthy blog posts about its handling of the Russian ads when its explanations proved too murky. And then there are the congressional probes — two of them — and a pending bipartisan bill meant to force it to disclose political ads. With the specter of government regulation hanging above it, Facebook seems to have few, if any, friends right now in the public sphere.

The public-facing crisis is playing out internally as well, as employees wrestle with the election meddling that occurred on the platform. Sources familiar with recent internal discussions at the company told BuzzFeed News that plenty of employees are conflicted over the issue and are demanding more clarity about the platform's exact role in the election. “Internally, there's a great deal of confusion about what's been done and people are trying to come to terms with what exactly happened,” one of these people told BuzzFeed News.

Three sources close to the company described similar conversations, noting that Facebook staffers feel some sense of responsibility for the platform’s misuse in the election. “One of the things people inside are bemoaning is the fact that the response internally was very, very slow,” one former employee told BuzzFeed News. “That’s because Facebook didn't have the expertise needed to spot it until it happened.”

The employee, who left the company recently, said that Facebook was so focused on US-centric policies and engaging with 2016 election campaigns that it didn’t bother to fully consider foreign interference. “There’s a feeling that this kind of social engineering was happening all over the world before our election — in places like Estonia, Poland, and Ukraine. If there was a less US-focused approach it may have been spotted and acted on in real time,” this person said.

Responding on behalf of the company, a Facebook spokesperson said: "We take these issues very seriously. Facebook is an important part of many people's lives and we recognize the responsibility that comes with that. It's also our responsibility to do all we can to prevent foreign interference on our platform when it comes to elections. We are taking strong action to continue bolstering security on Facebook – investing heavily in new technology and hiring thousands more people to remove fake accounts, better enforce our standards on hate and violence, and increase oversight of our ad system to set a new transparency standard for the internet. This is a new kind of threat, even though not a new challenge. Because there will always be bad actors trying to undermine our society and our values. But we will continue to work to make it a lot harder to harm us, and ensure people can express themselves freely and openly online."

But the prevailing viewpoint within Facebook, according to numerous sources, is that the company has been wrongly excoriated for the misinformation and election meddling enabled by its platform. “There are lots inside thinking, 'We're the victims,'” a source familiar with the current climate at the company told BuzzFeed News. “[They feel] that this Russia stuff is bigger than just Facebook’s responsibility — that Facebook is just a battlefield in a greater misinformation campaign and that it’s up to the governments involved to resolve these issues.”

More broadly, multiple sources told BuzzFeed News that some inside Facebook think the blame cast on the company by the media and public feels reactionary and somewhat hypocritical. “Before the election the digital community was complaining that Facebook was this monopolistic power that was overly censorious and buttoned-up. And now the same group is saying, ‘how'd you let Breitbart and fake news get out there?’” a second former employee who recently left the company said. “And they have a point — ultimately it's because the election didn't go the way they wanted. It's worth pointing out that 12 months ago people said, 'I hate Facebook because they don't let all voices on the platform,' and they're upset and asking for Facebook to restrict what’s shown.”

“The view at Facebook is that ‘we show people what they want to see and we do that based on what they tell us they want to see, and we judge that with data like time on the platform, how they click on links, what they like,’” a former senior employee told BuzzFeed News. “And they believe that to the extent that something flourishes or goes viral on Facebook — it’s not a reflection of the company’s role, but a reflection of what people want. And that deeply rational engineer’s view tends to absolve them of some of the responsibility, probably.”

For Facebook’s critics, this view is tantamount to the company’s original sin — one that’s exacerbated by its leakproof culture and what some employees describe as a hive mind mentality.

Moreover, it is largely driven from the top down. CEO Mark Zuckerberg seems to project two perhaps antithetical views: that Facebook has great power to connect the world for the better, but only limited influence when it comes to efforts to destabilize democracy. A source who has worked closely with Zuckerberg said he sees the founder and CEO as approaching Facebook’s role in the election with none of the hysteria that’s reflected in the press.

“He’s treating it with a level of urgency,” this former senior employee told BuzzFeed News. “We’re not going to see a knee-jerk reaction to this from him — he’ll be very restrained with any potential tweaks to the platform because he's more interested in substance than optics.”

“Zuck tends to have a pretty unemotional and macro–level view of what's going on,” another former Facebook employee explained. “He’ll look at data from a macro level and see the significance, but also see that the data shows that nobody wanted to read the liberal media stuff — that [the mainstream media] didn't target half the country with their content.”

For many outside observers, the idea that the social network potentially played an outsize role in election interference by a foreign government is confirmation of their worst dystopian fears. The fact that the Russian ads were likely targeted using personal information provided by users themselves tugs at long-held suspicions that Facebook knows too much about its users and profits wildly from it.

Yet those with knowledge of Facebook's ad system say there's a solid case to be made that the disclosed Russian ad spend — and even the reported millions of impressions those ads received — pales in comparison to the billions spent by political groups on Facebook's ad platform in the run-up to 2016, and to the hundreds of millions of impressions the platform delivers daily across all types of paid and unpaid content. Basically: set against Facebook's unprecedented scale, the scandal's impact looks far less consequential than news reports would suggest.

The greater, perhaps more existential issues, former employees argue, are Facebook’s filter bubbles, the increasing misinformation and hyperpartisan news that flourishes there as a result, and the platform’s role as arguably the single largest destination for news consumption.

Sources familiar with recent discussions inside Facebook told BuzzFeed News there’s some concern that the strong reaction to 2016 election meddling and the desire for fast reform could push the company to assume a greater role in determining what is or isn’t legitimate news. “That Facebook played a significant part as perhaps the most important online venue in this election is not up for debate,” one of these people said. “But what we need to be debating is: What is Facebook’s role in controlling the outcomes of elections? I’m not sure anyone outside Facebook has a good proposal for that.”

Facebook, too, has long been concerned about assuming any sort of media watchdog role and the company’s objection usually takes the form — as it did last week in an interview with Facebook COO Sheryl Sandberg — of its well-worn argument that Facebook is a technology company, not a media company. “We hire engineers. We don’t hire reporters. No one is a journalist. We don’t cover the news,” Sandberg told Axios’s Mike Allen.

Antonio Garcia Martinez, a former Facebook employee who helped lead the company's early ad platform, worries that the momentum to correct for what happened during the 2016 election will push Facebook a step too far. "Everyone fears Facebook's power, and as a result, they're asking them to assume more power in the form of human curation and editorial decision-making," he said. "I worry that two or three years from now we're all going to deeply regret we asked for this."

This gulf between the way the company sees itself and the way it is increasingly being viewed by outside observers threatens to undermine Facebook’s awareness of crucial issues that need to be addressed, he says.

Fbook exec asked about Russia: “I don’t want to deflect but...Let’s not forget all the good the Facebook platforms bring to the world.”

— @NellieBowles, via Twitter

To illustrate this, Martinez points to Facebook's "filter bubble" problem — that the platform's design pushes its users into echo chambers filled with only the news and information they already want, rather than the potentially unpopular information they might need. “What worries me is that we've talked about the filter bubble problem for years now. And the company — and all the other platforms — have largely batted the concerns aside. But finally we're seeing the filter bubble at work now in a very real way,” he said. Facebook, Martinez suggests, will weather its PR struggles. What remains to be seen is whether the company can learn from the chaos and emerge with a better ability to see outside itself.

“I think there's a real question if democracy can survive Facebook and all the other Facebook-like platforms,” he said. “Before platforms like Facebook, the argument used to be that you had a right to your own opinion. Now, it's more like the right to your own reality."

Meanwhile, those inside Facebook continue to struggle with what, exactly, the company is, and what it is responsible for.

“There are times when people at Facebook would gloat about the power and reach of the network,” a former senior employee said. “Somebody said with a straight face to me not terribly long ago that 'running Facebook is like running a government for the world.' I remember thinking, 'God, it’s really not like that at all.'"

This post has been updated to clarify that Zuckerberg dismissed Facebook's role in changing the outcome of the election as a "crazy idea," not the "fake news epidemic" as previously stated.
