Facebook’s Cambridge Analytica scandal has everything: peculiar billionaires, a once-adored startup turned monolith, a political mercenary who resembles a Bond villain and his shadowy psychographic profiling firm, an eccentric whistleblower, millions of profiles worth of leaked Facebook data, Steve Bannon, the Mercers, and — crucially — Donald Trump, and the results of the 2016 presidential election.
On its face, the incident read as confirmation of many people's worst fears — that the online platforms we live on are manipulating us without our knowledge, using the personal information we provided in good faith. Add to it that one of those many unintended outcomes could have been Donald Trump's election and you've got the makings of a lasting outrage.
While the main players in the scandal — Facebook, Cambridge Analytica, and Trump — will occupy headlines, they’re just the context — the perfect conditions for genuine outrage. But the Cambridge Analytica scandal isn’t really about Cambridge Analytica at all.
This is a data collection scandal. It was triggered by a specific incident, but it is broadly about the ways massive companies track us, harvest information from us, and then sell us as coercion targets in sophisticated information campaigns that could be for anything from diapers to mattresses to anti-vax literature.
The story will endure not because of animosity toward political data use but because it perfectly touches on a deeper anxiety about our online privacy that's been building for years. Indeed, the Cambridge Analytica scandal could well be the catalyst for a much bigger targeting revolt — a full-scale personal and public reckoning with the way we've used the internet for the last decade. It's a moment that forces us, collectively, to step back and think about what we sacrificed for a more convenient and connected world. And on an internet that feels increasingly toxic, it's hard to look at the tradeoffs we've made and feel like we're getting a fair deal.
You can see this reckoning already beginning to play out across the media as the focus shifts from Cambridge Analytica's deeds to more general concerns about privacy and the degree to which our personal lives are catalogued so that we can be targeted by anyone with a dollar (or ruble) to spend online. Over the weekend, Ars Technica reported that Facebook has long requested access to contacts, SMS data, and call history on Android devices, hiding behind a confusing opt-in page as cover.
On Twitter, Facebook users are downloading their data and combing through it to highlight various abuses of trust: that the company kept a record of private phone calls and text message data, or that Facebook shared personal information with outside entities for ad purposes. Download your own Facebook, Google, and YouTube data and you'll find years' worth of personal information: every photo, comment, sticker emoji, and video you've ever posted online; every location you've visited and IP address you've logged in from; every search, every file, and every single website you've visited. It's binary confirmation of something perhaps we've all silently suppressed: every movement you make online — and even where you move about in the world — is fastidiously catalogued, analyzed, and ultimately sold.
In the wake of this revolt, a 2010 clip of Steve Jobs talking about privacy has found new resonance. In an interview with journalists Walt Mossberg and Kara Swisher, Jobs is asked to weigh in on a then-recent Facebook scandal and lays out his feelings on user privacy, as Mark Zuckerberg looks on from the audience. "Privacy means people know what they're signing up for, in plain English, and repeatedly," he says. "I'm an optimist; I believe people are smart, and some people want to share more data than other people do. Ask them. Ask them every time. Make them tell you to stop asking them if they get tired of your asking them. Let them know precisely what you're going to do with their data."
Today, the clip is presented as a face-to-face warning from Jobs to Zuckerberg and as proof of sorts that Facebook knew exactly what sort of surveillance tools it was building. Indeed, Facebook’s history of privacy scandals makes the current revelations feel all the more flagrant.
In 2007, Zuckerberg first apologized for Facebook's controversial Beacon ad program, which tracked purchases on outside sites and posted that information to a user's News Feed. "Instead of acting quickly, we took too long to decide on the right solution. I'm not proud of the way we've handled this situation and I know we can do better," Zuckerberg wrote at the time. Three years later, in a sweat-filled interrogation at another D conference, Zuckerberg repeatedly dodged direct questions about Facebook's cavalier attitude toward privacy. More than a decade after Beacon, Facebook is still behaving recklessly with our personal information and still making the same apologies — the only difference is that this time the advertising systems are far more complicated and the apologies come in full-page ads in the New York Times.
The Great Targeting Revolt isn't just about the platforms; some in the ad industry are coming clean too. On Sunday the Verge published an editorial by a former digital marketer suggesting that Cambridge Analytica is only the tip of the iceberg. "The contemporary Internet runs on the exploitation of user data," the author wrote, before citing examples of corporations flagrantly flouting user privacy to extract data. It may seem obvious, but the revelation is profound and the language is important. Marketing is passive, but targeting is active and invasive. A market is a place where things are bought and sold. A target is something that is hunted. Cambridge Analytica, the piece argues, isn't an outlier but the blueprint for the internet's vast online ad economy.
The scandal is now in its second week, and its full size is still unclear. We know that data obtained by a researcher was improperly shared with political consultants and that Cambridge Analytica did not delete the data at Facebook's request, but there's a whole lot we don't know for sure. Is this sort of improper data sharing an isolated incident, or is it part of a systemic information-control problem inside Facebook? Did the data get passed around to data brokers, states, or other companies besides SCL and Cambridge Analytica? Was the specific profile data used in any way to help target voters in the 2016 election? More importantly, is this data actually as useful in persuading voters as the targeting firms would have us believe? And are the people at Cambridge Analytica the psychographic targeting evil geniuses they claim to be? Or is it all just a bunch of bullshit — less "mindfuck tool" and more midlevel marketing?
There’s evidence to suggest that the Cambridge Analytica team were, at the very least, likely overselling their influence. “When we first saw the pitch the first thought was ‘Why would I pay all this money for this stuff when I could just get our guy from Facebook to do this kind of thing for us?’” a former senior Trump campaign official told BuzzFeed News. And some researchers and psychologists are dubious about the effectiveness of the Facebook data in political ad targeting. Speaking to Wired UK, one researcher argued that psychographic profiling is light on actual science. Another professor who studied voter behavior inside Facebook between 2010 and 2015 told the publication that the Facebook data gathered through questionnaires like Cambridge Analytica’s “may not measure what you really care about” and could have actually corrupted the ad targeting data the company had previously collected.
But none of that may matter much at this point. The scandal feels like a watershed moment no matter how effective Cambridge Analytica may have been. Not only has the outrage endured more than a full week of news cycles, but it now seems as if the Big Tech backlash has reached critical mass, sparking an unprecedented crisis. It's caught the eye of lawmakers across the world — Zuckerberg has been summoned by Parliament in the UK and asked to testify before Congress by multiple senators; Facebook is now under investigation by the Federal Trade Commission over its mishandling of Cambridge Analytica data; and company executives are getting skewered by governments in nations like the UK and, more ominously, Singapore, where officials suggest the company has lost their trust. It's prompted more direct talk — even from Zuckerberg himself — of regulation. Anti-monopoly advocates are calling on the FTC to restructure Facebook, suggesting a spinoff of Facebook's ad network and even the reversal of the company's WhatsApp and Instagram acquisitions to make them competing social networks. It even appears that inside Facebook itself there's confusion and a sense of dwindling morale. And while the blowback isn't enough to unseat Facebook in the App Store, regular, loyal Facebook users around the world have put momentum behind a #DeleteFacebook campaign — a novel act of online rebellion after plenty of previous privacy scandals failed to provoke one.
Some news events unexpectedly break through, take on new lives, and trigger a genuine, lasting movement. Moments like these are impossible to predict or control. The Parkland school shooting this February wasn't an outlier in terms of casualties or sheer scale of horror, and yet that particular tragedy's aftermath was a breaking point that prompted a full-throated rejection of the status quo. The Parkland students were simply fed up. Fed up with the notion of not having control over their own safety. Fed up with a system in which victims of gun violence seemingly have no agency.
While the situations are drastically different, the frustration at the heart of the Cambridge Analytica scandal is similarly about agency and control. For years we've read about privacy scandals, from the NSA's Prism program to data breaches everywhere from Equifax to Ashley Madison. We're no strangers to the warnings: the platforms we live on harvest our data, every detail of our online activity is the oil that powers the internet's economic engine, and there's little we can do about it, save for opting out completely. But this scandal lays bare the downsides of that tradeoff — how, despite their pledge to empower us, the platforms we live on have done just the opposite and stripped us of the agency to dictate what happens with our most personal information. Maybe it's because Facebook feels so personal or because the 2016 election was so divisive, but this scandal is shaping up to be the event that forces the scales to fall from our eyes. We've lost control. And we're fed up.
Every revolution needs to be ignited. Cambridge Analytica feels a lot like a spark.
If you want to read more about Facebook’s data scandal, subscribe to Infowarzel, a BuzzFeed News newsletter by the author of this piece, Charlie Warzel.
Charlie Warzel is a senior writer for BuzzFeed News and is based in New York. Warzel reports on and writes about the intersection of tech and culture.