Earlier this summer, Facebook's data science team came under intense scrutiny after publishing a study in an academic journal that revealed it had knowingly served up different content in 600,000-plus users' News Feeds for a psychological experiment focused on users' emotional states.
Today, in his first blog post in years, OkCupid co-founder and data scientist Christian Rudder posted a tongue-in-cheek response to the Facebook study backlash titled, "We Experiment On Human Beings!" The post documents some of the dating site's algorithmic experiments, including one where OkCupid intentionally deceived users on the quality of their matches. According to the post:
We took pairs of bad matches (actual 30% match) and told them they were exceptionally good for each other (displaying a 90% match). Not surprisingly, the users sent more first messages when we said they were compatible. After all, that's what the site teaches you to do.
But we took the analysis one step deeper. We asked: does the displayed match percentage cause more than just that first message—does the mere suggestion cause people to actually like each other? As far as we can measure, yes, it does.
When we tell people they are a good match, they act as if they are. Even when they should be wrong for each other.
The post (which you can read more about here) is a bit sardonic and overtly casual about the idea of a company experimenting with user data and algorithms. Early on, Rudder argues, "Guess what, everybody: if you use the Internet, you're the subject of hundreds of experiments at any given time, on every site. That's how websites work."
Shortly after the post went live, Rudder, who just finished a book on tech company data science, told BuzzFeed that he understands the criticism of the Facebook study but that this sort of experimentation is part of the fabric of nearly every media and tech company. "I understand a lot of the issues and why the anger is there. It's confusing. But people also need to understand that every website, every part of modern web development — nobody launches a redesign without testing on different users. It's just not unusual at all and I can't remember a time we launched a significant feature and didn't test it on 10, 20, or 30% of users."
For some, especially those in academic spheres, Rudder's post served as another reminder of the glaring disconnect between those at the helm of these enormous stockpiles of data and the expectations of their users. At stake in this criticism is whether companies can ignore ethical considerations simply because that's the cost of doing business in the "era of big data."
But Rudder appears to argue that users feeling manipulated is somewhat inevitable when dealing with algorithms of any kind. "Look, I think the Facebook thing, it got a hold of people with the narrative that they were controlling minds or purposefully making you sad and yeah, it's understandable how that can be interpreted," he said. "But at the same time, at OkCupid, if the algorithm changes, yeah, they go on different dates, discover different people, maybe even marry somebody different. But that's not me playing god, that's just a fact of the service. Any decision the site makes has those implications because people are really using these services in their lives."
Perhaps the most interesting thing about Rudder's post, though, is the initial reaction to it. While the Facebook study outrage fueled its own news cycle, as of this writing, the OkCupid post has yet to resonate with a wider audience. Some of that is attributable to Facebook's size and reach, which are considerably larger than OkCupid's, but it may also be a matter of tone. Rather than feign ignorance, Rudder's post, with its rather brash transparency, appears to take some of the wind out of the outrage cycle.
When asked if he had any advice for Facebook's data science team, Rudder largely defended the social network's study. "There's a lot I don't like about the site but this is one thing I think they did just fine. They're going to get blowback simply because people just hate them. Facebook has that relationship with the internet commentariat. So I think publishing it in an academic journal is way better than a casual 'check it out, y'all, we experiment on you guys!' blog post."
Rudder also made the argument that partnering with academics was an effort to use the research for public good. "I'm just glad I don't work there," he said. "They're public enemy number one when it comes to anything to do with data. In some ways they deserve it, though, given the way they advertise, just jamming information into the feed."
Charlie Warzel is a senior writer for BuzzFeed News and is based in New York. Warzel reports on and writes about the intersection of tech and culture.