How Child Porn And The Other Awfulest Things Ever Get Scrubbed From The Internet

Machines are a long way from being able to automatically remove the most awful images mankind has to offer — child porn, beheadings, forced bestiality — from our favorite sites and social networks. An invisible workforce has emerged to look at it so we don't have to. (Warning: You may find this piece upsetting.)

“Do you know what child porn is?” she asked me. A string of god-awful words came out of her mouth. “Infant decapitation” were just two of them. Cindy* spent years as a member of the CyberTips team at the National Center for Missing and Exploited Children, processing thousands of images, videos, emails, social network profiles and more that were flagged as possibly criminal content.

There’s a lot of stuff on the internet, and every day more gets added to it. Not all of it is kosher. Child porn. Narco executions. Beheadings. So an invisible workforce has emerged to help scrub the festering filth, one that is often poorly paid, in entry-level positions with little support or job security. As an interview earlier this week with a former Google worker showed, the psychological costs can be high.

“We were the 911 for the internet. We handled every single report of internet child porn,” Cindy said. “Man, I wished I worked at Google compared to what we were dealing with. Every week we saw about 25,000 reports and every single report had at least 200 to 500 images and videos to review.”

She worked the afternoon-to-evening shift. Going to work meant turning on a computer and sorting through a long queue of reports that came from a number of tech companies, as well as from concerned individuals. On a normal day, she said she could process 100, maybe 200 reports, although it felt “never ending.” She often saw tech companies overzealously reporting, erring toward an overabundance of caution: Pictures of Marge and Bart Simpson having sex, for instance, were classified as potential child porn. But she also saw the real stuff, every day.

“To have a naked child image — that’s not necessarily a crime, if you don't identify them and know their age,” she explained.

Looking at disturbing material for a living can make some workers feel “nothing.” Others say it will “rot you from the inside out,” says Heather Steele, the director of an organization that helps tech companies understand the impact of such material on workers. Experts say that the most pernicious effects of repeated exposure to horrific images are cumulative, and in several interviews with current and former workers in this field, people reported desensitization and isolation as the most common side effects. And many complain that they feel like hired eyeballs, the digital equivalent of day labor.

“When you're not close to the development process you are expendable as a paper airplane and they let you know it,” said one current employee who analyzes this type of content for a community moderation platform in Silicon Valley.

There’s no trade group — or even a common job title — for this kind of work. There's no one advocating for them, and more significantly, there's no way of tracking exactly how many there are. But between the behemoth tech companies like Google and Facebook, and outsourced data firms like the United States-based Telecommunications On Demand and Caleris, two companies mentioned in a 2010 New York Times article as processing millions of images, it seems safe to say these workers number in the thousands.

This is not to say that every individual in this line of work has had a bad experience: the ability to handle such a mentally demanding job differs from person to person. And tech companies say that they do offer special benefits to employees who view disturbing content for a living. Facebook has a “safety team” that is tasked with reviewing the most sensitive material, and according to a spokesman, they offer “in-house training, and also, counseling for our employees.” A Google spokeswoman told me that the one-year contracts (which can frustrate those looking to stay on for longer) were designed to ensure that no one held the most brutal jobs indefinitely. Also, Google brings in independent counselors to talk to teams about secondary trauma — the kind of trauma that comes from seeing abused people and not being able to help.

In her role at the National Center for Missing and Exploited Children, the country’s clearinghouse for every report of child porn logged by Google, Yahoo, Microsoft and others, Cindy was on the front lines of the daily war, seeing the worst of the worst. Two teams within the organization handle internet content: CyberTips, where companies are legally required to send red-flagged content, and the Child Victim Identification Program, which works to help find the victims. It was at NCMEC that the Safeguard program, a mandatory counseling and training protocol for helping employees deal with such disturbing imagery, now used by some law enforcement, was born in the early 2000s.

But Cindy told me that even with the rules in place, she left feeling let down by management. (It must be said that, this individual story aside, NCMEC is highly regarded amongst mental health professionals for their approach.) It wasn’t due to a fault in the hiring process, though: When she applied for the job on the then-seven person CyberTip team, she was shown some content and ordered to wait two weeks before accepting the position. With a law enforcement background, she thought she could handle the worst of the worst. She was only sort of right.

Part of her job was guessing ages, tracking down the identities behind screen names or IP addresses, and then passing on the info to the relevant authorities. Oddly, with government-level search engines at her disposal, she said the best way to find someone was often typing an email address, in quotation marks, into Google.

From her perspective, some tech companies did a better job handling their vast swaths of uploaded content than others. She singled out Microsoft, MySpace and MyYearBook as good partners. Google earned a “meh” rating. And at the time she worked on the CyberTip team, Facebook only had two people assigned to vet child porn, a woefully inadequate labor force. The worst, though, was Ning, the private social network, which leased their servers from Amazon.

“That was hell on earth," said Cindy. "It became the pedophile network. We’d break down the images and it was always going back to Amazon. But [Amazon] was good, they immediately cut ties.”

The National Center for Missing and Exploited Children had mandatory counseling sessions, as well as a lengthy training process. “You need to have the right mindset. I left every day grateful that I had never experienced anything like that in my life,” she said.

But as Cindy's experience with management began to sour, the workplace stresses began to mount. They were experiencing a massive surge in the number of reports coming their way — a ten-fold increase — and the cumulative effect of seeing hard-core child pornography began to eat at her. Her problems with her bosses were, she believed, due to their own desensitization as a result of years spent in the field. “All they cared about were these reports,” she said. “We might as well have been machines.” Her bad feelings got worse when she left the job: the images would come back to her at random times and she didn’t have co-workers to commiserate with.

“You have dreams about these kids, the recall that comes to your mind, you can see them. I had to go on an anti-anxiety medication — your anxiety increases,” she said. On the phone, Cindy told me she could still see, in vivid detail, the images of child victims being raped; it happens every time her previous career comes up.

She paused. “The worst thing,” she said, “is you would watch the kids age through the abuse.”

NCMEC deals with the truly worst of the worst images, and the tech workers I spoke with did not report that degree of trauma. Two former YouTube workers, for instance, described working conditions as okay: They were warned about the job prior to being hired; they worked on a team; and they saw more mundane images than truly shocking ones. At YouTube, flagged videos were separated into two tiers, and reviewers could often assess the content from thumbnails rather than watching the whole thing. There are teams watching questionable content around the clock all over the world.

Another downplayed the number of upsetting images. “It's disturbing, but only a small fraction of the videos are very disturbing; most of it is porn or innocuous,” he said.

According to Heather Steele, director of The Innocent Justice Foundation, an organization that since 2010 has provided training on how to help employees tasked with dealing with child porn, it is crucial to properly prepare workers and to keep tabs on their emotional well-being during and after employment.

“There are things that make it worse and things that make it easier,” she explained. Working in a dark room, social or physical isolation, and crazy hours can all exacerbate reactions to the imagery. Mandated breaks, flexible time off and access to counselors can improve conditions. “It’s a cumulative build-up, then negative effects build up over time,” she said.

Steele’s foundation has worked with teams at GoDaddy, Facebook, Yahoo, MyYearBook and other tech companies to provide models for how to treat employees. Proper screening at hiring can go a long way (people with unresolved issues around child sexual abuse are obviously not great candidates), but Steele also warned that personal issues like pregnancy and going through a custody battle can wreak havoc down the line. She was hopeful, though, that both law enforcement and tech companies are becoming more aware of these issues. “I think they want to do something. Up until now, people haven’t realized how traumatic it is,” she said.

Why continue to put human beings through this when the costs are so very high? No one I spoke with seemed hopeful that these filtering processes could be fully automated anytime soon. There are technologies like PhotoDNA and web crawlers that can identify and sort some kinds of content, but humans are still needed to differentiate the merely unpleasant from the criminal. One of the ways that victims can be re-traumatized, noted Marsha Gilmer-Tullis, a children’s advocacy director at NCMEC, is through the circulation of old pictures. “The image is frozen in time. We don't know, as these young kids become older, what are the worries,” she said. Without this specific form of digital labor, the internet would be awash in such material, damaging both to the viewer and to the subject.
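
To give a rough sense of the automated side: tools like PhotoDNA work by matching uploads against fingerprints of images investigators have already identified, and anything that does not match still lands in a human reviewer's queue. The sketch below is a hypothetical illustration of that pipeline in Python, using an exact cryptographic hash where a real system would use a robust perceptual hash, and with invented file paths and hash values; it is not how any particular company's system actually works.

```python
# Hypothetical sketch: match uploads against a list of known-bad image hashes.
# Real systems such as PhotoDNA use perceptual hashes that survive resizing and
# re-encoding; this toy uses MD5, so any edit to the file defeats the match.
import hashlib
from pathlib import Path

KNOWN_BAD_HASHES = {
    "9b74c9897bac770ffc029102a200c5de",  # invented digest of a previously identified image
}

def file_md5(path: Path) -> str:
    """Return the MD5 hex digest of a file's bytes."""
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_human_review(path: Path) -> bool:
    """Anything that doesn't match a known-bad hash still goes to a person."""
    return file_md5(path) not in KNOWN_BAD_HASHES

if __name__ == "__main__":
    uploads = Path("uploads")  # hypothetical folder of user-submitted images
    if uploads.is_dir():
        for upload in sorted(uploads.glob("*.jpg")):
            if needs_human_review(upload):
                print(f"{upload}: no match, queue for human review")
            else:
                print(f"{upload}: matches known-bad hash, auto-flag")
```

The point of the sketch is the asymmetry: matching catches only material someone has already seen and catalogued, which is exactly why new images still require human eyes.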

And perhaps for the right person, being a human filter could be a good job. After I published the former Google worker's account earlier this week, at least five people emailed me asking how to get employed there. “I have understood that human beings are inherently evil for a long time now,” wrote one job seeker, “so after that realization, nothing shocks me.”

Update: A previous version of the story stated that Ning was defunct. It is, in fact, alive.
