Wikipedia’s Kiddie Porn Problem

Wikipedia’s self-policing isn’t working.

If we think the only sexual photographs people under the age of 18 take are the sexts they send to each other, we’re horribly naïve. A whole Internet subculture thrives on photos and videos teenagers take of themselves that may be illegal, but whose subjects look close enough to 18 that it’s unlikely anyone could ever be prosecuted for looking at them.

On just about any amateur porn site with user uploads, a certain portion of the material is going to be child pornography. That’s just the way it is.

Among the sites hosting this kind of material is the giant, vital hub of information managed by Wikimedia. And as with elsewhere on the Internet, it’s difficult to police. How are you supposed to decide what’s legal, particularly when most people’s impulse is to avert their eyes?

But you don’t have to spend hours browsing jpegs to find it on Wikipedia. It’s there in black and white text. There’s the poetically named Erotic_Human_in_bathroom.JPG, and there’s Human_Penis.jpg. These were uploaded by users who have the years 1995 and 1996 in their user names, respectively — which can be construed as fairly obvious evidence they’re under the age of 18. The latter image has been on Commons for nearly a year. Another photo was taken on a Nintendo 3DS and is actually named Boys_penis.jpg. These files are clearly posed sexual photos of underage boys, yet they have all gotten past users who are supposed to police themselves.

The most egregious example of all is a photo of a child uploaded last October, called M_penis.jpg. “A boys penis. erected,” reads the description.

In late April of 2010, Wikipedia was hit with the sort of public relations crisis that no online media outfit wants to face: The company was “DISTRIBUTING CHILD PORN,” according to a Fox News report. Fox, which had been tipped off by Wikipedia co-founder Larry Sanger, who has taken occasional potshots at the foundation since leaving (or being forced out) in 2002, reported that Wikimedia Commons was hosting “explicit and detailed drawings of children performing sexual acts.” The Fox report wasn’t the first time Wikipedia had been targeted for hosting what some considered child pornography. And it wasn’t the first time that one of its executives, Wikimedia deputy director Erik Möller, had been called out for suggesting that sexual contact with children could be acceptable in some cases.

Your average media company would have cleaned house, but Wikipedia operates under its own rules, unfettered by shareholders or concerns about its reputation. And so the foundation pushed back, forcefully defending itself and Möller and denouncing Fox’s “deliberate misrepresentation of reality.”

Behind the scenes, however, Wikipedia founder Jimmy Wales himself began to delete some of the images. Fox noticed and declared victory.

Wikipedia’s users, of course, noticed too. And they freaked out. Wales was suddenly an evil antidemocratic censor out to destroy the site and everything it held dear. Wales gave in to their condemnation, and revoked a bevy of his own content-editing privileges.

The Wikimedia Foundation then ordered a study on “controversial content.” It found “many thousand sexual images.” It made a few recommendations. They were pretty much rejected by users. Wikimedia took no action, but the controversy practically vanished from public discourse.

But the dicks are still there. In all the time I’ve spent looking through sexual images on the site (for this article! for this article!), I think I’ve probably only seen a small percentage of the penises on there. Considering how many images aren’t categorized, it’s probably impossible to see much more than that. However, I’m certain I’ve seen child pornography in this small sample. And not the questionable drawings Fox was talking about — the real McCoy.

I’m definitely not the first person to have seen M_penis.jpg — at least two other people have, because they’ve edited the page. One person brainlessly added a tag, categorizing the photo under “human penis.” Apparently this user has seen so many penises on the site that it didn’t even register that he was staring at child pornography. The other user added a template suggesting the image was “low quality.” Not that there was anything illegal about it, mind you, just that it was “very small, unfixably too light/dark, or may not sufficiently demonstrate the subject of the picture.” This person looked long enough at the image to determine he didn’t think its composition was very inspired, but again, it didn’t dawn on him this was child porn. Or, you know what? Maybe it did. Maybe it did and, when this user considered the site’s history when people have suggested pornography deletions, he cynically concluded that the only way the image would ever be deleted, despite it clearly being child pornography, was if it was done on the grounds that this child porn wasn’t of high enough quality. Maybe that’s what happened.

I notified Wikimedia’s administrators of these photos, both to see how they would respond and because it seemed like I was obligated to do so. But it’s not entirely clear, especially for new users, how one goes about this.

Though the Commons notes in some places that it is supposed to adhere to federal and state of Florida law, there are no official guidelines or policies about child pornography specifically, and there are no administrators or moderation teams specifically tasked with finding or deleting these photographs. I ended up using the same standard “nomination for deletion” link that is used for the most benign removal requests on the website. Commons also has a process for “speedy deletions,” which is apparently used to fast-track problem images to a quick removal. According to the policy, certain types of images that are “obvious cases” and likely to enjoy “broad consensus” for removal can be speedy-deleted, such as duplicates, promotional content, and copyright violations. Curiously, child pornography is not among these.

It took a full week for a site administrator to see my deletion requests and decide he felt like carrying them out. Finally, five months after it was uploaded, the photo of that prepubescent boy’s penis was gone from the website.

I called Wikimedia for comment on this story and asked why it took so long for the website to delete these photos. “That’s not something you would normally see on the site for so many days,” said Wikimedia spokesman Jay Walsh. “Usually these are the sorts of things that our volunteers would be tracking pretty assiduously and watching those kinds of images more so than others.”

An e-mail from George Chernilevsky, the administrator who followed through on my deletion requests, seemed to contradict this. To Chernilevsky, it didn’t seem out of the ordinary that nobody would bother to delete the photos within a week. “Usually I do a review of a DR nominations after a week and don’t review nominations of the first day,” he wrote.

One would think the website would ban users who have been found to upload child porn. In the case of these four photos, however, the users were not banned and remain on Commons with their full user permissions. According to Walsh, there have been instances of child-porn deletion in the past in which “volunteers would contact and notify the foundation about their action,” he said, “and also those users who uploaded those images would probably be banned, in some cases banned indefinitely from the project.” In certain instances, Walsh said, the authorities have been contacted. (I contacted the FBI’s national press line, but the person who answered the phone told me they wouldn’t comment on whether it has ever investigated the website.)

As far as I can tell, Chernilevsky didn’t ban the users, notify the foundation, or alert law enforcement when he deleted the four images I found. When asked why he didn’t ban the users, he told me I could take the question to the Commons’ Administrators’ noticeboard if I had a problem. “However, deletion of content is enough for some users.” After talking with Walsh, I asked Chernilevsky if he has ever notified the Wikimedia Foundation when deleting child pornography or knew of any administrator who has done so. “No, never,” he said. The users have still not been banned.

Both Walsh and Chernilevsky stressed that discovery of child pornography on Wikimedia websites is rare. Chernilevsky said he had never before seen a pornographic photo of a child as young as the prepubescent boy on the site, and estimated that child porn makes up “0.1-0.2 percent” of all his deletions. Yet when I took another cursory glance into Commons, I found more of it.

I found two photographs of underage boys’ penises used on Wikipedia to illustrate medical conditions that aren’t exclusive to children. One, described as the penis of a 15-year-old, is used on the Wikipedia article for dorsal slit, a type of foreskin incision, and the Chinese-language Wikipedia entries for “penis” and “male reproductive system.” Another, described as the penis of a 13-year-old, is used on the German Wikipedia page for Langerhans cell histiocytosis, a rare disease.

Then I found Commons keeps the categories “topless adolescent girls” and “nude adolescent boys” on its site. It’s unclear why it would need such categories, but when I went to the latter, I found three pornographic photos of another teenage exhibitionist. They’ve been on the website and even categorized under “nude adolescent boys” for all users to plainly see since July. Yet nobody has bothered to delete this child pornography. Another item whose accurate Chinese description translates to “minor male erection photo” has been on the site since June. In total, I’ve come across seven more child porn photos since my first deletion requests. I put in deletion requests for these photos as well, but they remain on the site. (Update: They were deleted after the posting of this article)

There is something truly rotten in the culture of Wikipedia’s volunteer user base that makes it reluctant to properly police itself on certain subjects, and porn is certainly one of them. At the end of his e-mail to me, Chernilevsky gave me a list of three things to keep in mind:

1. Almost all Admins are very busy in real life.
2. An Admin has technical possibility, but isn’t obliged to delete and block all and always.
3. [Wikimedia is] not censored.

Wikipedia and Commons volunteers could find and swiftly delete child pornography if they really wanted to. But they face few of the worries — about reputation, or advertisers — that face other media companies, and they have simply chosen to opt out of even this set of online norms.

Despite its massive success, Wikipedia is constantly criticized for its shortcomings. The public voices regular doubts about its veracity, and some treat the site as a running joke. And yet Wikipedia plays an ever-greater role in how people get their information. If a print encyclopedia were found to have pornographic drawings depicting children in its volumes, the ensuing controversy would likely ruin its publisher. Of course, the idea of a “print encyclopedia” already feels like a dusty concept from a forgotten era. There is probably just one encyclopedia in our lives now. It’s the one where men like Jiffman compete to model for the auto-fellatio page, and it’s sprinkled in the seams with child pornography.

Jack Stuef is, among other things, a contributor to the Onion. He tweets here.
