YouTube has a content crisis — again. On the heels of the company’s child exploitation problem, it finds itself facing a new wave of criticism after high-profile YouTuber Logan Paul posted a video of a dead body while filming in Aokigahara, Japan’s so-called “suicide forest.” The Logan Paul controversy is just the latest for a company that has increasingly had to contend with criticism over what kind of content is appropriate on its platform — and how it unevenly applies its own community guidelines.
YouTube, after a decade of being the pioneer of internet video, is at an inflection point as it struggles to control the vast stream of content flowing across its platform, balancing the need for moderation with an aversion toward censorship. In the past 12 months alone, it has been embroiled in controversies including anti-Semitic rhetoric found in videos of its biggest star, PewDiePie, an advertiser exodus over videos featuring hate speech or extremist content, and the disturbing and potentially child-exploitative content promoted by its algorithm. With every new misstep, it has alternately angered the creators it depends on for content, turned off advertisers, and confused users about how, exactly, it makes decisions about which videos can remain on its platform, what should be taken down, and what can be monetized. The Paul video is just the latest manifestation of that struggle.
In this case, the sensational video of a dead body, an apparent death by suicide, was live for more than 24 hours before being taken down by Paul himself after mounting public backlash. (Paul’s PR representative did not return a request for comment.) In that time span it was viewed more than 6.3 million times, according to New York magazine. The video fits within a larger pattern of controversial content and highlights how YouTube has created a system of incentives for creators on its platform to push boundaries.
“Let’s be honest, this flare-up on Logan Paul is going to die out eventually,” Sarah Roberts, an assistant professor at UCLA who has been studying content moderation for seven years, told BuzzFeed News. “But there’s a bigger conversation to be had: To what extent is YouTube overtly and tacitly encouraging individuals to push on the outrageousness factor [in producing content]? Do they need that to keep the engagement going?”
YouTube on Tuesday acknowledged that the video did violate its policies for being a graphic video posted in a “shocking, sensational or disrespectful manner.” “If a video is graphic, it can only remain on the site when supported by appropriate educational or documentary information and in some cases it will be age gated,” a company spokesperson wrote in an emailed statement to BuzzFeed News.
But the Logan Paul incident highlights the consistently inconsistent application of YouTube’s content moderation rules. YouTube did not respond when asked if it had initially reviewed and approved the video to remain on the platform. According to a member of YouTube’s Trusted Flagger program, however, when the company manually reviewed Paul’s video, it decided that the video could remain online and didn’t need an age restriction.
YouTube also said when it removes a video for violating community guidelines, it applies a “strike” to a channel; even though Paul deleted his own video, it gave his channel a strike. If a channel accrues three strikes within a three-month period, YouTube shuts the channel down, per the company’s community guidelines. Notably, Paul had demonetized the video when he first posted it — meaning neither he nor YouTube earned any advertising revenue from it. On Wednesday, Paul tweeted that he's "taking time to reflect" and plans to take a break from his vlog. He didn't specify how long he plans to step away.
Paul’s video isn’t something artificially intelligent moderation could catch on its own, two experts with a focus on content moderation told BuzzFeed News. “What is obscene is having shown and been disrespectful about the body of a suicide victim,” said Tarleton Gillespie, who studies the impact of social media on public discourse at Microsoft Research. “This is the kind of contextual and ethical subtlety that automated tools are likely never to be able to approximate.”
What’s more, the decision that Logan Paul crossed the line is one that fundamentally involves an exercise of moral judgment, according to James Grimmelmann, a professor of law who studies social networks at Cornell. “You have to look at what's considered decent behavior in the user community YouTube has and wants to have,” Grimmelmann said. “You can't just turn a crank and have the algorithm figure out your morality for you.” In that sense, YouTube did ultimately make a value judgment on the Logan Paul video, based on the reaction of its own community, by publicly saying it violated its policies.
Of course, that’s not how the company wants the public to view its role. YouTube has remained largely silent on the fiasco, while Paul has issued two apologies. “Firms have done such a good job of positioning themselves so that when something like this happens, they can wash their hands of it and say, ‘We’re just the dissemination channel,’” said Roberts. “But I would push on that and ask — what’s YouTube’s relationship with Logan Paul?”
Paul is a marquee YouTube star. He is a main character in The Thinning and Foursome, two ongoing YouTube Red Original series — high-quality exclusive shows that the company distributes on its paid subscription service, YouTube Red. Paul has had a YouTube channel since 2015, and in that time he’s accumulated 15 million subscribers and nearly 3 billion views. YouTube knows Paul’s irreverent style of video, and Paul knows what does well on the platform. “In this case, this guy is a top producer for YouTube,” said Roberts. “It becomes harder to argue the video wasn’t seen in-house.”
Compounding the problem is that YouTube itself likely has no way of knowing exactly what content is on its platform at all times — especially with users uploading nearly 600,000 hours of new video to YouTube daily. “The problem with current digital distribution platforms is the micro-targeting of content to users,” said Bart Selman, a Cornell University professor of artificial intelligence. “In fact, a well-tuned ranking algorithm will make sure that extreme content is only shown to people who will not feel offended — or may even welcome it — and won't be shown to others.” The bubble of micro-targeting is pierced when disturbing videos go viral and attract a lot of public attention and media scrutiny. But that’s the exception, not the norm.
And that leaves the public to exert pressure on YouTube. Still, exactly how YouTube’s complex system of human moderators, automated algorithms, policy enforcement, and revenue generation works together to police and promote videos remains a black box — and that’s an issue. “Those key ingredients are under lock and key,” UCLA’s Roberts said. “One positive outcome of these incidents is that the public asks new questions of YouTube.”
“We are all beta testers and a focus group, including how content moderation is applied,” Roberts continued. Now, YouTube will likely throw even more resources at its content moderation problem and communicate its strategy even more loudly to the public — something it has already begun to do — in an effort to outpace any regulation that might come down on the platform.
This story has been updated to clarify that YouTube applied a strike to Logan Paul's channel, per its Community Guidelines, for violating its rules on violent or gory content posted in a shocking, sensational, or disrespectful manner.
Davey Alba is a senior technology reporter for BuzzFeed News and is based in New York.