Here's How YouTube Is Spreading Conspiracy Theories About The Vegas Shooting

And increasing the chances that users stumble down an algorithm-powered conspiracy video rabbit hole.

In the immediate aftermath of the Las Vegas shooting, Facebook and Google News faced harsh criticism for surfacing misinformation. But they weren't the only ones.

One video, from the conspiracy channel AMTV, hinted at the possibility that the shooter was acting on behalf of the government or another agency. It has since been removed, but AMTV's other shooting-related videos include "What if Las Vegas Shooter was a Gov't Agent & Not a Madman."

Meanwhile, a search for information on the gunman, Stephen Paddock, yielded videos from conspiracy sites attempting to politicize the shooting.


Throughout Monday, YouTube searches for the shooting were littered with results containing unproven theories trying to link the shooter to the anti-fascist movement.

The 12th search result on YouTube for “Las Vegas shooting” rn is someone talking about an eyewitness’ body language…

One possible explanation for the glut of conspiracy content in the aftermath of the shooting is that there was simply a lack of reliable information being uploaded — vetted news reports from trusted outlets often take longer. However, the problem persisted long after reliable reports were available: On Tuesday night — nearly 48 hours after the shooting began — YouTube was still surfacing conspiratorial content high in search results.

A YouTube spokesperson provided BuzzFeed News with a statement touting its dedicated news page, which users would have to navigate to instead of simply searching for something like "Vegas shooting."

"When it comes to news, we have thousands of news publishers that present a variety of viewpoints available on our news channel," the statement said. "When a major news event happens, these sources are presented on the YouTube homepage under ‘Breaking News’ and featured in search results, with the label ‘Top News.’"

While none of the conspiracy videos appeared under the "Top News" label, YouTube's site design creates some confusion. Only two "Top News" links are shown by default (users have to click to unfurl more), and un-vetted sources are displayed directly below them. As a result, less reliable links can appear as high as third in the results:


As of late Tuesday, the third result for "Las Vegas shooting" on YouTube was a video purporting to prove that the attack was a "false flag." By Wednesday morning it had more than 1 million views.

As a reader just pointed out, it's been two days and the third YouTube result for "las vegas shooting" is a false f…

The video now appears to have been removed.


The high placement of conspiracy videos in search results is significant at a time when users are searching for reliable information about breaking news events. But perhaps most importantly, thanks to YouTube's autoplay feature for recommended videos, when users watch one highly ranked conspiracy video, they're more likely to stumble down an algorithm-powered conspiracy video rabbit hole.

For example, BuzzFeed News clicked on the fourth search result, a video by "Squatting Slav TV" with over 100,000 views, in an incognito window, which surfaces search results without taking past browsing history into account.

The channel's "About" page says it "provides exclusive original content and interviews with some of the best known voices in the world of economics and precious metals." The video suggests that "the mainstream media's narrative that alleged shooter Stephen Paddock was a 'lone gunman' is 100% patently false," though law enforcement has offered no evidence to support that claim.

A look at YouTube's right-side recommendation bar shows videos that erroneously suggest there was a second shooter, as well as videos from Alex Jones' site, Infowars, which frequently peddles conspiratorial content.

For its part, YouTube stands behind the varied opinions and views on its platform. After BuzzFeed News inquired about them, a number of the videos in question were suspended, though others are still searchable and some have hundreds of thousands of views. And while many of the videos may comply with YouTube's community guidelines (they don't contain nudity, threats of violence, spam, or copyright infringement), they offer YouTube's more than 1 billion monthly active users easy access to a vast array of factually incorrect and extremist content. Just keep clicking.

Charlie Warzel is a Senior Technology Writer for BuzzFeed News and is based in Missoula, Montana.