BuzzFeed News has reporters across five continents bringing you trustworthy stories about the impact of the coronavirus.
“I’m not a bot,” the anonymous Twitter user says, at the beginning of a message exchange in which he would eventually send a screenshot of his map location and a photo of the inside of his fridge.
This account had no identifying information, used the Incredible Hulk as its profile picture, and had recently sent a pro-UK government tweet that was word-for-word the same as tweets sent by dozens of other Twitter accounts:
“Journalism is missing the ‘mood’ in this great country of ours - the United Kingdom. We do not want or need blame,” the tweet said. “We do not want constant criticism of our government who are doing their very best in a very difficult and unprecedented global emergency.”
A longer version of this message has been shared tens of thousands of times on Facebook and Twitter in recent weeks, including by high-profile public figures such as Lord Alan Sugar.
Since then other high-profile figures, including several senior journalists and the football broadcaster Gary Lineker, have alleged that “propaganda bots” — automated accounts — have been spreading this same message in order to drum up support for Boris Johnson’s government.
Bots do exist, and there have been several concerning stories in recent years about foreign bots attempting to influence elections in the UK, US, and elsewhere.
But a lot of the time, what looks like foreign bot activity is nothing of the sort.
The truth is often something even harder to get your head around — people voluntarily choosing to copy and paste identikit slogans on social media to spread a partisan message or simply wind up their opponents.
As a wind-up, it works — the original so-called bot accounts are generally tweeting to very few people, but tweets calling them out have gone mega-viral.
The Incredible Hulk account that shared the pro-government message said it was not a bot.
Bot accounts are often recently set up, focus on one specific issue, and have a poor command of English if based in Russia or elsewhere.
But this particular account had tweeted 11,000 times since being set up three years ago, mostly sharing graphs of currency fluctuations rather than pumping out pro-government propaganda.
When asked more questions, the user gave a full name, saying he was based in Leicester in England. A quick Google search showed that a person with that name lives in Leicester, although the Twitter account could still have been run by an impersonator.
The account then sent a Google maps screenshot of its location — which still didn’t prove the user was who they said they were.
The account’s messages were sent in an English that was colloquial but seemingly that of a native speaker: “Pro brexit, not arsed either way about bojo [Boris Johnson]. I’d say I was more anti [Jeremy] Corbyn than anything else”.
The user said he didn't use Facebook, and did not want to email from an account bearing his name or to share a photo of an identifying document like a driving licence.
He was asked to send a photo of any identifiably British objects in his kitchen. Within a couple of minutes he sent back a photo of his fridge packed with reliably British items such as Robinson’s fruit squash, Tesco baked beans, and “Dairylea Dunkers” — breadsticks with a creamy dip.
Reverse image searches of the fridge photo using Google as well as Yandex, a powerful reverse image search engine based in Russia, showed zero results — but the image could still have been sent to the user on a platform like WhatsApp.
And so: It was theoretically possible this was a propaganda bot with a native’s command of the English language, impersonating a real man in Leicester, with a detailed knowledge of British politics, a long and varied Twitter history, with easy access to a fake GPS screenshot and a ready-to-go photo of a fridge containing Dairylea Dunkers.
But it was most likely a man in Leicester.
And if he’d been paid by the government to pump out propaganda, it seems an unusual decision to engage extensively with a journalist.
Why would anyone use Twitter under a pseudonym, using a picture of the Incredible Hulk instead of their own face?
“I’m a day trader on the stock market as will be clear by all the weird charts...the stock market has some dodgy characters on it, and I prefer to stay anonymous,” the user said.
And why did he tweet this pro-government message, word-for-word the same as the message tweeted by so many other social media accounts?
“I shared it because (I suspect like many others) it’s hilarious, and the left wing twitter police make me laugh.”
The original, longer version of the post said: “Let's get this message VIRAL and they might just take note.”
The message has certainly gone viral, with 112,000 people sharing a version posted by a user called Andrew Wilson on Facebook on April 23.
But rather than clicking “share” on Facebook, many people, including the Incredible Hulk account, have tried to make the message go viral in a different way — by copying and pasting a chunk of the text onto Twitter.
These identikit tweets generally came from accounts with a small number of followers and had few likes or retweets.
But lots were then screenshotted and shared by prominent accounts — reaching a far larger audience.
Many of these screenshotted tweets have replies underneath accusing them of being bots, often by users with #FBPE ("Follow Back Pro Europe") in their username.
In the UK, “bot” accusations generally fall along partisan lines, with left-wingers and pro-Europeans accusing pro-Conservative, pro-Brexit accounts of being bots.
Although claims of pro-government bot networks often reach a huge audience, there is not always the evidence to back them up.
The UK government was recently accused of creating a network of Twitter accounts posing as health service staff, in a series of tweets shared tens of thousands of times.
The fact-checking site Full Fact said there is "no evidence" to back up these claims, which were also denied by Twitter and the government.
A lot of the supposed “bots” of recent weeks are definitely real people.
Linda, a Twitter user with just eight followers, was accused of being part of a “propaganda campaign” last week.
BuzzFeed News tracked her down on Facebook.
“I am not a follower of any party and certainly not a so-called bot,” she said. “Someone posted something positive on Facebook and I shared it on Twitter.”
“I couldn't fit it into one post, so I divided it.”
Another Twitter user shared the text for a different reason — to wind people up.
“I just posted the tweet because people are saying bots are doing it,” said Josh. “Just made me laugh.”
It also wouldn’t be the first time Conservative supporters have posted botlike messages as a joke.
During the 2019 general election, thousands of nearly identical messages of support appeared on Boris Johnson’s Facebook pages.
This all seemed extremely suspect, and many of Johnson’s high-profile critics were quick to allege sinister activity.
But many of these messages came from genuine Johnson supporters who had been encouraging each other to "show your support for Boris Johnson by writing 'I support Boris 100%'... it’s driving Remainers crazy".
A similar thing happened in Spain recently: Hundreds of accounts posted botlike messages as a joke — and were suspended by Twitter for suspicious activity.
It was not possible to contact all the accounts that have been accused of being “bots” for sharing the coronavirus message, although several others had telltale signs of being real people.
For example, many had a name and a photo that matched up with a profile on Facebook or elsewhere on the internet.
These Twitter accounts could be impersonators, but several signs suggest they are not.
Most were not set up recently to pump out propaganda — they generally had a long history of tweets, often about politics but also discussing more prosaic issues like football or the weather, with the English language skills of a native speaker.
Most of these accounts have very few followers, suggesting their overall impact, bot or not, is limited.
It is impossible to prove that none of these accounts are bots. Some may be.
But we can say one thing with confidence — a lot of the accounts that have been called “bots” in recent weeks are in fact the accounts of ordinary British people.
Twitter has admitted that in the run-up to the 2016 US presidential election, more than 50,000 accounts on the site were “potentially connected to a propaganda effort by a Russian government-linked organization known as the Internet Research Agency”.
The company later made many of these tweets public.
Bots are a real concern in the UK too, and there is genuine evidence of bots attempting to influence the online conversation in the run-up to important elections.
But there are a lot of ordinary British people who tweet in a partisan way, often copying and pasting messages from elsewhere on the internet.
Sometimes they do this in a sincere attempt to spread the message; sometimes they do it just to wind people up.
So next time you see claims of “propaganda bots” attempting to influence British politics, remember: The truth may be even weirder.