Downing Street added to pressure on Google and Facebook on Friday, warning the internet giants that they "can and must do more" to stop hateful and violent material appearing on their platforms.
Theresa May's official spokesperson would not comment on whether there was specific evidence that the internet had played a role in the apparent radicalisation of the Westminster attacker, Khalid Masood.
As a general point, however, he told reporters that the "fight against terrorism and hate speech has to be a joint one" and that the technology companies weren't, in the government's view, pulling their weight.
"Social media companies have a responsibility when it comes to making sure that this material is not disseminated," the Number 10 official said. "And we have been clear repeatedly that we think they can and must do more. And we are always talking with them on how to achieve that."
"The message that we've delivered consistently is that we want them to do more and the ball is now in their court," he added. "Let's see how they respond."
Pressed by journalists as to what the government wanted Google, Facebook, Twitter and other internet platforms to do to stop extremist material appearing online, the spokesperson said: "Clearly we don't want this material to appear in the first place. Beyond that, where it does appear, we want it to be taken down as quickly as is possible."
Asked if there was specific intelligence that the Westminster attacker was inspired by or aided by social media, the spokesperson said no: "I'm talking in broad terms here."
He would not comment on precise details of an ongoing criminal investigation.
Facebook said: "There is absolutely no place for terrorist groups on Facebook and we do not allow content that promotes terrorism on our platform. Whenever we are made aware of this kind of content, we take swift action to remove it from Facebook and work with law enforcement and security agencies as appropriate. We take this responsibility very seriously and continue to work with the government to explore what more can be done to tackle extremism online."
Google declined to comment.
Downing Street's comments followed a front page report in Friday's Daily Mail that proclaimed Google "The terrorists' friend" because "vile manuals" describing how to carry out violent attacks were easily available on the search engine.
Google and Facebook have been under mounting pressure in recent weeks after an investigation by The Times found that advertisements for big companies were appearing on Google's YouTube video platform alongside clips promoting hateful, racist, or otherwise offensive views.
Numerous large companies have suspended advertising campaigns with Google as the publicity intensified. In Westminster, politicians also piled on the pressure. This month, executives from Google, Facebook, and Twitter were grilled by the UK parliament's home affairs committee for several hours.
Yvette Cooper, the chair, said Google's attempts to police its own standards were a "joke" after the company defended its decision not to take down an anti-Semitic video fronted by the American white nationalist and former Ku Klux Klan leader David Duke.
The Commons culture, media, and sport committee, which is chaired by the Conservative MP Damian Collins, is also looking closely at Google and Facebook.
While the newspapers' reporting has put Google and Facebook under the microscope publicly, lobbyists for Fleet Street have, behind the scenes, been urging the government and regulators to take action to curb the companies' increasing dominance over the news business.
With consumers increasingly getting their news from social media on mobile devices, editors have been replaced by algorithms. Advertising revenues that underpinned traditional publishing businesses are being gobbled up instead by the technology companies.
This month, the News Media Association, the lobbying group for British newspapers, urged the government to tilt the balance back towards the newspaper publishers.
The internet giants have come under fire after terror incidents in the UK before.
In its 2014 report into the murder of British soldier Lee Rigby, parliament's intelligence and security committee accused internet firms of unintentionally "providing a safe haven for terrorists", prompting then prime minister David Cameron to accuse the tech giants of allowing extremists to plot "murder and mayhem".
The criticism came after it emerged that one of the killers, Michael Adebowale, had indicated his murderous intent in a post on Facebook, but the post was not brought to the attention of law enforcement authorities.
The security services, the report found, had not applied for warrants to monitor the two individuals, despite the fact that they had featured in a total of seven MI5 investigations. But while the intelligence agencies escaped with only light criticism for the "errors", the committee said it was "unacceptable" that Facebook did not seem to think it had an obligation to identify security threats and report them to the authorities.
As home secretary, May pushed for the intelligence and law enforcement authorities to have greater powers to pursue suspected terrorists online.
In her speech to the Conservative party's annual conference in September, her first as prime minister, May appeared to put the social media giants on notice, citing "a household name that refuses to work with the authorities even to fight terrorism" as the sort of corporate behaviour she would not tolerate.
Alex Spence is a senior political correspondent for BuzzFeed News and is based in London.