Australia Says Facebook Took Too Long To Take Down The Christchurch Video. Now The Government Has Made It A Crime

    The new "Sharing of Abhorrent Violent Material" law could mean jail time or massive fines if social media companies don't take down violent videos quickly.

    Australia's parliament has passed a law that could send tech executives to jail for failing to remove violent videos quickly enough.

    The law requires social media sites, internet service providers and other content and hosting services to remove violent videos and livestreams "expeditiously", and to tip off the Australian Federal Police when they know violent material can be accessed in Australia.

    The government introduced the law in the wake of the Christchurch shootings in mid-March, in which a gunman opened fire on two mosques while livestreaming the attack to Facebook with a GoPro. The video was shared on Facebook and then spread to other platforms including YouTube and Twitter.

    The government has criticised Facebook for taking too long — just over an hour — to remove the Christchurch shooter's video.

    "It was totally unreasonable that it should exist on their site for well over an hour without them taking any action whatsoever," attorney-general Christian Porter said in an interview on Thursday.

    The Criminal Code Amendment (Sharing of Abhorrent Violent Material) Bill passed the House of Representatives on Thursday after the Senate waved it through on Wednesday. The law drew support from the government and Labor opposition, despite the shadow attorney-general saying Labor had "serious concerns" it was "poorly drafted" and would not achieve its purpose.

    The penalties for failing to remove content are up to three years in prison or a fine for an individual, or, for a company, a fine of up to 10% of its annual turnover. The government says the law's reference to annual turnover means a company's global turnover – which in Facebook's case was US$55.8 billion in 2018, implying a maximum fine of roughly US$5.6 billion – although the Law Council of Australia says the law is unclear on this point.

    The law targets "abhorrent violent material" — video, audio or images depicting rape, torture, murder and attempted murder, terrorism or kidnapping. The material must have been created by a person involved in the violence.

    Asked whether CEOs like Mark Zuckerberg could end up criminally liable, Porter said in an interview on Thursday that it would depend on the circumstances, but that they could be if they were "very deeply connected" with their platform. Either way, if events similar to those following Christchurch happened again, Facebook could face a fine of 10% of its annual turnover, he said.

    The law does not specify how quickly companies have to remove the content: the requirement is that they do so "expeditiously".

    Facebook said it was alerted to the Christchurch video 29 minutes after the broadcast started, and that the video was viewed 4,000 times before being removed. By the time Facebook had been tipped off, the video had already been uploaded to 8chan.

    "In the first 24 hours, we removed about 1.5 million videos of the attack globally," Facebook VP and deputy general counsel Chris Sonderby said in a statement after the shooting. "More than 1.2 million of those videos were blocked at upload, and were therefore prevented from being seen on our services."

    The law has drawn criticism for the speed with which it was devised and voted into law, the uncertainty over how it would apply, and its possible impact on journalism.

    Digital Rights Watch director Tom Sulston told BuzzFeed News that while he did not support streaming of hateful content on the internet and agreed it should be censored, the government's approach was a "kneejerk reaction".

    Sulston said the law was aimed at a small amount of extreme content but could jeopardise other benign or important content. He pointed to videos of police shootings in the United States. "Those might fall under this act because they literally feature someone being murdered. That means we do not have accountability over violent policemen," he said.

    "The internet is big, billions of people make Facebook posts and tweets every day. What this bill is doing is threatening all of that other content while targeting this one problem. We have to be careful because the internet is so much more than just that one thing."

    Sulston also said it could be technically difficult for large social media platforms to take down violent content quickly. Computers are "stupid and literal", he said, and automated processes can struggle to tell the difference between a violent fictional film and footage of actual murder.

    "You need to be a human to make a lot of those very difficult decisions because computers cannot tell the difference between a made-up shooting on 24 and a real one."

    The government has insisted media outlets need not fear the law. Journalists creating news reports "in the public interest" are exempt. In an op-ed published on Friday, Porter and communications minister Mitch Fifield said they were not aware of any Australian outlets that had engaged in conduct that would contravene the laws.

    "A free press is a fundamental pillar of a strong democracy, but so is the principle that the law should apply equally to all," they wrote.

    The law is the latest in a series of controversial pieces of technology legislation introduced by the government and supported by the opposition. Technology companies have threatened to leave Australia because of the laws.

    In December 2018 the parliament passed a law that could force tech companies to break encryption on chats. Labor voted for the law despite a number of MPs saying it was flawed.

    And in 2015, the parliament made it compulsory for service providers to retain communications metadata for two years.

    These laws showed that the government did not understand technology, Sulston said.

    "No-one expects the government to be experts on everything and that's why they should consult with the community and experts to develop policy that is reflective of not just community sentiment, but also practical realities and expert opinion," Sulston said.

    "The fact that this legislation was rushed through at great haste with no consultation really underlies the fact that it is technically poor and very vague."

    Porter said in an interview on Thursday that prime minister Scott Morrison would take the social media laws and the livestreaming issue to the G20, pushing for a global response.