This Court Ruling On Facebook Comments Is A Huge Headache For The Media

    Always... vet the comments?

    They say never read the comments.

    But an unexpected court ruling from an Australian judge could mean journalists and media companies have to spend more time trawling the comment section than anyone would advise.

    Dylan Voller, the Aboriginal man who became a household name after footage of his shocking treatment in youth detention was revealed on the Australian national broadcaster, is suing News Corp, Fairfax Media (now Nine), and the owner of Sky News Australia.

    Voller says he was defamed by comments written by readers on Facebook posts in which each company shared stories about his life and time in custody.

    The case is ongoing — but before it went to trial, a central question needed to be answered: could Voller even sue the media over comments written by other people?

    The media companies argued there was no way they could be responsible for comments they didn't write and didn't even know about before they were posted.

    But on Monday, to the shock and surprise of lawyers and journalists alike, New South Wales Supreme Court justice Stephen Rothman found they can be held liable.

    Why was it such a surprise?

    Marque Lawyers partner Hannah Marshall told BuzzFeed News that in the past, intermediary publishers like Facebook tended to become liable only once they had been told about potentially defamatory material and failed to remove it.

    She cited a case in which a woman named Janice Duffy sued Google over defamatory search results.

    "The finding was Google was liable, but only after she contacted them and complained and they determined not to take them down," Marshall said.

    Voller did not contact the media companies and ask them to take down the comments — and the media companies argued at the hearing this meant they weren't liable.

    Rothman rejected that argument and found the outlets were primary publishers of the comments, because they could have vetted the comments before publication and had the power to hide or delete them.

    "The reason for the surprise around this decision is that people really expected the Facebook public pages would be treated in the same way as those other intermediaries," Marshall said.

    The hack

    The central issue is whether or not media companies can vet the comments. There is no official way to turn off comments, or vet them before they are posted, on public Facebook pages.

    But at the hearing in February, social media expert Ryan Shelley proposed a hack that he said would do the trick.

    Administrators of public Facebook pages can apply a filter list that automatically hides comments containing certain words.

    A number of social media managers testified that they used such a list to hide comments containing profanities and racist slurs.

    Shelley proposed using the list in a comprehensive way: putting the 100 most commonly used words in the English language ("a", "the", etc.) on the list, so that any comment containing one of those words would be automatically hidden from the public.

    Only the commenter, their Facebook friends and the person running the page would be able to see the comment.

    The hack would automatically hide a large number of comments, and moderators could monitor and unhide them as appropriate, Shelley said.

    At the hearing, Shelley conceded that a one-word comment containing a word that is not among the most common (e.g. "criminal" or "rapist") would not be blocked. Nor would picture comments, which contain no text for the filter to match.
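    To make the mechanics concrete, here is a rough sketch of the filter logic Shelley described, simulated in Python. It does not use any Facebook API; the word list is truncated and the sample comments are hypothetical, but it shows why an everyday sentence would be caught while a single-word slur would slip through.

        # Rough simulation of the page-level word filter Shelley described.
        # The real filter is configured in a Facebook page's moderation
        # settings; this just reproduces the matching logic locally.

        COMMON_WORDS = {
            "the", "be", "to", "of", "and", "a", "in", "that", "have", "i",
            # ... in practice the full list of the 100 most common English
            # words would be entered here
        }

        def would_be_hidden(comment: str) -> bool:
            """True if the comment contains a filtered word, meaning it would
            be auto-hidden from the public until a moderator unhides it."""
            words = {w.strip(".,!?\"'").lower() for w in comment.split()}
            return bool(words & COMMON_WORDS)

        print(would_be_hidden("I think the story is unfair"))  # True: hidden
        print(would_be_hidden("criminal"))                     # False: stays public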

    Rothman accepted Shelley's proposal as evidence that media companies could indeed vet comments, as long as they hired sufficient staff.

    The media's reaction

    The ruling was met with opprobrium from Australian media companies.

    Marshall said Rothman's conclusions that media companies can easily vet comments and control their publication "do not necessarily sit comfortably".

    The word filter he relied on could "possibly be circumvented", she said, and news outlets cannot control Facebook comments in the same way they could control, say, letters to the editor.

    "The news outlets have to put the whole comment up or put none of the comment up," she said. "[With a letter to the editor] they can probe the author, assess the defensibility of the piece ... You can also edit it. You can say, we’re not happy with this comment, we’re going to take it out before we publish it.”

    Mark Williams is a solicitor who practises in defamation and an adjunct professor at RMIT. He told BuzzFeed News that Rothman had homed in on the way outlets use Facebook for commercial purposes.

    "What the judgement talks about is the fact that the media outlets are using their Facebook reach as a method of getting to people who wouldn’t otherwise be reading the stories they publish in the hard copy media," he said. "So Facebook is an integral part of driving traffic to their sites.

    "It’s really part of a business model and I think the judge is saying, if you live by the sword you die by the sword."

    Williams advised businesses using public Facebook pages to seriously rethink their social media strategy.

    "I would say any reasonable sized business that is worried about this decision has got to take this seriously," he said.

    "Either think about why they’re on Facebook and how Facebook works, or whether you’d be better off on one-to-one communications or [platforms] that delete instantly."

    What about me?

    When it comes to individual Facebook users — don't worry too much yet about the prospect of being held liable for something your embarrassing relative posts on your status.

    In this case, Marshall said, the same approach as in Duffy v Google should apply: you're not liable until you've been notified about the material and have failed to take it down.

    "That ability to apply filters, trying to catch all the comments and vet them, doesn’t apply to a private Facebook page," she said.

    But people should remain aware they can in fact be personally sued over what they write on social media. Anecdotal evidence from judges and lawyers suggests such disputes, between two people who are not media publishers, are on the rise.

    "For too long, ordinary users of publishing platforms have been taking things a bit too easy when it comes to publishing hurtful or defamatory content," Fernandez said.

    "There seems to be a misconception that they are immune ... but as court decisions, even in Australia, have shown on occasion, they are not out of reach from the long arm of the law."

    The difficulty with suing is often a logistical one, Fernandez said. Sometimes people are anonymous, or they may have no money.

    "People say I won’t bother with the small fry out there, I will go for the party with the bigger pockets ... people sometimes don’t just want the defamatory material removed, they want a whack of money. So the big publishers are sitting ducks."

    The elephant in the room

    Fernandez told BuzzFeed News that he understood why media companies were unhappy.

    "I share some of their unhappiness," he said. "The tech giants, the social media operators — Twitter, Instagram, Google, Facebook — to a large extent they are getting away, without being held to account, in local scenarios.

    "It’s a very legitimate question to say, hang on, aren’t we ignoring the elephant the room?"

    Fernandez said that one curious part of the ruling was the judge saying that media companies were the primary and only publishers — as opposed to being publishers along with Facebook and the person who actually wrote the comment.

    "That is very questionable. I am not sure what the judge was thinking when he arrived at that position," he said.

    "I would say let’s be realistic, let’s be fair, look at all the people who participated in the publication of this material and consider can any one of them be said to be more responsible, or should carry more liability, or responsibility for having facilitated that."

    What next?

    The case continues.

    Rothman's ruling found that the media companies can be held liable for the comments, but it was a preliminary decision and there is still a trial to come on whether they defamed Voller.

    And his controversial decision could yet be overturned.

    "Now a judge has made a finding that this can be publication, it will almost certainly have to go through the appeal process, possibly all the way up to the High Court," Williams said.

    "More pressure will come from media outlets to governments to say, this is all going mad or getting too hard."

    Marshall said that the way people use Facebook has required media outlets to adopt it in order to make money — and that has left them between a rock and a hard place.

    "On the one hand the news media platforms obviously need to manage their exposure to defamation claims," she said.

    "But on the other hand, if you hide all your comments or switch off comments altogether, because Facebook’s algorithm takes account of comments on a post, you’re going to be at a huge disadvantage.

    “I feel like the legal system and the internet are on this collision path. And what’s going to happen next is really hard to figure.”