Dylan Voller Is Suing Media Companies Over Facebook Comments From Readers

    Voller, who spent his teenage years in and out of juvenile detention, is suing three big media companies for defamation: News Corp, Fairfax Media (now Nine), and the Australian News Channel.

    Dylan Voller rose to prominence in terrible circumstances. After a troubled childhood, the Aboriginal youth spent his teenage years in and out of juvenile detention, incarcerated at the infamous Don Dale youth prison in the Northern Territory, Australia.

    In July 2016 shocking CCTV footage of Voller’s time behind bars was aired on the ABC’s investigative current affairs program Four Corners. One video showed 13-year-old Voller thrown onto a mattress; another showed him, at 14, being struck in the face by an officer after a minor misbehaviour.

    But one image in particular lingered in the national consciousness: a 17-year-old Voller, restrained and wearing a spit hood, as he was left alone for two hours in an adult prison. The damning footage sparked a royal commission into youth detention in the Northern Territory, and Voller has been in and out of the news ever since.

    Now 21, he is suing three big media companies for defamation. The companies — News Corp, Fairfax Media (now Nine), and the Australian News Channel (ANC) — posted news articles and video clips about Voller on various official Facebook pages in 2016 and 2017. These posts related to the royal commission, Voller’s time in custody, and even his poetry.

    But it’s not the posts, articles or video clips Voller is suing over: it’s the Facebook comments.

    Voller claims a number of comments on the posts defamed him by falsely suggesting, among other things, that he "savagely bashed" a Salvation Army officer, causing serious injury, and that he is a rapist. These comments were written by readers.

    The circumstances of the case — in which Voller does not allege the media companies knew about the comments, but rather that they ought to have known — are unique and could have major implications for how the media operates on social media.

    And that brings us to the NSW Supreme Court, where over Wednesday, Thursday and Friday, lawyers have been duking it out over whether the media companies can be held liable. Did the three companies, legally speaking, "publish" the comments?

    Voller says yes. His case argues that the media companies chose to have a Facebook page and chose to post articles about him on it; they invited readers to comment; and they should have known there was a "significant risk" of defamatory comments. He also argues the companies had the power to remove and hide comments, and could have prevented and monitored comments, but did not do so.

    The media companies say no. They stressed they had no knowledge the comments had been posted — "no phone call, email, letter, notice of concerns, anything of the like" — until the lawsuit was filed. The companies argue that Voller doesn’t suggest they knew about the comments, or had failed to take them down after being asked to — and with no knowledge and no notice, they can’t be held liable for publishing them.

    Voller’s barrister Tom Molomby SC declared it was "a very novel case", while James Hmelnitsky SC, acting for the three media companies, opted for "crunchy and interesting, from a defamation point of view".

    Just to set the scene of what it can be like when lawyers discuss social media, here is Molomby explaining the concept of clicking a link: "Those snippets or extracts [posted on the Facebook page] have what’s called a preview hyperlink attached to them. The user of Facebook can read the extract, or play the snippet where it’s a video, and the user can choose to go further and activate the preview hyperlink, which takes the user to the full video or the original article which the person can obviously read."

    Thankfully, there is a degree of self-awareness. Early in the hearing Justice Stephen Rothman joked: "Remind me Mr Molomby — Facebook is something on the internet, is it?"

    Molomby replied that as far as his acquaintance with the social network went, "I would describe myself as someone in the sandpit with the alphabet building blocks."

    Rothman later added: "Surprisingly given my position as a judge of the court, I do know how Facebook works. The court has a Facebook page I think, but I’ve never liked it."

    The first witness was a man who well and truly knows how Facebook works: Ryan Shelley, a social media consultant who wrote an expert report for Voller’s case about comment monitoring on Facebook pages.

    Shelley told the court that on public business pages, like those run by the media companies, you cannot turn off the comment function. But, he added, you can deploy a "hack" that effectively does the same thing.

    Shelley’s hack involves putting 100 of the most commonly used words in the English language ("a", "the", etc) on a Facebook filter list, causing any comment containing those words to be automatically hidden from the public.

    Only the commenter, their Facebook friends and the person running the page would be able to see the comment. The hack would automatically hide a large number of comments, and moderators could monitor and unhide them as appropriate, Shelley said.

    Shelley conceded that a one-word post such as "criminal" or "rapist" would not be blocked using this strategy. Picture comments would also slide through.

    "So in all those cases of someone posting a single word that is not on the list, a single picture, this hack does nothing to stop it?" Hmelnitsky asked.

    "That is correct," Shelley replied.

    The hack also can’t be turned on for just one post: "When it’s on it’s on, when it’s off it’s off," Shelley said.
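The effect of Shelley’s hack can be sketched in a few lines of code. This is purely an illustration, not Facebook’s actual moderation system, which is configured through a page’s settings rather than through code; the function name and the truncated word list are invented for the example.

```python
# Illustrative sketch of Shelley's filter-list hack: any comment containing
# one of the ~100 most common English words is automatically hidden from
# the public. The real list would have ~100 entries; a handful shown here.
COMMON_WORDS = {"a", "the", "to", "of", "and", "in", "is", "it", "you", "that"}

def is_hidden(comment: str) -> bool:
    """Return True if the comment would be auto-hidden by the word filter."""
    words = comment.lower().split()
    return any(word.strip(".,!?") in COMMON_WORDS for word in words)

print(is_hidden("He is a criminal"))  # True: contains "is" and "a"
print(is_hidden("Criminal"))          # False: a one-word comment slips through
print(is_hidden(""))                  # False: picture-only comments have no text
```

The sketch also shows the limitation Hmelnitsky pressed on: a single word not on the list, or an image with no text, passes straight through the filter.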

    Shelley also suggested companies could use random sampling to monitor for problematic comments. Instead of checking every comment, which would generally take a lot of time, companies could check every 2nd, 10th, 50th, etc.

    It’s not perfect, Shelley said, but if a couple of comments were potentially defamatory, it could signal a larger problem and call for certain posts to be monitored more closely: "When there’s smoke there’s fire."
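The sampling approach Shelley described amounts to reviewing every Nth comment rather than all of them. A minimal sketch, with an invented helper name, assuming comments are simply held in a list:

```python
# Hypothetical sketch of Shelley's sampling suggestion: instead of
# reviewing every comment, a moderator checks every Nth one, escalating
# to closer monitoring if the sample turns up problematic material.
def sample_comments(comments: list, every_nth: int) -> list:
    """Return every Nth comment for manual review."""
    return comments[::every_nth]

comments = [f"comment {i}" for i in range(100)]
checked = sample_comments(comments, 10)
print(len(checked))  # 10 of 100 comments reviewed
```

As Shelley conceded, this trades coverage for time: a defamatory comment between sampled positions is simply never seen.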

    In response to Shelley, the media companies put a series of social media editors on the stand to explain how they actually run various Facebook pages — and why the hack, in their view, is not viable.

    Timothy Love, the head of digital at ANC, is responsible for the Sky News Australia and Bolt Report Facebook pages. He told the court he regularly hides comments that are offensive or discriminatory, so that only the commenter and their friends can see them.

    ANC also uses a filter list of about 150 words — profanities, commonly misspelled variations of profanities, and racist slurs such as the n-word — to automatically hide comments containing those words, Love said.

    The court heard that Facebook already has a profanity filter that can be turned on or off, but some media outlets maintain a list of their own as well.

    Love told the court he rarely blocked users from commenting altogether, but had resorted to this a few times, particularly during the postal survey on same-sex marriage: "There was some users who were very passionate about that and made their thoughts known in ways we thought were unacceptable."

    Brighette Ryan, digital night editor at The Australian, agreed that she rarely blocked people outright, offering this bleak explanation: "The necessity to block users isn’t really there, because if they are abusing another user normally they might be swearing, so the comment would be hidden anyway."

    Ryan said Shelley’s hack would probably work in getting rid of most comments, but said "it wouldn’t make sense" as it would involve someone sitting and unhiding comments all day.

    In response, Molomby offered the radical — possibly even utopian, to some — suggestion of a world with no comments section.

    "I rather meant it as, you would never have any comments," he said to Ryan. "You would be on the Facebook page with the snippet of the article … but there just wouldn’t be any comments there. Ever. Unless some slip through, and the few that slip through would be dealt with fairly easily because there would be so few and they’d be fairly short. Why couldn’t that be done?"

    "It’s not the way … I imagine our users would get frustrated," Ryan said. "People go onto Facebook to comment. And if comments weren’t appearing it would defeat the purpose of the page, one of the purposes of the page."

    "So as News [Corp] sees it, for users, the ability to comment is really important?"


    Molomby also asked each editor specific questions about Voller: Did they know he was an Aboriginal youth? Did they know he had been in and out of detention? Did they know he was a vulnerable person, that he had limited financial resources? And had their organisation considered the kinds of things people might write about him on Facebook?

    Love said ANC gave no consideration to what sorts of comments about Voller might be triggered by the things they posted on Facebook.

    Ryan and Sophia Phan, social media editor at Fairfax, both said they were unable to say if their organisations gave it any thought.

    The only one who answered yes was former Centralian Advocate editor Carl Pfeiffer, who said there would have been "some consideration" of the kind of comments that would likely be made about Voller if they posted stories about him on Facebook.

    "We did know that Dylan Voller was a very emotive subject. There was a lot of debate going on in the Alice Springs community about him," Pfeiffer said.

    Pfeiffer said he knew some people would be sympathetic to Voller, and others would be hostile.

    "It wouldn’t be a novelty to you at all that there would be people who would make very unfair criticisms of someone like Mr Voller?" Molomby asked.

    "Yes," Pfeiffer agreed.

    The novel nature of the hearing led to a number of analogies and hypotheticals, as the lawyers and Justice Rothman tried to apply existing case law to the Facebook comment dilemma.

    "You could have graffiti on the site of a local council wall that you would never suggest was a publication by the council, even though the council owns the wall and may be aware of the graffiti," Rothman said at one point.

    Rothman also asked Ryan: "What if you’ve got for example the head of the Ku Klux Klan in New South Wales, and that person habitually posted racist material [on The Australian’s Facebook posts], none of which contained profanities?"

    Her response: "I imagine if we did see that being posted, that would be blocked."

    Justice Rothman has reserved his decision.