Anyone who hosts a social media page or forum and allows others to post to it should remain alert to the risk of being sued for defamation and facing substantial damages claims. That risk exists even where the host has not posted or authorised the publication, and may be unaware of any defamatory comment.
There has been a spate of recent cases showing an increasing trend of social media posts containing defamatory material and, unfortunately, that trend does not look like abating any time soon. Those cases serve as a warning not only to the organisations and individuals who post comments but, importantly, also to those who provide a forum in which comments can be published.
Last year we reported on the High Court’s decision in Fairfax Media Publications v Voller, a landmark case in which the Court found that anyone hosting a social media page could be liable in defamation for comments posted to that page by third parties. The Court reasoned that anyone hosting a social media page provides others with a forum (to defame) and is therefore liable, together with the individual who posted the comment, for any defamatory comments posted on the page.
The Court considered that the host of the page would be liable even if the page or forum was monitored, and regardless of whether any comments of concern were removed promptly. That is consistent with the difficulty of trying to erase material from the internet once it has been published, and the Court appeared to acknowledge the ease with which material can be republished, liked, forwarded or retweeted.
Following the handing down of the decision in Voller, the former Morrison government moved swiftly to propose anti-trolling legislation intended to absolve the hosts or administrators of social media pages from liability. Specifically, it was proposed that hosts could avoid liability by maintaining a register of contributors and of those accessing the page, so that the details of those users could be disclosed if any publication was found to be defamatory. However, with the change of government, that legislation has not been passed and Voller remains the relevant authority. Accordingly, the current law remains that anyone posting defamatory comments, and the hosts of social media pages, may be liable as publishers of the defamation.
There has been no recent abatement in the readiness of social media users and commentators to post defamatory material to various social media sites. In the last couple of weeks, the media and publicity circus otherwise known as the Johnny Depp and Amber Heard defamation trial concluded with an order for damages of a size which only a US jury is capable of finding.
We have also seen former New South Wales Deputy Premier John Barilaro succeed in a claim against Google in the Federal Court over material posted to its YouTube platform by Jordan Shanks (posting as friendlyjordies), which the Court considered to be “a relentless, racist, abusive and defamatory campaign conducted on YouTube, a platform operated by Google”: see Barilaro v Google LLC [2022] FCA 650. The Court acknowledged the widespread nature of the publication, noting that YouTube is the second most visited website in the world, after Google. Mr Barilaro’s retirement from politics, having formally resigned from public office on 5 October 2021, is largely attributed to the defamatory material posted to YouTube.
It was also notable that the relevant pages encouraged viewers to post comments, usually at the conclusion of a video, which thousands of people did. In ordering that Google pay Mr Barilaro $715,000 (plus his costs, which are yet to be determined), the Court considered Google to have become a publisher of the defamatory comments and made a number of adverse findings in relation to its conduct, as well as in relation to the relative weakness (or absence) of any reasonable defence. In those circumstances, Google’s conduct in allowing and maintaining access to the video posts made it just as liable as the original creator and publisher of the defamatory material.
The judgment is still within the appeal period, so there may be more to come on that decision.
Schools and sporting bodies and clubs are examples of organisations that endeavour to foster community through communication and engagement with stakeholders via social media. However, they have also been highly prone to attracting defamatory comments.
Sometimes the comments are posted to a page or platform hosted by the organisation, or maintained with its approval, such as pages established for parents and friends associations, past students or former players.
Recent media reports (which appear to have become increasingly frequent) have highlighted the concern and distress to which teachers are exposed, not only in the classroom but online, as a result of comments posted about them in a personal or professional capacity, or about their school. The school itself may also face reputational damage, either directly or indirectly, through being targeted or seemingly embroiled in online disputes.
Recent damages awards have exceeded $100,000 for comments found to have defamed teachers. Examples of comments found to be defamatory include:
- that ‘we don’t want [principal] back at [school] at all’, together with assertions of corruption and bullying;
- that [principal] is ‘an evil, nasty horrible woman’ who had a ‘horrendous attitude’.
In the latter (well publicised) case, a school principal sued eight parents over material posted to a private Facebook page following a parent-teacher night, after the material became more widely public.
For those reasons, schools (and other organisations) are strongly encouraged to monitor their social media pages closely and to take swift action against unsavoury comments and those who post them. Those risks can be further managed by steps such as altering the settings on web pages and implementing strong policies, agreements and codes of conduct. In addition, organisations should assess their trade marks and consider whether they are sufficient to support action against misuse of the organisation’s name, including where there may be a need to prevent potential reputational loss. Registered trade marks can be a very useful tool in addressing that conduct.
Given that many organisations have shifted to engaging with stakeholders through social media, it remains essential not only to develop a social media strategy and framework to address the risks mentioned above, but also to ensure that it is understood and adhered to within the organisation. It is equally important to keep abreast of changes in technology. That may include reviewing the rules of access to, or participation on, a social media page and refining the settings used for those forums. It is notable that Facebook, in particular, has altered some of its settings since the Voller litigation commenced. Social media hosts have a lot to consider in the current environment and will need to pay close attention to their settings and to any changes made in the future.