A better way to regulate online hate speech: require social media companies to bear a duty of care to users

  • Written by Katharine Gelber, Professor of Politics and Public Policy, The University of Queensland

Hate speech is proliferating online and governments, regulators and social media companies are struggling to keep pace with their efforts to combat it.

Just this week, the racist abuse of Black English football players on Facebook and Twitter has brought the issue[1] to the forefront, showing how slow and ineffective the tech companies have been in trying to control it and underlining the urgent need for stronger laws.

Australia’s piecemeal approach

In Australia, the regulation of these kinds of harmful online practices is still in its infancy.

In February, a digital industry body drafted an Australian code of conduct on disinformation and misinformation[2], which most of the large tech companies have adopted.

However, this self-regulatory approach has been criticised by community organisations[3] for being voluntary and opt-in. Some have argued the code sets its threshold[4] too high by requiring an “imminent and serious” threat of one of the identified harms occurring. This could render it ineffectual.

The Australian Communications and Media Authority has been tasked[5] with reporting on the efficacy of the code; that report is due soon.

Read more: 6 actions Australia's government can take right now to target online racism[6]

Australia also has the “Safety by Design[7]” framework, developed by the eSafety commissioner. This is another voluntary code of practice which encourages technology companies to mitigate risk in the way they design their products.

In late June, the federal parliament also enacted a new Online Safety Act[8]. This legislation was developed in the aftermath of the live-streamed Christchurch massacre. It regulates some relevant types of harm, such as the cyber bullying of children and livestream broadcasts that could promote or incite extreme violence.

The Act creates a complaints-based system for removing harmful material, and in some cases the eSafety commissioner has the power to block sites. It also has a wide remit in terms of its coverage of a variety of online services. Yet, it only tackles a few specific types of harm, not the full spectrum of harmful speech.

There is also an inquiry[9] into extremist movements and radicalisation currently under way, conducted by the Parliamentary Joint Committee on Intelligence and Security (PJCIS). It is tasked with considering steps the federal government could take to disrupt and deter hate speech, terrorism and extremism online, as well as the role of social media and the internet in allowing extremists to organise.

The inquiry was due to present its report to the home affairs minister in April, and it is now overdue.

A far deeper problem

These moves are a step in the right direction, but they still attempt to tackle each specific type of hate speech as a discrete problem.

ASIO[10] has recently pointed to the problem of hateful echo chambers online. And Australian researchers[11] have highlighted how right-wing extremists routinely dehumanise Muslims, Jews and immigrants as a way of coalescing support behind radical worldviews and socialising people towards violent responses.

On mainstream platforms, this is achieved through a relentless diet of distorted news that supports conspiracy theories and the “othering” of marginalised communities.

Read more: Facebook's failure to pay attention to non-English languages is allowing hate speech to flourish[12]

We know existing laws to tackle these types of hate speech and misinformation are inadequate.

For example, we have civil laws against discrimination and hate speech, but they rely on victims to bring legal action themselves. Members of targeted communities can be deeply fearful of the repercussions of pursuing legal action, and doing so can come at enormous personal cost.

What other countries are doing

Governments around the world are struggling with this problem, too. In New Zealand, for instance, there has been considerable debate[13] over hate speech law reform, particularly whether there must be a clear link to violence before hate speech regulation can be justified.

Germany has enacted one of the toughest laws[14] against online hate speech, imposing fines of up to €50 million for social media companies that fail to delete “evidently unlawful” content. Civil rights advocates, however, argue that it encroaches on freedom of expression[15].

France, too, passed a law last year that would have required online platforms to take down hateful content flagged by users within 24 hours, but a court struck down this provision[16] on the ground that it infringed on freedom of expression in a way that was not necessary, suitable, and proportionate.

A potential new model in the UK

There is another, more comprehensive approach in the United Kingdom that could provide a path forward.

The Carnegie UK Trust has developed a proposal to introduce a statutory duty of care[17] in response to online harms. Similar to how we require the builders of roads, buildings or bridges to exercise a duty of care to the people using them, the idea is that social media companies should be required to address the harms their platforms can cause users.

The UK government incorporated this idea into its Online Safety Bill[18], which was just released in May for public discussion. Trumpeted as a “new framework to tackle harmful content online”, the mammoth legislation (which runs to 145 pages) is framed around duties of care.

There are still some concerns. The Carnegie UK Trust itself has critiqued[19] a number of aspects of the bill. And the powers given to the culture secretary are of particular concern[20] to free speech advocates.

Despite these concerns, there is much to be said for the overall approach being pursued. First, the legislation sits within the existing framework of negligence law, in which businesses owe a duty of care, or responsibility, to the general public who use the facilities they create and enable.

Second, it places the burden of responsibility on the social media companies to protect people from the harm that could be caused by their products. This is a better approach than the government penalising social media companies after the fact for hosting illegal or harmful content (such as happens under the German law), or requiring an eSafety commissioner to do the heavy lifting on regulation.

Read more: Will the government’s online safety laws for social media come at the cost of free speech?[21]

Most importantly, this approach allows for broad coverage of existing — and emerging — types of online harm in a rapidly changing environment. For example, online speech that constitutes a threat to the democratic process would fall under the new law.

While the detail of the UK bill will no doubt be debated in coming months, it presents an opportunity to effectively tackle a problem that many agree is growing in scale and volume, yet is simultaneously very difficult to address. A statutory duty of care may be just what is needed.

Rita Jabri Markwell, an advisor to the Australian Muslim Advocacy Network (AMAN), contributed to this article. The civil society organisation has been monitoring online hatred and engaging directly with Facebook, Twitter and the Global Internet Forum to Counter Terrorism.

References

  1. ^ brought the issue (www.cnbc.com)
  2. ^ Australian code of conduct on disinformation and misinformation (digi.org.au)
  3. ^ community organisations (au.reset.tech)
  4. ^ threshold (www.aman.net.au)
  5. ^ tasked (www.acma.gov.au)
  6. ^ 6 actions Australia's government can take right now to target online racism (theconversation.com)
  7. ^ Safety by Design (www.esafety.gov.au)
  8. ^ Online Safety Act (www.aph.gov.au)
  9. ^ inquiry (www.aph.gov.au)
  10. ^ ASIO (www.asio.gov.au)
  11. ^ Australian researchers (zenodo.org)
  12. ^ Facebook's failure to pay attention to non-English languages is allowing hate speech to flourish (theconversation.com)
  13. ^ considerable debate (theconversation.com)
  14. ^ one of the toughest laws (www.telegraph.co.uk)
  15. ^ encroaches on freedom of expression (www.politico.eu)
  16. ^ court struck down this provision (www.nytimes.com)
  17. ^ statutory duty of care (www.carnegieuktrust.org.uk)
  18. ^ Online Safety Bill (www.gov.uk)
  19. ^ has critiqued (www.carnegieuktrust.org.uk)
  20. ^ particular concern (www.politico.eu)
  21. ^ Will the government’s online safety laws for social media come at the cost of free speech? (theconversation.com)

Read more https://theconversation.com/a-better-way-to-regulate-online-hate-speech-require-social-media-companies-to-bear-a-duty-of-care-to-users-163808
