The objects of the Commonwealth Online Safety Act 2021 (OS Act) are (a) to improve and (b) to promote online safety for all Australians. When regard is had to what Australia's eSafety Commissioner, Ms Julie Inman Grant, says on her official web site about her statutory role, her concern for making children (excluding the ideologically convenient elastic descriptor "young persons") "safe from online harm" is one shared by all right-thinking persons.
However, the OS Act (which was preceded by the Enhancing Online Safety Act 2015) is not confined to child protection. And what the Commissioner has been saying about the need to keep adults "safe from online harm" is controversial, not least because of the glaring ambiguity of abstractions including "safe", "safety", "harm", "inappropriate content" and "transparency", the powers that the Commissioner has under the OS Act to censor online content, and her support for limiting freedom of speech.
At the recent annual World Economic Forum (WEF) meeting at Davos, Switzerland, the Commissioner participated in a panel discussion on "Ushering in a Safer Digital Future" in which she made the following cautionary comment:
"We are finding ourselves in a place we have increasing polarization everywhere. And everything feels binary when it doesn't need to be. So, I think we're going to have to think about a recalibration of a whole range of human rights that are playing out online from freedom of speech to the freedom to be, you know, to be free from online violence or the right of data protection to the right to child dignity."
The Commissioner has described her over-arching statutory role as a manifestation of a broad socio-political "Safety by Design" quest – a quartet of abstractions: "the four Rs of the digital age are respect, resilience, responsibility and recovery". Nowadays, the word "respect" is increasingly used to insist that we must respect ideas which we may reject. This quest is being undertaken in co-operation with, and for, the Big Tech companies. That creed fits seamlessly with the WEF and its "Mission", "Governance" and "Partners".
As the world's first regulatory authority established to tackle existing and future online harms, Australia's eSafety Commissioner has a seat at the WEF table shaping governance of the coming extended reality world, working alongside 60 of the world's largest tech providers.
The common law of torts, with occasional supplementary statutory reforms, contains very specific tests for liability for personal injury. They are embodied in three inter-related concepts: reasonable foreseeability of physical and psychiatric harm based on risk assessment, the specification of discrete factors governing the standard of care, and limitations on the recoverability of remote damage.
The "harm" to which the OS Act applies in the case of adults, exhibits what might be regarded as a superficially similar quality in, for example, the definition in s 7 of "cyber-abuse material targeted at an Australian adult"which includes these words:
"... (b) an ordinary reasonable person would conclude that it is likely that the material was intended to have an effect of causing serious harm to a particular Australian adult; (and)
(c) an ordinary reasonable person in the position of the Australian adult would regard the material as being, in all the circumstances, menacing, harassing or offensive..."
However, the OS Act is directed at a categorically different form of alleged "harm". It is an emanation of the modern confected ideology that there are groups of people who, by reason of some characteristic, innate or chosen, are entitled to privileged protection because the group members are, without a single exception, "vulnerable", "powerless", "marginalised", "dehumanised", "demonised", and other insulting generalizations.
The Commissioner has the power to give a removal notice to the provider of a social media service, relevant electronic service or designated internet service. Section 45 of the OS Act provides for the responsible federal Minister for Communications, Urban Infrastructure, Cities and the Arts to make, by legislative instrument, a determination setting out "basic online safety expectations for a social media service, relevant electronic service or designated internet service" (my emphasis).

In the context of the OS Act, the Commissioner's recent claim to the WEF attendees at Davos that there is too much "polarization" can only be taken to mean that there is too much online "controversy" and debate, and that it has to be strictly controlled for "Safety's" sake. Thus, when regard is had to her web site and her use of the abstraction "recalibration", her Office's clear position is that Australia needs more censorship legislation: