| Online Safety Act 2021 | |
|---|---|
| Parliament of Australia | |
| Citation | Online Safety Act 2021 (Cth) (No. 76, 2021) |
| Considered by | House of Representatives |
| Considered by | Senate |
| Legislative history | |
| First chamber: House of Representatives | |
| Introduced by | Paul Fletcher |
| First reading | 24 February 2021 |
| Second reading | 16 March 2021 |
| Third reading | 16 March 2021 |
| Second chamber: Senate | |
| Member(s) in charge | Jane Hume |
| First reading | 17 March 2021 |
| Second reading | 22 March 2021 |
| Third reading | 22 June 2021 |
| Amended by | Online Safety Amendment (Social Media Minimum Age) Act 2024 |
| Status | Amended |
The Online Safety Act 2021 is an Act of the Parliament of Australia intended to improve online safety.
In September 2019, the eSafety Commissioner issued a directive to internet service providers in Australia, requiring them to continue blocking websites hosting the video of the Christchurch mosque shootings. [1]
Until the act was passed, the eSafety Commissioner did not have formal powers enshrined in law. [2]
The act extends the remit of the eSafety Commissioner to include adult bullying and image-based abuse. [3]
In August 2021, the government opened its consultation on the Online Safety (Basic Online Safety Expectations) Determination 2021, a set of more detailed rules on how the law would be implemented in relation to end-to-end encryption on messaging services such as iMessage, WhatsApp, and Signal. Critics of the law argued that it was impossible for tech companies to detect unlawful content in encrypted messages without compromising user privacy or putting users' data at risk of being hacked. [4] [5] eSafety Commissioner Julie Inman Grant has maintained that end-to-end encryption does not absolve services of responsibility for hosting or facilitating online abuse or the sharing of illegal content, particularly online child exploitation. [6]
In 2024, the government published the Online Safety (Relevant Electronic Services—Class 1A and Class 1B Material) Industry Standard 2024, which sets out criteria for the content that must be removed. [7] Class 1A material typically refers to content relating to child exploitation and pro-terror content. [7] Class 1B material refers to content including extreme violence, the promotion of crime, and illegal drugs. [7]
In 2024, the eSafety Commissioner's office sent a complaint alert to X (formerly Twitter) regarding a video post by Celine Baumgarten that expressed political opinion and criticism about an LGBTQ+ club at a Melbourne school. The eSafety Commissioner stated in the complaint alert that the video was "cyber-abuse material targeted at an Australian adult" (the teacher in the video), and X subsequently geo-blocked the post in Australia. Baumgarten, supported by the Free Speech Union of Australia, lodged an application for review with the Administrative Review Tribunal (ART). The eSafety Commissioner argued that the "complaint alert" was not a formal removal notice under section 88 of the act and that the ART therefore had no jurisdiction to review the decision. The tribunal ruled in favour of Baumgarten, finding that the commissioner had made a reviewable decision. Emilios Kyrou, president of the ART, stated that "the notice amounts, as a matter of fact, to a removal notice under [the act] regardless of what was subjectively intended by the Commissioner… or whether the notice was legally effective under [the act]". [8]
The act was notably amended by the Online Safety Amendment (Social Media Minimum Age) Act 2024 to restrict the use of social media by people under the age of 16. [9]