Open letter calling for stronger measures in the Online Safety Act to protect children from sexual abuse or exploitation  

Dear Minister Rowland, 

We write to you as Australian child protection and anti-slavery practitioners to advocate for stronger protections against online sexual exploitation and abuse of children in the Online Safety Act, which is currently under review.

We commend the Australian Government for bringing forward the statutory review of the Online Safety Act (OSA) to address new and emerging harms and for establishing a Joint Select Committee to inquire into and report on the influence and impacts of social media on Australian society. 

We urge you to table the findings of the upcoming Rickard Review swiftly and to commit to introducing legislation amending the Online Safety Act in 2024 that requires technology companies operating in Australia to take stronger measures to prevent the production and distribution of child sexual abuse material on their platforms and services, including through livestreamed video. 

The Urgency to Act  

According to one authoritative source, 13% of children in the Australian and Pacific region have been subjected to online sexual solicitation in the last 12 months.1 Certain social media sites, such as OnlyFans, appear to be used to disseminate child sexual abuse material, sometimes with the involvement of parents and guardians. (See e.g. ABC Four Corners, Kidfluencers, May 2024.)

Australian men appear to participate in online child sexual exploitation at high rates compared to their global peers, and social media appears to exacerbate this. A nationally representative survey of almost 2,000 Australian men found that 7.5% reported having engaged in online sexual offending against children.2 Men who reported both sexual feelings towards children and having offended against them were avid users of youth-focused social media platforms. Compared to other men in Australia, this group was:

  • 4.00 times more likely to use YouTube,   
  • 4.31 times more likely to use Instagram,   
  • 2.52 times more likely to use Snapchat,   
  • 1.60 times more likely to use Facebook Messenger.   

They are particularly likely to gravitate towards encrypted platforms, which are more difficult to police. Compared to other men in Australia, men with a sexual interest in children are:  

  • 10.01 times more likely to use Telegram,  
  • 4.99 times more likely to use Signal,   
  • 3.06 times more likely to use WhatsApp,  
  • 3.01 times more likely to use Skype.   

In the case of livestreamed child sexual abuse, vulnerable children who may not be technology end users themselves experience repeated contact sexual abuse by a trusted adult, which is livestreamed to paying sex offenders around the world, including in Australia. The Australian Institute of Criminology (AIC) found that popular video call platforms such as Facebook Messenger and Skype have been used by Australian men to view the livestreamed sexual abuse of children in vulnerable countries such as the Philippines.1

Yet live video child sexual abuse or exploitation is not a crime restricted to developing countries like the Philippines – Australian children are also victims. According to the Australian Centre to Counter Child Exploitation, “Australian children as young as eight are being coerced into performing live-streamed sexual acts by online predators, who often record and share the videos on the dark net and sexually extort victims into producing even more graphic content.”5 In her first transparency report under the Basic Online Safety Expectations, the eSafety Commissioner noted, based on tech companies’ responses, that “the providers are neither taking action to detect [child sexual exploitation and abuse] CSEA in livestreams (insofar as any of these could be regarded as livestreaming services) or taking action to detect CSEA in video calls or conferences.”2

What We’re Asking For 

Specifically, we recommend the following measures be adopted: 

1. Create an Enforceable Duty of Care Framework 

Under the current OSA framework, industry codes and standards set out requirements for digital service providers to address and remove specific types of harmful content, while the Basic Online Safety Expectations (BOSE) define the minimum safety standards that online service providers are expected to meet, including measures to proactively safeguard against abusive conduct and harmful content. However, the BOSE are voluntary, and the eSafety Commissioner is not empowered to enforce these expectations.

To ensure a systemic and preventative focus that addresses the whole range of online risks, we recommend that the OSA be underpinned by an enforceable duty of care framework, placing responsibility on digital service providers to prevent harm arising from the use of their platforms or services, rather than focusing solely on harmful content. Such a duty of care aligns with the responsibility of businesses to prevent, mitigate and remedy adverse human rights impacts with which they may be involved, as set out in the UN Guiding Principles on Business and Human Rights, and with Australia’s obligations under the UN Convention on the Rights of the Child to give primary consideration to the best interests of the child in all actions concerning children.

The duty of care framework should set a general and overarching expectation that service providers develop safe platforms, outline specific requirements to prevent the distribution of child sexual abuse material, and require active measures to amend high-risk features and service functions. The duty of care and online safety obligations should apply equally to all digital services, regardless of the reach of the service or its apparent level of risk.

Specific obligations under a duty of care could include: 

  1. Requiring service providers to actively prevent illegal content from entering their platforms,
  2. Conducting child safety risk assessments for all services,
  3. Taking reasonable steps to mitigate identified risks and reporting on the measures taken,
  4. Continually monitoring the effectiveness of mitigation measures, and
  5. Establishing a publicly available register of risks and risk profiles.

2. Broaden the Basic Online Safety Expectations 

A strengthened OSA should include enforceable Basic Online Safety Expectations (BOSE) that are broadened to include device manufacturers and companies producing operating systems, requiring them to undertake safety by design when developing new products and services. As social media services and other electronic services are accessed by devices and run on operating systems, embedding safety by design further up the tech stack would help prevent online harms irrespective of the platform or app in use.

Expanding the BOSE to include equipment providers and relevant operating systems (e.g. iOS and Android) would allow online safety measures to scale and keep pace with a changing online environment in which new apps and platforms launch daily. The leading device manufacturers and operating system providers (Apple, Google, and Microsoft) have no shortage of resources to build and deploy child sexual abuse material (CSAM) prevention technology across their products and services, making them incompatible with CSAM.

AI and machine learning technology already exists to detect and block CSAM on-device, including in video livestreams, without compromising user privacy. Technologies already on the market, including SafetoNet’s HarmBlock3, Cyacomb Safety4, and DragonflAI5, operate with near-perfect levels of CSAM identification.

Tech companies have already acknowledged that they scan content uploaded through their platforms or services on-device, even within end-to-end encrypted environments.6

Tech companies, including device manufacturers and operating system developers, should be required to invest in client-side scanning and blocking technology to prevent the production and distribution of CSAM – whether in video livestreams, end-to-end-encrypted environments, or everyday video chat apps where livestreamed CSAM commonly occurs. 

3. Increase Penalties for Non-Compliance 

A robust OSA should hold digital service providers to account for ensuring that their services uphold children’s rights to privacy, safety and dignity, as set out in the UN Convention on the Rights of the Child7, and protect children from experiencing violence and harm online. These recognised rights of children should outweigh the commercial interests of digital service providers, yet current penalties for failing to comply with the OSA may be treated as a cost of doing business where they are less than the cost of implementing meaningful child safety measures, or are simply not paid at all.

Under the current framework, the penalties attached to a breach of obligations under the OSA are significantly out of step with other jurisdictions: non-compliance attracts a maximum of 2,500 penalty units, or $782,500. This belies the seriousness of the online harms caused by non-compliance with safety regulations, especially to the most vulnerable, such as children.

The eSafety Commissioner should have the power to enforce civil penalties of up to 10% of global annual turnover for failure to comply with requirements under the OSA, including duty of care and BOSE requirements, in line with the UK’s Online Safety Act8 and Ireland’s Online Safety and Media Regulation Act9.

4. Improve Reports to Law Enforcement

In conjunction with Australia’s OSA industry codes and standards, which require companies to proactively detect CSAM, the reporting requirements for electronic service providers should be strengthened under the revised OSA to better assist law enforcement in investigating online child sexual abuse. Information from digital service providers can play a crucial role in investigating these crimes, which constitute child sexual abuse offences under federal and state law, and modern slavery offences in NSW.

Digital services and platforms operating in Australia that are not already reporting suspected child sexual exploitation to the US National Center for Missing & Exploited Children (NCMEC) should be required to report suspected child sexual abuse to the appropriate law enforcement agency in Australia, and to preserve data and records related to the report.

The OSA should set out a strict timeframe for making such reports and provide the eSafety Commissioner with the powers to mandate the type of information that service providers must include in reports to law enforcement. The OSA could also establish significant penalties for breach of this obligation to report suspected child sexual abuse.  

We collectively urge you, as Minister for Communications, to commit to legislating stronger requirements for tech companies operating in Australia to prevent the production and distribution of child sexual abuse material, including in livestreamed video.  

Yours sincerely, 
