
Our Kids Deserve to be Safe

By Grace Wong, Chief Advocacy Officer

Last week, a group of parents held up portraits of their deceased children at a US Senate hearing attended by the world’s biggest and richest tech companies. I watched as images of dozens of children who had been exploited, abused or harmed online were displayed in the hearing room to remind tech company CEOs why these children took their lives and who should be held responsible.

When the internet first launched in 1983, we could not have imagined what it would one day make possible: children made unsafe in their own homes, by people nearby and far across the world.

I’ve worked in child protection and seen firsthand in the Philippines the lengths people, mainly men, will go to in order to seek out livestreamed child sexual abuse for as little as $22 per ‘show’. Children left unsafe in the US, Australia and the Philippines are an indictment on us all, because children abused and exploited anywhere is a wrong that must be righted.

On Safer Internet Day, International Justice Mission (IJM) is shining a light on Australia’s part in the online sexual exploitation of children, and what we must do to prevent it. An IJM study in 2023 found that half a million children in the Philippines, or 1 in 100, were trafficked to produce new child sexual abuse material in 2022 alone*. To bring matters home, media reported in January this year on the arrests of two New South Wales men who are alleged perpetrators of online sexual exploitation of children in the Philippines**.

Australia’s passing of the Online Safety Act 2021 and creation of an eSafety Commissioner put us light years ahead of our US counterparts, who are struggling to stem the explosion of child sexual abuse and exploitation material created on tech platforms.

The Federal Government recently commenced a review of Australia’s Basic Online Safety Expectations, proposing that tech companies be required to consider the best interests of the child when developing new products and services. As part of the review, the Government should require tech companies to prevent online sexual exploitation of children through their platforms by building safety features into the design of any new products and services.

A regulation that allows tech companies to design their own safety features in response to risks does not put their intellectual property in jeopardy. Instead, it allows each company to use its own resources to make its products safe from the very beginning.

AI products already on the market prove that it is possible for tech companies to detect child sexual abuse material with near-perfect accuracy. Existing prevention technology such as SafeToNet’s SafeToWatch can be deployed on either the service itself or the device to prevent the creation, production, uploading or streaming of child sexual abuse material.

Last week, media and Swifties around the world were in uproar over pornographic AI images of Taylor Swift. Don’t children across the world deserve the same level of outrage and support from us, if not more?


* IJM Scale of Harm, 2023

** AFP media release, 12/01/2024 and AFP media release, 13/01/2024
