
An op-ed by David Braga, IJM Australia’s CEO
A basic expectation of our society is that we take reasonable steps to prevent harm to others.
We put seatbelts on ourselves and our loved ones when driving. And we adhere to 40km/h speed limits in school zones to keep children safe.
We also expect this of the companies and services in our communities.
We expect banks to detect and investigate scams on our accounts. And we expect toy manufacturers to alert us to potential choking hazards in their products for children.
Why is it, then, that we haven’t applied that same reasonable expectation to the technology platforms we use every day – asking them to take responsibility for preventing harm?
Online safety received an important spotlight in Australia in 2024. Online bullying, scamming and sextortion have become part of our daily vernacular as we collectively consider how we can better – and quite reasonably – put in place measures that make the online world as safe as we expect the physical world to be.
As we head into a federal election, the Government has announced a digital duty of care as an election commitment in response to the Online Safety Act review.
This Safer Internet Day, International Justice Mission (IJM) Australia is calling on the federal Opposition and Crossbench to commit to introducing a digital duty of care to protect children from online sexual exploitation and abuse on Australian screens.
The digital duty of care justifiably asks companies that profit from our online activities to play a role in keeping those platforms reasonably safe and to prevent harm.
It asks those companies to consider the risks their platforms and technologies might present, including to children exposed to their products, and to take reasonable steps to prevent the misuse of their platforms.
IJM is a global anti-slavery organisation that protects people in poverty from violence. One of the most prolific forms of violence against the poor today is the online sexual exploitation of children, which takes the form of child sexual abuse livestreamed to paying offenders over the same online platforms that you and I use every day.
Australia has a moral obligation to address this crime because we are consistently ranked as a high consumer of child sexual abuse material, and, according to the Philippine Government, the highest per-capita consumer of such material from the Philippines.[1]
Shockingly, a recent study showed that 1 in 55 Australian men have admitted to engaging in sexually explicit webcamming with a child, and about the same number have paid for online sexual interactions, images or videos involving a child.[2]
A robust digital duty of care would require online service providers to consistently identify foreseeable harms, including the risk that their services could be used by Australians to livestream child sexual abuse, and then take reasonable steps to prevent that harm.
This is not a requirement that is beyond the capabilities of the world’s biggest technology innovators. In fact, AI technology is already on the market that detects and blocks child sexual abuse material in real time, including in livestreamed video.
A digital duty of care is already in effect in both the UK and the EU. Australia has the opportunity to learn from their experiences to make sure that our digital duty of care is world class and works to protect children from livestreamed sexual abuse.
A digital duty of care simply places the same onus on technology service providers for harm prevention that we ask of each other every day. It’s a logical step that will protect us all, and we don’t think that is too much to ask.
[1] Online Sexual Abuse and Exploitation in the Philippines
[2] Identifying and understanding child sexual offending behaviours and attitudes among Australian men