Tech Giants Under Fire for Failing to Curb Child Sexual Abuse Online, Says Australian eSafety Report
Canberra, August 6 (TheTrendingPeople.com): A new report released by Australia's eSafety Commissioner has sharply criticized major global technology companies for failing to take adequate steps to combat child sexual exploitation and abuse (CSEA) on their platforms. The findings highlight “significant gaps” in both preventive mechanisms and transparency practices, casting doubt on the commitment of some of the world’s best-funded firms to online child safety.
The report, released Wednesday, follows the issuance of transparency notices in July 2024 to eight major platforms: Apple, Google, Meta, Microsoft, Discord, WhatsApp, Snap, and Skype. The notices require each company to submit detailed reports every six months for two years, outlining their efforts to detect, block, and respond to CSEA content.
"Turning a Blind Eye": Commissioner’s Strong Words for Tech Platforms
In a powerful statement accompanying the report, eSafety Commissioner Julie Inman Grant condemned the lack of progress made by these companies, despite multiple earlier warnings.
"It shows that when left to their own devices, these companies aren't prioritising the protection of children and are seemingly turning a blind eye to crimes occurring on their services," Inman Grant said.
"No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services."
The 2025 report builds on similar findings from previous investigations conducted in 2022 and 2023, which also flagged insufficient safety practices across several platforms.
Major Gaps in Detection and Transparency
Among the most alarming findings in the latest report:
- None of the eight companies use tools to detect CSEA livestreaming across all of their services.
- Apple, Google, Microsoft, and Discord are not consistently using hash matching to identify known child abuse content throughout their platforms.
- Apple, Google, and WhatsApp do not currently block links to known CSEA material on any part of their services.
- Apple and Google’s YouTube were criticized for failing to track and disclose the number of user reports related to child abuse content and for not providing data on response times.
The findings suggest that companies with massive user bases and advanced technological resources are not deploying existing safety tools comprehensively, thus allowing abusive content to circulate undetected or unaddressed.
What Is Hash Matching and Why It Matters
Hash matching is a well-established technology used to detect known harmful material. It involves converting images or videos into unique digital “hashes” and comparing them to a database of identified abuse content. If platforms are not using hash matching across all services, they risk failing to flag content that has previously been reported and verified as exploitative.
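To make the mechanism concrete, here is a minimal sketch in Python. It checks uploads against a hypothetical in-memory blocklist using exact SHA-256 hashes; deployed systems typically use perceptual hashing (Microsoft's PhotoDNA, for example) so that resized or re-encoded copies still match, but the outline of the check is the same.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of previously verified abuse material.
# In reality this comes from vetted bodies (e.g. NCMEC), not from code.
# Here we seed it with a stand-in file so the example is runnable.
known_bad_file = b"stand-in for a previously verified file"
KNOWN_ABUSE_HASHES = {file_hash(known_bad_file)}

def is_known_abuse_material(data: bytes) -> bool:
    """Flag an upload whose hash matches a verified database entry."""
    return file_hash(data) in KNOWN_ABUSE_HASHES

# At upload time, a platform would run every image or video through
# this check before the content becomes visible or shareable.
print(is_known_abuse_material(known_bad_file))          # True: block it
print(is_known_abuse_material(b"an unrelated upload"))  # False: allow
```

An exact cryptographic hash breaks if even one pixel changes, which is why production systems favour perceptual hashes that tolerate minor edits.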
Similarly, URL blocking prevents users from sharing or accessing web addresses that host abusive material. The report found that some platforms completely lack this basic mechanism, leaving open pathways for harmful material to proliferate.
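URL blocking follows the same compare-against-a-list pattern. The sketch below assumes a hypothetical blocklist and normalises each address before checking it, since trivial variations in scheme, letter case, or trailing slashes should not defeat the filter. Real deployments typically match against URL lists supplied by organisations such as the Internet Watch Foundation.

```python
from urllib.parse import urlparse

# Hypothetical blocklist of (host, path) pairs for known CSEA URLs.
BLOCKED_URLS = {
    ("example-bad-host.test", "/abuse-page"),
}

def normalise(url: str) -> tuple[str, str]:
    """Reduce a URL to a lowercase (host, path) pair so that scheme
    changes or trailing slashes cannot bypass the check."""
    parsed = urlparse(url.strip())
    host = parsed.netloc.lower()
    path = parsed.path.rstrip("/") or "/"
    return (host, path)

def is_blocked(url: str) -> bool:
    """True if the URL matches a known CSEA blocklist entry."""
    return normalise(url) in BLOCKED_URLS

# A platform would run shared links through this check and refuse
# to deliver (or would neutralise) any that match.
print(is_blocked("HTTPS://Example-Bad-Host.test/abuse-page/"))  # True
```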
Some Progress, But Far from Enough
While the report strongly critiques most platforms, it does acknowledge incremental progress in certain areas:
- Snap and Discord have implemented language analysis tools to detect signs of grooming, a tactic often used by abusers to lure children into exploitative situations (a simplified sketch of this approach appears after this list).
- Some companies reported plans to expand artificial intelligence (AI) tools to detect previously unknown material.
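As a toy illustration of the language-analysis idea, the sketch below flags messages containing patterns that child-safety researchers associate with grooming, such as requests for secrecy or attempts to move a conversation off-platform. This is a deliberately naive keyword approach with invented patterns; the tools Snap and Discord describe would rely on trained classifiers rather than fixed lists.

```python
import re

# Invented, illustrative patterns only; real grooming detectors are
# trained models, not keyword lists.
GROOMING_PATTERNS = [
    re.compile(r"\bdon'?t tell (your|ur) (parents|mom|dad)\b", re.I),
    re.compile(r"\bour (little )?secret\b", re.I),
    re.compile(r"\b(text|message|add) me on \w+\b", re.I),  # off-platform move
]

def grooming_risk_score(message: str) -> int:
    """Count how many risk patterns a message matches."""
    return sum(bool(p.search(message)) for p in GROOMING_PATTERNS)

msg = "This is our little secret, don't tell your parents."
if grooming_risk_score(msg) >= 2:
    print("High-risk message: surface to trust and safety reviewers")
```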
Still, the Commissioner emphasized that these steps are insufficient given the scale and urgency of the problem.
“These are some of the most well-resourced companies in the world, and yet minimal progress has been made. Children are still at risk,” said Inman Grant.
Calls for Greater Accountability and Regulation
Australia has emerged as a global leader in tech regulation, particularly regarding online safety for children. The eSafety Commissioner’s office, first established under the Enhancing Online Safety Act 2015 and now operating under the Online Safety Act 2021, has the power to compel companies to disclose how they manage harmful content and can take enforcement action when they fail to comply.
Under current rules, companies that do not respond adequately to the transparency notices may face legal and financial penalties. The findings from this report are expected to influence upcoming legislative reforms, including stricter obligations for digital platforms operating in Australia.
Inman Grant has also urged international cooperation, stressing that online abuse is a borderless issue requiring a unified global response.
Tech Companies Yet to Respond Publicly
As of the time of publication, none of the eight companies named in the report have issued official responses. However, some industry insiders argue that encryption technologies and privacy concerns complicate detection efforts, especially on end-to-end encrypted platforms like WhatsApp and Apple’s iMessage.
The eSafety Office has stated it is open to working collaboratively with companies to balance privacy and safety, but insists that child protection must come first.
Final Thoughts from TheTrendingPeople.com
The revelations in this report expose a troubling truth: despite years of warnings, some of the world’s largest tech companies are still falling short in safeguarding children from the darkest corners of the internet. While innovations like AI and real-time monitoring tools exist, their partial or inconsistent implementation speaks volumes.
The Australian eSafety Commissioner’s initiative is a wake-up call not just for companies operating in the country, but for the global tech industry at large. As online platforms continue to expand their reach, the responsibility to ensure safety—especially for vulnerable users—must not be optional.
At TheTrendingPeople.com, we believe that meaningful change can only occur when accountability is enforced. Australia is leading by example, and the rest of the world should take note. The safety of children online cannot remain a secondary priority—it must become a foundational principle of digital life.