AI offers endless possibilities for working more efficiently and uncovering new ideas. It has also made it much easier to create child sexual abuse material (CSAM).
In a disturbing new campaign titled #TakeItDown, the nonprofit ChildFund International illustrates just how easy it is for predators to “hide in plain sight,” enabled by the internet and by tech companies’ limited obligations to report and remove CSAM.
As reported by the Washington Post in June, AI-generated CSAM has increased month-over-month since AI tools became widely available to the public in fall 2022. And while every U.S. state’s attorney general has signed a petition urging Congress to further study the problem and create safeguards against the proliferation of this content, few barriers currently prevent it from being generated or distributed.
A video spot, created in partnership with social impact agency WRTHY, features a predator whose face, unassuming when he speaks to his kids or colleagues, quickly transforms into that of a pale monster when he sits down at his computer.
“If you ask me, there’s never been a better time to be a monster,” he says, after describing how technology has given predators greater ability to review, rate and recommend CSAM.
The video is part of a multimedia campaign that also includes a widget allowing viewers to add their voices to demands for action from policymakers, as well as a mini-documentary. The short film features Sonya Ryan, whose daughter was murdered by an online predator, and Jim Cole, a retired Homeland Security Investigations special agent familiar with the technology tools that exist but go unused.
“Instead of being a place for learning, playing and connecting with friends and family, the internet has become a place rife with ways to exploit and abuse children,” said Erin Kennedy, ChildFund International’s vice president of external affairs and partnerships, in a press release.
Tech companies are not legally required to search for CSAM shared or stored on their platforms. While they must report it once they become aware of its existence, they are rarely penalized for failing to remove it quickly.
The National Center for Missing and Exploited Children noted in a report earlier this year that it receives around 80,000 reports to its CyberTipline each day, with the majority of reported CSAM residing on the open web (as opposed to the dark web, which is much harder to access).
This campaign puts the onus on tech companies, whose platforms serve as hosts for this media. “We want technology companies to recognize their responsibility,” noted Kennedy. “Profit should not come before the protection and well-being of children.”