Apple ‘apparently underreports’ child sexual abuse, watchdogs say

After years of controversy over plans to scan iCloud for more child sexual abuse material (CSAM), Apple abandoned those plans last year. Child safety experts have now accused the tech giant of not only failing to flag CSAM shared and stored across its services — including iCloud, iMessage and FaceTime — but also allegedly failing to report all CSAM that was flagged.

The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) shared UK police data with The Guardian showing that Apple “significantly underestimates how often” CSAM is found globally on its services.

According to the NSPCC, police investigated more CSAM cases in the UK alone in 2023 than Apple reported globally for the entire year. Between April 2022 and March 2023 in England and Wales, the NSPCC found that Apple was implicated in 337 recorded offenses involving child abuse images. But in 2023, Apple reported only 267 instances of CSAM to the National Center for Missing and Exploited Children (NCMEC), a figure that is supposed to represent all CSAM on its platforms worldwide, The Guardian reported.

Major U.S. tech companies are required to report CSAM to NCMEC when it is discovered, but while Apple reports a few hundred CSAM cases a year, big tech peers like Meta and Google report millions, NCMEC's report shows. Experts told The Guardian that concerns persist that Apple "apparently" fails to account for all the CSAM on its platforms.

Richard Collard, the NSPCC’s head of children’s online safety policy, told The Guardian that he believed Apple’s child safety efforts needed major improvements.

“There is a worrying disparity between the number of UK child abuse image crimes committed on Apple services and the almost negligible number of global abuse content reports they make to authorities,” Collard told The Guardian. “Apple is clearly lagging behind many of its peers in tackling child sexual abuse at a time when all tech firms need to invest in safety and prepare for the introduction of the UK Online Safety Act.”

Outside the UK, other child safety experts share Collard’s concerns. Sarah Gardner, CEO of a Los Angeles-based child advocacy organization called the Heat Initiative, told The Guardian that she considers Apple’s platforms a “black hole” that hides CSAM. And she expects that Apple’s efforts to introduce AI into its platforms will exacerbate the problem, potentially facilitating the spread of AI-generated CSAM in an environment where sexual predators might expect less enforcement.

“Apple doesn’t detect CSAM at all in the majority of its environments at scale,” Gardner told The Guardian.

Gardner agreed with Collard that Apple is "clearly not accounting" for CSAM and "hasn't invested in trust and safety teams to be able to deal with this" even as it rushes to bring sophisticated AI features to its platforms. Last month, Apple announced that it would integrate ChatGPT into Siri, iOS, and macOS, perhaps setting expectations for increasingly sophisticated generative AI features in future Apple products.

“The company is moving forward into territory that we know can be incredibly harmful and dangerous to children without trying to address it,” Gardner told The Guardian.

Apple has so far not commented on the NSPCC report. Last September, Apple responded to the Heat Initiative's requests for more CSAM detection, saying that rather than scanning for illegal content, its priority is connecting vulnerable or affected users directly with local resources and law enforcement that can help them in their communities.
