Child safety experts claim Apple is not doing enough to protect its most vulnerable users, alleging that the company underreports the volume of child sexual abuse material (CSAM) traded and stored on its services, including iCloud, iMessage, and FaceTime.
The National Society for the Prevention of Cruelty to Children (NSPCC), a child protection charity based in the United Kingdom, says data obtained through freedom of information requests implicates Apple in hundreds more CSAM incidents in England and Wales than the company officially reported worldwide in a year.
According to the NSPCC, “Apple was implicated in 337 recorded offences of child abuse images between April 2022 and March 2023 in England and Wales.” Yet Apple reported only 267 cases of CSAM to the National Center for Missing & Exploited Children (NCMEC) across all of its platforms worldwide in 2023.
The NCMEC’s annual report shows a sharp drop-off relative to other tech giants: Google and Meta reported more than 1.47 million and 30.6 million cases, respectively, last year. Discord (339,412), Pinterest (52,356), and 4chan (1,656) also reported more suspected CSAM instances than Apple in 2023.
For reference, tech companies based in the United States are required to report any potential CSAM found on their platforms to NCMEC, which then passes those cases to the relevant law enforcement agencies around the world.
Apple services including iMessage, FaceTime, and iCloud all use end-to-end encryption, meaning only the sender and recipient of a message can view its contents. But that does not fully explain why Apple is such an outlier: as the NSPCC notes, WhatsApp also employs end-to-end encryption, yet it reported around 1.4 million suspected CSAM incidents to NCMEC in 2023.
Richard Collard, head of child safety online strategy for the NSPCC, said there is “a concerning discrepancy between the almost negligible number of global reports of abuse content they make to authorities and the number of UK child abuse image crimes taking place on Apple’s services”.
“Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the rollout of the Online Safety Act in the UK,” he said.
The accusation follows years of debate over Apple’s plans to scan its platforms for child sexual abuse material. In August 2021, Apple announced a child safety toolkit that would scan iOS devices for images of child abuse, but it paused the rollout just one month later after digital rights groups raised concerns that such monitoring capabilities could compromise the privacy and security of iCloud users worldwide. Apple said in 2022 that it was shelving the project.
Last autumn, Apple reportedly shifted its focus from the scanning capability to a suite of on-device features designed to connect users directly with local resources and law enforcement.
If you or anyone you know has been affected by the issues raised in this article, please report those responsible to the police on 101 (999 in an emergency) or visit their online resources for further details of the options for reporting a crime. You can also make a report to Crimestoppers should you wish to remain completely anonymous. Help is available on our support links page.

