
"More than 70 million warning messages have been sent to people attempting to access child sexual abuse material (CSAM) online over the past two years, the Lucy Faithfull Foundation says. The messages are sent as part of Project Intercept, a partnership between the child protection charity and technology firms including Google, TikTok and Meta. Rather than simply blocking content, the messages highlight the illegality of viewing CSAM and direct users to support services aimed at changing behaviour."
"The foundation said nearly 700,000 people went on to access its Stop It Now resources, which offer confidential advice and self-help tools - a figure some experts say is disappointingly low. "Given that 70 million warning messages have been sent, the fact that only 700,000 people click through to get support seems low. This is disappointing, given that the scale of the problem of child sexual abuse imagery online is growing fast," said Professor Sonia Livingstone."
""On the other hand, since four in five of those people who seek support do engage with the resources provided, that suggests the system is working for those who are really motivated to get help." Lucy Faithfull Foundation Project Intercept is active in 131 countries and operates across a range of online spaces. These include end-to-end encrypted services - where only the sender and recipient can view what's sent - and AI chatbot platforms."
"The foundation did not specify how many individual users were responsible for the searches. But it said engagement with the support material had been high, with an average of 28,000 users a month redirected in 2024 and 2025. More than four in five continued to interact with the content, although the organisation did not publish data on longer-term behaviour change."
Over two years, more than 70 million warning messages were sent to people attempting to access child sexual abuse material online. The messages are part of Project Intercept, a partnership between the Lucy Faithfull Foundation and technology firms including Google, TikTok, and Meta. Instead of only blocking content, the warnings explain that viewing CSAM is illegal and direct users to support services intended to change behaviour. Nearly 700,000 people accessed Stop It Now resources offering confidential advice and self-help tools. The programme operates in 131 countries across multiple online spaces, including end-to-end encrypted services and AI chatbot platforms. Engagement was reported as high, with about 28,000 users per month redirected in 2024 and 2025, and more than four in five continuing to interact with the content, though longer-term behaviour change data was not published.
#child-sexual-abuse-material #online-safety #behavior-change-interventions #technology-partnerships #digital-prevention
Read at www.bbc.com