At Thorn, we’re dedicated to building cutting-edge technology to defend children from sexual abuse. Key to this mission is our child sexual abuse material (CSAM) and child sexual exploitation (CSE) detection solution, Safer, which enables tech platforms to find and report CSAM and text-based harms on their platforms. In 2024, more companies than ever deployed Safer on their platforms. This widespread commitment to child safety is vital to building a safer internet and using technology as a force for good.
Safer’s 2024 Impact
Even though Safer’s community of customers spans a wide range of industries, all of them host content uploaded by their users or text inputs in generative engines and messaging features.
Safer empowers their teams to detect, review, and report CSAM and text-based child sexual exploitation at scale. The scope of this detection is essential. It means their content moderators and trust and safety teams can find CSAM amid the millions of content files uploaded and flag potential exploitation amid millions of messages shared. This efficiency saves time and accelerates their efforts. Just as importantly, Safer enables teams to report CSAM or instances of online enticement to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is critical for child victim identification.
Safer’s customers rely on our predictive artificial intelligence and a comprehensive hash database to help them find CSAM and potential exploitation. With their help, we’re making strides toward reducing online sexual harms against children and creating a safer internet.
Total files processed
In 2024, Safer processed 112.3 billion files input by our customers. Today, the Safer community includes more than 60 platforms, with millions of users sharing an incredible amount of content every day. This represents a substantial foundation for the important work of stopping the repeated and viral sharing of CSAM online.
Total potential CSAM files detected
Safer detected just under 2,000,000 images and videos of known CSAM in 2024. This means Safer matched the files’ hashes to verified hash values from trusted sources, identifying them as CSAM. A hash is like a digital fingerprint, and using hashes allows Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
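To make the hash-matching idea concrete, here is a minimal sketch in Python. It computes a cryptographic SHA-256 digest and checks it against a hypothetical VERIFIED_HASHES set standing in for a database of verified hash values; it illustrates the lookup only, not Safer’s implementation (production matching also typically uses perceptual hashes so that re-encoded copies of a known file still match, which a plain cryptographic digest cannot do).

```python
import hashlib

# Hypothetical placeholder for hash values previously verified as CSAM by
# NCMEC or other NGOs; in practice this would come from a vetted hash source.
VERIFIED_HASHES: set[str] = set()

def file_sha256(path: str) -> str:
    """Compute a SHA-256 digest, the file's "digital fingerprint"."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: str) -> bool:
    """Return True if the file's hash matches a previously verified value."""
    return file_sha256(path) in VERIFIED_HASHES
```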
In addition to detecting known CSAM, our predictive AI detected more than 2,200,000 files of potential novel CSAM. Safer’s image and video classifiers use machine learning to predict whether new content is likely to be CSAM and flag it for further review. Identifying and verifying novel CSAM allows it to be added to the hash library, accelerating future detection.
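Continuing the sketch above, a hypothetical two-stage flow shows how a classifier complements hash matching. The classifier_score input, the 0.8 threshold, and the routing labels are illustrative assumptions, not Safer’s actual model or settings.

```python
def moderate_file(path: str, classifier_score: float, review_threshold: float = 0.8) -> str:
    """Hypothetical two-stage flow: hash match first, classifier score second."""
    if file_sha256(path) in VERIFIED_HASHES:
        return "known_match"       # previously verified CSAM: report it
    if classifier_score >= review_threshold:
        return "potential_novel"   # likely new CSAM: route to human review
    return "no_action"

def add_verified_hash(file_hash: str) -> None:
    """Once reviewers verify novel CSAM, adding its hash to the library means
    future uploads of the same file are caught by the fast hash match."""
    VERIFIED_HASHES.add(file_hash)
```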
Altogether, Safer detected more than 4,100,000 files of known or potential CSAM.
Total lines of text processed
Safer launched a text classifier feature in 2024 and processed more than 3,000,000 lines of text in just the first year. This capability provides a whole new dimension of detection, helping platforms identify sextortion and other abuse behaviors occurring via text or messaging features. In all, almost 3,200 lines of potential child exploitation were identified, helping content moderators respond to potentially threatening behavior.
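As a rough illustration of line-level text screening, the sketch below scores each line with a stand-in classifier and surfaces those that exceed a review threshold; score_line and the 0.9 threshold are assumptions for illustration, not Safer’s text classifier.

```python
from typing import Callable, Iterable

def screen_messages(lines: Iterable[str],
                    score_line: Callable[[str], float],
                    threshold: float = 0.9) -> list[str]:
    """Return lines whose score suggests potential exploitation,
    so moderators can prioritize them for review."""
    return [line for line in lines if score_line(line) >= threshold]
```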
Safer’s all-time impact
Last year was a watershed moment for Safer, with the community nearly doubling the all-time total of files processed. Since 2019, Safer has processed 228.8 billion files and 3 million lines of text, resulting in the detection of almost 6.5 million potential CSAM files and nearly 3,200 instances of potential child exploitation. Every file processed, and every potential match made, helps create a safer internet for children and content platform users.
Build a Safer internet
Curbing platform misuse and addressing online sexual harms against children requires an “all-hands” approach. Too many platforms still suffer from siloed teams, inconsistent practices, and policy gaps that jeopardize effective content moderation. Thorn is here to change that, and Safer is the answer.