What is CSAM? Key Terminology in the Fight Against Child Exploitation

Posted by Gina Cristiano on March 4, 2020

Child pornography is more properly identified as Child Sexual Abuse Material (CSAM). It is appalling that a definition for this kind of material is needed at all. U.S. federal law defines child pornography as any visual depiction of sexually explicit conduct involving a minor, meaning any person under 18 years of age.

However, the phrase “child pornography” is too sterile and generic to convey the horror of what is being created. That is why many advocates, including the National Center for Missing and Exploited Children (NCMEC), consider the phrase outdated.

NCMEC refers to this material as Child Sexual Abuse Material (CSAM) in order to “most accurately reflect what is depicted: the sexual abuse and exploitation of children.”

As a result, many organizations and advocates now use the term CSAM rather than “child pornography” because it explicitly ties the material to the source of the problem: the abuse perpetrated to create it. Furthermore, children are re-victimized every time a file is shared, sustaining the abuse in a continuous loop.

According to the Canadian Centre for Child Protection, 67% of child sexual abuse material survivors say the distribution of their images impacts them differently than the hands-on abuse. The reason is tragic: distribution continues in perpetuity, and because the images are constantly re-shared, they become effectively permanent.

The fight against child sexual abuse material and its propagation is far from over. NCMEC reviews over 25 million images annually, and the U.S. remains one of the largest producers of these images and videos, although this is a global issue.

In 2002, NCMEC started collecting child abuse images as part of the Child Victim Identification Program, which aims to identify and rescue victims of child sexual abuse. ADF works closely with Rich Brown of Project VIC International, supports the Project VIC database standard as well as the UK's Child Abuse Image Database (CAID), and creates software to help ICAC Task Force investigators identify victims and stop the cycle of abuse.
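Database standards such as Project VIC and CAID distribute sets of hashes of previously identified material, which forensic tools compare against the hashes of files found on seized devices. The following is a minimal Python sketch of that general hash-matching idea only; it is not ADF's implementation, and the hash-list file name and evidence path are hypothetical placeholders.

```python
import hashlib
from pathlib import Path


def load_known_hashes(hash_list_path: str) -> set[str]:
    """Load a plain-text list of hex digests (one per line) into a set."""
    with open(hash_list_path, "r", encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}


def sha1_of_file(path: Path) -> str:
    """Compute the SHA-1 digest of a file, reading it in chunks."""
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_for_known_files(root: str, known_hashes: set[str]) -> list[Path]:
    """Walk a directory tree and return files whose hashes appear in the known set."""
    matches = []
    for path in Path(root).rglob("*"):
        if path.is_file() and sha1_of_file(path) in known_hashes:
            matches.append(path)
    return matches


if __name__ == "__main__":
    # "known_hashes.txt" and "/evidence/mounted_image" are placeholder names
    # for a hash list and a mounted evidence image, respectively.
    known = load_known_hashes("known_hashes.txt")
    for hit in scan_for_known_files("/evidence/mounted_image", known):
        print(f"MATCH: {hit}")
```

In practice, production tools rely on curated hash databases and perceptual hashing in addition to simple cryptographic digests, so matches can be found even when a file has been re-encoded or resized.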

ADF Solutions software speeds CSAM investigations, from on-scene triage to follow-up work in the forensic lab. ADF software allows automatic and manual tagging for quicker results, and its reporting lets investigators share findings whether or not the recipients have ADF software.

Our reach is global, and in the U.S. we are proud to provide digital forensic image recognition and classification software that makes it easier for ICAC task forces to investigate images and videos of child sexual exploitation.

Watch: Investigating Child Exploitation Cases

Topics: Law Enforcement, Digital Forensics, CAID, Crimes Against Children, ICAC, ICAC Task Force, Child Exploitation, Digital Evidence, CSAM
