Transparency Into Meta’s Reports to the National Center for Missing and Exploited Children


SEP 6, 2023

As part of our ongoing work to provide young people with safe, positive online experiences, we’re providing more transparency into our efforts to find and report child exploitation to the National Center for Missing and Exploited Children (NCMEC).

We don’t allow content or behavior that violates our policies against child sexual exploitation, and we take steps to remove this content, report it to NCMEC, and work with law enforcement where appropriate.

Electronic Service Providers (ESPs) are legally obligated to report to NCMEC’s CyberTipline any apparent violations of laws related to child sexual abuse material (CSAM) that they become aware of. In addition to reporting content we become aware of, we’ve developed sophisticated technology to proactively seek out this content; as a result, we find and report more CSAM to NCMEC than any other service today. We make this technology available to the industry to help protect children from exploitation across the internet.

While NCMEC already publishes the total number of CyberTips it receives from ESPs each year, we will begin publishing additional data showing the types of reports we make to NCMEC. For example, we will start to provide insight into reports made to NCMEC that may involve inappropriate interactions with children.

In Q2 2023, we reported the following CyberTips to NCMEC from Facebook and Instagram:

  • Facebook and Instagram sent over 3.7 million CyberTip reports to NCMEC for child sexual exploitation.

  • Of those reports, 48,000 involved inappropriate interactions with children. CyberTips relating to inappropriate interactions with children may include an adult soliciting CSAM directly from a minor or attempting to meet and cause harm to a child in person. These CyberTips also include cases where a child is in apparent imminent danger.

  • 3.6 million related to shared or re-shared photos and videos containing CSAM.

We’ve developed more than 30 tools to support safe, positive online experiences for teens and their families. For example, we automatically set teens’ accounts to private when they join Instagram, we restrict people over 19 years old from sending private messages to teens who don’t follow them, we use age verification technology to help give teens age-appropriate experiences, and we offer parental supervision tools that let parents see who their teen reports or blocks.

We’ve also developed sophisticated technology designed to prevent unwanted contact between teens and adults they haven’t chosen to follow. Our technology identifies accounts demonstrating potentially suspicious behavior, such as accounts belonging to adults that may have recently been blocked or reported by a teen, and prevents those accounts from finding, following or interacting with teens.

For example:

  • We don’t show teen accounts to these adults in Instagram Explore, Reels or ‘Accounts Suggested For You’, or in ‘People You May Know’ recommendations on Facebook, and we don’t show accounts belonging to these adults to teens in those places.

  • We don’t recommend teen accounts to these adults in Instagram Search, nor do we recommend these adults to teens.

  • If these adults land on a teen’s account through other means, they are not able to follow or message the account, and teens are not able to follow accounts belonging to these adults.

We’ll continue collaborating with organizations like NCMEC and child safety experts to protect teens from unwanted contact with adults, while working to prevent the spread of CSAM online.