Community Standards Enforcement Report
Child Endangerment: Nudity and Physical Abuse and Sexual Exploitation
We do not allow content on Facebook and Instagram that endangers children, such as content depicting child nudity or physical abuse, or content that sexually exploits children. When we find this type of violating content, we remove it, regardless of the context or the person's motivation for sharing it. We may also disable the account of the person who shared it, unless the intent does not appear malicious (for example, sharing to raise awareness of child exploitation).
We report apparent child exploitation to the National Center for Missing and Exploited Children (NCMEC), a nonprofit that refers cases to law enforcement globally, in compliance with US law. We choose to remove content depicting non-sexualized child nudity to reduce the potential for abuse of the content by others.
Prevalence
How prevalent were child endangerment violations?
Content Actioned
How much child endangerment content did we take action on?
Proactive Rate
Of the violating content we actioned for child endangerment, how much did we find and action before people reported it?
Child Nudity and Sexual Exploitation
Correcting mistakes
People can appeal our decisions, unless there are extreme safety concerns. We restore content we incorrectly removed or when circumstances change. Restores can happen from appeals or when we identify issues ourselves.
Appealed Content
How much of the content we actioned for child endangerment did people appeal?
Restored Content
How much of the content actioned for child endangerment was later restored?
Child Nudity and Sexual Exploitation