Child Sexual Exploitation, Abuse, and Nudity

Policy details

Policy Rationale
We do not allow content or activity that sexually exploits or endangers children. When we become aware of apparent child exploitation, we report it to the National Center for Missing and Exploited Children (NCMEC), in compliance with applicable law. We know that sometimes people share nude images of their own children with good intentions; however, we generally remove these images because of the potential for abuse by others and to help avoid the possibility of other people reusing or misappropriating the images.
We also work with external experts, including the Meta Safety Advisory Board, to discuss and improve our policies and enforcement around online safety issues, especially with regard to children. Learn more about the technology we’re using to fight against child exploitation.
Do not post:
Child sexual exploitation
Content, activity, or interactions that threaten, depict, praise, support, provide instructions for, make statements of intent, admit participation in, or share links to the sexual exploitation of children (including real minors, toddlers, or babies, or non-real depictions with a human likeness, such as in art, AI-generated content, fictional characters, dolls, etc.). This includes but is not limited to:
  • Sexual intercourse
    • Explicit sexual intercourse or oral sex, defined as mouth or genitals entering or in contact with another person's genitals or anus, when at least one person's genitals or anus is visible.
    • Implied sexual intercourse or oral sex, including when contact is imminent or not directly visible.
    • Stimulation of genitals or anus, including when activity is imminent or not directly visible.
    • Any of the above involving an animal.
  • Children with sexual elements, including but not limited to:
    • Restraints
    • Signs of arousal
    • Focus on genitals or anus
    • Presence of aroused adult
    • Presence of sex toys or use of any object for sexual stimulation, gratification, or sexual abuse
    • Sexualized costume
    • Stripping
    • Staged environment (for example, on a bed) or professionally shot (quality/focus/angles)
    • Open-mouth kissing
    • Stimulation of human nipples or squeezing of female breast (EXCEPT in the context of breastfeeding)
    • Presence of by-products of sexual activity
  • Content involving children in a sexual fetish context
  • Content that supports, promotes, advocates or encourages participation in pedophilia unless it is discussed neutrally in a health context
  • Content that identifies or mocks alleged victims of child sexual exploitation by name or image
Solicitation
Content that solicits sexual content or activity depicting or involving children, defined as:
  • Child Sexual Abuse Material (CSAM)
  • Nude imagery of real or non-real children
  • Sexualized imagery of real or non-real children
Inappropriate interactions with children
Content that constitutes or facilitates inappropriate interactions with children, such as:
  • Soliciting, arranging or planning sexual encounters with children
  • Enticing children to engage in sexual activity through sexualized conversations or offering, displaying, obtaining or requesting sexual material to or from children, through purposeful exposure or in private messages
  • Engaging in implicitly sexual conversations in private messages with children
  • Obtaining or requesting sexual material from children in private messages
Exploitative intimate imagery and sextortion
Content that attempts to exploit real children by:
  • Coercing money, favors or intimate imagery with threats to expose real or non-real intimate imagery or information
  • Sharing, threatening, or stating an intent to share private sexual conversations or real or non-real intimate imagery
Sexualization of children
  • Content (including photos, videos, real-world art, digital content, and verbal depictions) that sexualizes real or non-real children
  • Groups, Pages, and profiles dedicated to sexualizing real or non-real children
Child nudity
Content that depicts real or non-real child nudity where nudity is defined as:
  • Close-ups of real or non-real children’s genitalia
  • Real or non-real nude toddlers, showing:
    • Visible genitalia, even when covered or obscured by transparent clothing
    • Visible anus and/or fully nude close-up of buttocks
  • Real or non-real nude minors, showing:
    • Visible genitalia (including genitalia obscured only by pubic hair or transparent clothing)
    • Visible anus and/or fully nude close-up of buttocks
    • Uncovered female nipples
    • No clothes from neck to knee, even if no genitalia or female nipples are showing
  • Unless the non-real imagery is for health purposes or is a non-sexual depiction of child nudity in real-world art
Non-sexual child abuse
  • Videos or photos that depict real or non-real non-sexual child abuse regardless of sharing intent, unless the imagery is from real-world art, cartoons, movies or video games
  • Content that praises, supports, promotes, advocates for, provides instructions for or encourages participation in non-sexual child abuse
In addition to removing accounts that violate our Child Sexual Exploitation, Abuse and Nudity (CSEAN) policies, our reviewers and automated systems consider a broad spectrum of signals to help prevent potentially unwanted or unsafe interactions.
  • We may disable accounts or restrict access to products and features (e.g. the ability to follow certain accounts) for adults based on their interactions with other accounts, searches for or interactions with violating content (e.g. liking or saving), or membership in communities (e.g. Groups) we have removed for violating our policies.
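To make the role of such signals concrete, below is a purely hypothetical sketch (in Python) of how account-level signals might be combined into a coarse enforcement decision. Every signal name, weight, and threshold here is invented for illustration; this does not describe Meta's actual systems.

# Hypothetical sketch only: signal names, weights, and thresholds are
# invented for illustration and do not describe Meta's actual systems.
from dataclasses import dataclass

@dataclass
class AccountSignals:
    flagged_interactions: int        # interactions with accounts removed under this policy
    violating_content_actions: int   # e.g. liking or saving violating content
    removed_group_memberships: int   # memberships in Groups removed for violations

def enforcement_action(s: AccountSignals) -> str:
    """Combine account-level signals into a coarse enforcement decision."""
    score = (s.flagged_interactions
             + 2 * s.violating_content_actions
             + 2 * s.removed_group_memberships)
    if score >= 6:
        return "disable_account"
    if score >= 3:
        return "restrict_features"  # e.g. limit the ability to follow certain accounts
    return "no_action"

# Example: one flagged interaction, one violating-content action, one removed Group
print(enforcement_action(AccountSignals(1, 1, 1)))  # -> restrict_features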
For the following content, we include a warning screen so that people are aware the content may be disturbing and limit the ability to view the content to adults ages eighteen and older:
  • Videos or photos that depict police officers or military personnel committing non-sexual child abuse
  • Videos or photos of non-sexual child abuse, when law enforcement, child protection agencies, or trusted safety partners request that we leave the content on the platform for the express purpose of bringing a child back to safety
For the following content, we include a sensitivity screen so that people are aware the content may be upsetting to some:
  • Videos or photos of violent immersion of a child in water in the context of religious rituals
For the following Community Standards, we require additional information and/or context to enforce:
For the following content, we include a warning label so that people are aware that the content may be sensitive:
  • Imagery posted by a news agency that depicts child nudity in the context of famine, genocide, war crimes, or crimes against humanity, unless accompanied by a violating caption or shared in a violating context, in which case the content is removed
We may remove imagery depicting the aftermath of non-sexual child abuse when reported by news media partners, NGOs, or other trusted safety partners.
We may remove content that identifies alleged victims of child sexual exploitation through means other than name or image if content includes information that is likely to lead to the identification of the individual.
We may remove content created for the purpose of identifying a private minor if there is a risk to the minor’s safety, when requested by law enforcement, a government, or a trusted partner, or when the content is self-reported by the minor or the minor’s parent/legal guardian.
User experiences
See examples of what enforcement looks like for people on Facebook: reporting something you don’t think should be on Facebook, being told you’ve violated our Community Standards, and seeing a warning screen over certain content.
Note: We’re always improving, so what you see here may be slightly outdated compared to what we currently use.
User experience examples:
  • Reporting
  • Post-report communication
  • Takedown experience
  • Warning screens
Data
View the latest Community Standards Enforcement Report
Enforcement
We have the same policies around the world, for everyone on Facebook.
Review teams
Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.
Stakeholder engagement
Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.
Get help with child sexual exploitation, abuse and nudity
Learn what you can do if you see something on Facebook that goes against our Community Standards.
Visit our Help Center