Meta
Transparency Centre
Suicide, self-injury and eating disorders

Policy details

CHANGE LOG
Today
15 May 2025
3 Oct 2024
26 Jul 2024
3 Jul 2024
2 May 2024
30 Dec 2023
29 Apr 2022
1 Oct 2021
18 Dec 2020
19 Nov 2020
31 Jul 2020
5 May 2020
2 Jul 2019
27 Apr 2019
21 Mar 2019
1 Sep 2018
Policy rationale
We care deeply about the safety of the people who use our apps. We regularly consult with experts in suicide, self-injury and eating disorders to help inform our policies and enforcement, and we work with organisations around the world to provide assistance to people in distress.
While we do not allow people to intentionally or unintentionally celebrate or promote suicide, self-injury or eating disorders, we do allow people to discuss these topics because we want our services to be a space where people can share their experiences, raise awareness about these issues and seek support from one another. We may also limit the ability to view this content to adults aged 18 and older.
We remove any content that encourages suicide, self-injury or eating disorders, including fictional content such as memes or illustrations, and any self-injury content which is graphic, regardless of context. We remove content that contains instructions for extreme weight loss behaviour. We also remove content that mocks victims or survivors of suicide, self-injury or eating disorders, as well as real-time depictions of suicide or self-injury. Content about recovery from suicide, self-injury or eating disorders that is allowed but may contain imagery that could be upsetting (such as a healed scar) is placed behind a sensitivity screen, and we also limit the ability to view this content to adults aged 18 and older.
When people post or search for suicide-, self-injury- or eating disorder-related content, we will direct them to local organisations that can provide support, and if our Community Operations team is concerned about immediate harm, we will contact local emergency services to get them help. For more information, visit the Meta Safety Centre.
With respect to live content, experts have told us that if someone is saying that they intend to attempt suicide on a live stream, we should leave the content up for as long as possible, because the longer someone is talking to a camera, the more opportunity there is for a friend or family member to call emergency services. However, to minimise the risk of others being negatively affected by viewing this content, we will stop the live stream at the point at which the threat turns into an attempt. In any case, as mentioned above, we will contact emergency services if we identify that someone is at immediate risk of harming themselves.
We do not allow:
  • Content that promotes, encourages, coordinates or provides instructions for suicide, self-injury or eating disorders
  • Content that depicts graphic suicide, self-injury or eating disorder imagery
  • Content depicting a person who engaged in a suicide attempt or death by suicide
  • Imagery of people when shared together with terms associated with eating disorders
  • Content that focuses on depiction of ribs, collar bones, thigh gaps, hips, concave stomach, protruding spine, scapula, visible bones in arms or legs or hollow cheeks when shared together with terms associated with eating disorders
  • Content that contains instructions for extreme weight loss behaviour
  • Content admitting to extreme weight loss behaviour when shared together with terms associated with eating disorders
  • Content that contains instructions for restrictive dieting when shared together with terms associated with eating disorders
  • Content that mocks victims or survivors of suicide, self-injury or eating disorders who are either publicly known or implied to be experiencing, or to have experienced, suicide, self-injury or eating disorders
  • Imagery depicting body modification (e.g. tattoo, piercing, scarification, self-flagellation) when shared in a suicide or self-injury context
For the following content, we include a warning screen so that people are aware that the content may be sensitive. We also limit the ability to view the content to adults aged 18 and older:
  • Photos or videos depicting a person who engaged in euthanasia/assisted suicide in a medical setting
  • Content that depicts older instances of self-injury such as healed cuts or other non-graphic self-injury imagery in a self-injury, suicide or recovery context
  • Content that depicts ribs, collar bones, thigh gaps, hips, concave stomach, protruding spine, scapula, visible bones in arms or legs or hollow cheeks in a recovery context
For the following content, we provide resources to people and limit the ability to view the content to adults aged 18 and older:
  • Written or verbal admission of suicide, self-injury or eating disorders.
  • Vague, potentially suicidal statements or references (including memes or stock imagery about sad mood depression, presenting death as an escape or content from popular culture with emphasis on dark, depressive thoughts) in a suicide or self-injury context.
For the following content, we limit the ability to view the content to adults aged 18 and older:
  • Mocking or dismissing the concept of suicide, self-injury or eating disorders
  • Narratives that contain a description of suicide with details that go beyond the mere naming or mentioning of the act or the aftermath
  • Admitting to extreme weight loss behaviour
For the following Community Standards, we require additional information and/or context to enforce:
  • We may remove suicide notes when we have confirmation of a suicide or suicide attempt. We try to identify suicide notes using several factors, including, but not limited to:
    • Family or legal representative requests,
    • Reports from media, law enforcement or other third-party sources (e.g. government agencies, NGOs) or the Suicidal content contact form or Instagram contact form.
User experiences
See some examples of what enforcement looks like for people on Facebook, such as what it looks like to report something that you don't think should be on Facebook, to be told that you've violated our Community Standards, and to see a warning screen over certain content.
Note: We're always improving, so what you see here may be slightly outdated compared to what we currently use.
  • Reporting
  • Post-report communication
  • Takedown experience
  • Warning screens
Data
View the latest Community Standards Enforcement Report
Enforcement
We have the same policies around the world, for everyone on Facebook.
Review teams
Our global team of over 15,000 reviewers work every day to keep people on Facebook safe.
Stakeholder engagement
Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.
Get help with suicide, self-injury and eating disorders
Learn what you can do if you see something on Facebook that goes against our Community Standards.
Visit our Help Centre