Account integrity

Policy details

CHANGE LOG
Today
27 Mar 2025
26 Dec 2024
25 Sep 2024
28 Mar 2024
25 Jan 2024
31 Aug 2023
22 Dec 2022
27 Oct 2022
24 Feb 2022
28 Oct 2021
29 Jul 2021
25 Mar 2021
17 Dec 2020
18 Nov 2020
28 May 2020
25 Oct 2019
1 Jul 2019
31 Aug 2018
27 Jul 2018
29 Jun 2018
Policy rationale
In order to maintain a safe environment and empower free expression, we restrict or remove accounts that are harmful to the community. We have built a combination of automated and manual systems to restrict and remove accounts that are used to egregiously or persistently violate our policies across any of our products.
Because account removal is a serious action, whenever possible, we aim to give our community opportunities to learn our rules and follow our Community Standards. For example, a notification is issued each time we remove content, and in most cases we also provide people with information about the nature of the violation and any restrictions that are applied. Our enforcement actions are designed to be proportional to the severity of the violation, the history of violations on the account and the risk or harm posed to the community. Continued violations, despite repeated warnings and restrictions, or violations that pose severe safety risks, will lead to an account being disabled.
Learn more about how Meta enforces its policies and restricts accounts in the Transparency Centre.
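The proportionality described above can be pictured as a simple decision rule: escalate from notification, to restriction, to disabling as violations accumulate or severity rises. The sketch below is a toy model only; the severity scale, strike thresholds and action names are assumptions for illustration, not Meta's actual enforcement systems.

from dataclasses import dataclass

@dataclass
class Violation:
    severity: int            # 1 = minor ... 5 = egregious (invented scale)
    poses_severe_risk: bool  # e.g. imminent-harm cases referred to law enforcement

def enforcement_action(prior_violations: int, new: Violation) -> str:
    """Choose an action proportional to severity, history and risk (hypothetical)."""
    if new.poses_severe_risk or new.severity >= 5:
        return "disable_account"            # severe safety risk: disable outright
    if prior_violations == 0:
        return "remove_content_and_notify"  # first offence: remove and explain
    if prior_violations < 3:
        return "apply_feature_restriction"  # repeat offence: escalating restriction
    return "disable_account"                # continued violations despite warnings

# A third low-severity violation after two earlier ones:
print(enforcement_action(2, Violation(severity=2, poses_severe_risk=False)))
# -> apply_feature_restriction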
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that:
  • Violate our Community Standards involving egregious harms, including those that we refer to law enforcement due to the risk of imminent harm to individual or public safety
  • Violate our Community Standards involving any harms that warrant referral to law enforcement due to the risk of imminent harm to individual or public safety
  • Violate our Advertising Standards involving deceptive or dangerous business harms
  • Persistently violate our Community Standards by posting violating content and/or managing violating entities or business assets
  • Persistently violate our Advertising Standards
  • Demonstrate activity or behaviour indicative of a clear violating purpose
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that are:
  • Owned by the same person or entity as an account that has been disabled
  • Created or repurposed to evade a previous account or entity removal, including those assessed to have common ownership and content as previously removed accounts or entities
  • Created to contact a user that has blocked an account
  • Otherwise used to evade our enforcement actions or review processes
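One way to picture the common-ownership checks in the list above is to compare a new account's signals against those of previously disabled accounts. This is a minimal sketch under stated assumptions: the signal names (hashed device, payment and contact identifiers) and the two-signal threshold are invented, and real systems would weigh far more evidence.

def shared_ownership_signals(candidate: dict, disabled: dict) -> set:
    """Signals (hashed identifiers) that two accounts have in common."""
    keys = ("device_hash", "payment_hash", "contact_hash")
    return {k for k in keys
            if candidate.get(k) and candidate.get(k) == disabled.get(k)}

def likely_evasion(candidate: dict, disabled_accounts: list, min_shared: int = 2) -> bool:
    """Flag a new account that overlaps strongly with any disabled account."""
    return any(len(shared_ownership_signals(candidate, d)) >= min_shared
               for d in disabled_accounts)

disabled = [{"device_hash": "d1", "payment_hash": "p1", "contact_hash": "c1"}]
print(likely_evasion({"device_hash": "d1", "payment_hash": "p1"}, disabled))  # True
print(likely_evasion({"device_hash": "d9"}, disabled))                        # False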
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that demonstrate:
  • Close linkage with a network of accounts or other entities that violate or evade our policies
  • Coordination within a network of accounts or other entities that persistently or egregiously violate our policies
  • Activity or behaviour indicative of a clear violating purpose through a network of accounts
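The network criteria above lend themselves to a graph picture: accounts are nodes, shared signals are edges, and a cluster is actioned together when enough of its members are known violators. The sketch below is again hypothetical; the graph data and the 50% threshold are invented for illustration.

from collections import defaultdict

def connected_components(edges: list) -> list:
    """Group accounts into clusters via shared-signal edges (simple DFS)."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)
    seen, clusters = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, cluster = [node], set()
        while stack:
            n = stack.pop()
            if n in cluster:
                continue
            cluster.add(n)
            stack.extend(graph[n] - cluster)
        seen |= cluster
        clusters.append(cluster)
    return clusters

def violating_clusters(edges: list, violators: set, min_share: float = 0.5) -> list:
    """Clusters where at least min_share of members are known violators."""
    return [c for c in connected_components(edges)
            if len(c & violators) / len(c) >= min_share]

edges = [("a", "b"), ("b", "c"), ("x", "y")]
print(violating_clusters(edges, violators={"a", "b"}))  # [{'a', 'b', 'c'}]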
We will work to restrict or disable accounts or other entities (Pages, groups, events), or business assets (Business Managers, ad accounts) that engage in off-platform activity that can lead to harm on our platform, including those:
  • Owned by a sex offender convicted of offences related to the sexual abuse of children or adults
  • Owned by a designated entity or run on their behalf
  • Prohibited from receiving our products, services or software under applicable laws
In the following scenarios, we may request additional information about an account to ascertain ownership and/or permissible activity:
  • Compromised accounts
  • Creating or using an account or other entity through automated means, such as scripting (unless the scripting activity occurs through authorised routes and does not otherwise violate our policies)
  • Empty accounts with prolonged dormancy
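The three scenarios above trigger a request for more information rather than removal. A minimal triage sketch follows, assuming invented field names and a 180-day dormancy cutoff; none of this reflects Meta's actual checkpoint logic.

def checkpoint_reason(account: dict):
    """Return why an account should verify ownership/activity, or None (hypothetical)."""
    if account.get("compromise_suspected"):
        return "compromised_account"
    if account.get("scripted_activity") and not account.get("authorised_automation"):
        return "unauthorised_automation"
    if account.get("post_count", 0) == 0 and account.get("days_dormant", 0) > 180:
        return "empty_dormant_account"
    return None  # no checkpoint needed

print(checkpoint_reason({"scripted_activity": True}))             # unauthorised_automation
print(checkpoint_reason({"post_count": 0, "days_dormant": 400}))  # empty_dormant_account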
User experiences
See examples of what enforcement looks like for people on Facebook: reporting something that you don't think should be on Facebook, being told that you've violated our Community Standards, and seeing a warning screen over certain content.
Note: We're always improving, so what you see here may be slightly outdated compared to what we currently use.
Example user experiences:
  • Reporting
  • Post-report communication
  • Takedown experience
  • Warning screens
Data
View the latest Community Standards Enforcement Report
Get help with account integrity
Learn what you can do if you see something on Facebook that goes against our Community Standards.
Visit our Help Centre