Meta
Transparency Center
Account Integrity


Policy details

CHANGE LOG
Today · Mar 27, 2025 · Dec 26, 2024 · Sep 25, 2024 · Mar 28, 2024 · Jan 25, 2024 · Aug 31, 2023 · Dec 22, 2022 · Oct 27, 2022 · Feb 24, 2022 · Oct 28, 2021 · Jul 29, 2021 · Mar 25, 2021 · Dec 17, 2020 · Nov 18, 2020 · May 28, 2020 · Oct 25, 2019 · Jul 1, 2019 · Aug 31, 2018 · Jul 27, 2018 · Jun 29, 2018
Policy Rationale
To maintain a safe environment and empower free expression, we restrict or remove accounts that are harmful to the community. We use a combination of automated and manual systems to restrict and remove accounts that egregiously or persistently violate our policies across any of our products.
Because account removal is a serious action, whenever possible, we aim to give our community opportunities to learn our rules and follow our Community Standards. For example, a notification is issued each time we remove content, and in most cases we also provide people with information about the nature of the violation and any restrictions that are applied. Our enforcement actions are designed to be proportional to the severity of the violation, the history of violations on the account, and the risk or harm posed to the community. Continued violations, despite repeated warnings and restrictions, or violations that pose severe safety risks will lead to an account being disabled.
Learn more about how Meta enforces its policies and restricts accounts in the Transparency Center.
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that:
  • Violate our Community Standards involving egregious harms, including those we refer to law enforcement due to the risk of imminent harm to individual or public safety
  • Violate our Community Standards involving any harms that warrant referral to law enforcement due to the risk of imminent harm to individual or public safety
  • Violate our Advertising Standards involving deceptive or dangerous business harms
  • Persistently violate our Community Standards by posting violating content and/or managing violating entities or business assets
  • Persistently violate our Advertising Standards
  • Demonstrate activity or behavior indicative of a clear violating purpose
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that are:
  • Owned by the same person or entity as an account that has been disabled
  • Created or repurposed to evade a previous account or entity removal, including those assessed to have common ownership and content as previously removed accounts or entities
  • Created to contact a user that has blocked an account
  • Otherwise used to evade our enforcement actions or review processes
We may restrict or disable accounts, other entities (Pages, groups, events) or business assets (Business Managers, ad accounts) that demonstrate:
  • Close linkage with a network of accounts or other entities that violate or evade our policies
  • Coordination within a network of accounts or other entities that persistently or egregiously violate our policies
  • Activity or behavior indicative of a clear violating purpose through a network of accounts
We will work to restrict or disable accounts or other entities (Pages, groups, events), or business assets (Business Managers, ad accounts) that engage in off-platform activity that can lead to harm on our platform, including those:
  • Owned by a convicted sex offender (someone convicted of offenses related to the sexual abuse of children or adults)
  • Owned by a Designated Entity or run on their behalf
  • Prohibited from receiving our products, services or software under applicable laws
In the following scenarios, we may request additional information about an account to ascertain ownership and/or permissible activity:
  • Compromised accounts
  • Creating or using an account or other entity through automated means, such as scripting (unless the scripting activity occurs through authorized routes and does not otherwise violate our policies)
  • Empty accounts with prolonged dormancy
User experiences
See examples of what enforcement looks like for people on Facebook: reporting something you don’t think should be on Facebook, being told you’ve violated our Community Standards, and seeing a warning screen over certain content.
Note: We’re always improving, so what you see here may be slightly outdated compared to what we currently use.
USER EXPERIENCES
  • Reporting
  • Post-report communication
  • Takedown experience
  • Warning screens
Data
View the latest Community Standards Enforcement Report
Enforcement
We have the same policies around the world for everyone on Facebook.
Review teams
Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.
Stakeholder engagement
Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.
Get help with account integrity
Learn what you can do if you see something on Facebook that goes against our Community Standards.
Visit our Help Center