NOV 12, 2024
Protocol originally launched Jan 30, 2023
Ordinarily, when people violate the Community Standards, they may be restricted from creating content, such as posting, commenting, using Facebook Live or creating a Page. These standard restrictions, which normally range from one to 30 days, aim to prevent additional violations for a set period of time and to deter people from committing future violations. We strive to keep restrictions proportionate to the violation committed.
We may also disable accounts that repeatedly violate our Community Standards despite warnings and restrictions, or that commit certain very severe violations.
When public figures post violating content during ongoing violence or civil unrest, our standard restrictions may not be proportionate to the violation or sufficient to reduce the risk of further harm.
Public figures often have broader influence across our platforms; therefore, they may pose a greater risk of harm when they violate our policies. We define public figures as state- and national-level government officials, political candidates for those offices, people with over one million fans or followers on social media, and people who receive substantial news coverage.
When determining the appropriate restriction for a public figure who has violated our policies in ways that incite or celebrate ongoing violent events or civil unrest, we may consider:
The severity of the violation and the public figure’s history on Facebook or Instagram, including current and past violations.
The public figure’s potential influence over, and relationship to, the individuals engaged in violence.
The severity of the violence and any related physical harm.
During times of civil unrest and ongoing violence, we use the above factors to determine the appropriate length of the restriction, ranging from one month to two years. For most violations, a public figure will have a one-month restriction from creating content. More serious violations, such as sharing a link to a statement from a terrorist group in the aftermath of an attack, will merit either a six- or 12-month restriction from creating content. In cases where a violation is severe, we’ll restrict the account for two years.
At the end of the restriction period, we’ll look to experts to assess whether the risk to public safety has receded. We’ll evaluate external factors, including instances of violence, restrictions on peaceful assembly and other markers of global or civil unrest. If we determine that there is still a serious risk to public safety, we’ll extend the restriction for a set period of time and continue to re-evaluate until that risk has receded.
When a public figure’s restriction has expired and they regain access to Facebook or Instagram, they will be subject to heightened penalties to deter repeat offenses. While most new violations will trigger a one-month restriction from creating any content, more serious violations will merit a further two-year restriction. As always, we may also disable any account that persistently posts violating content despite repeated warnings and restrictions.
For content that does not violate our Community Standards but that contributes to the sort of risk that led to the public figure’s initial suspension, we may limit the distribution of such posts, and for repeated instances, may temporarily restrict access to our advertising tools. This step would mean that content would remain visible on the public figure’s account but would not be distributed in people's Feeds, even if they follow that public figure. We may also remove the reshare button from such posts, and may stop them from being recommended or run as ads.

In the event that the public figure posts content that violates the letter of the Community Standards but, under our newsworthy content policy, we assess there is a public interest in knowing that the individual made the statement that outweighs any potential harm, we may similarly opt to restrict the distribution of such posts but leave them visible on the public figure’s account. We are taking these steps in light of the Oversight Board’s emphasis on high-reach and influential users and on Meta’s role “to create necessary and proportionate penalties that respond to severe violations of its content policies.”
Update July 12, 2024
To ensure our users can hear from political candidates on our platforms, going forward, we will review the accounts subject to this protocol on a periodic basis to determine whether heightened penalties for Community Standards violations remain appropriate. We will make this determination by weighing our responsibility, as outlined by the Oversight Board, to “allow political expression” against our responsibility “to avoid serious risks to other human rights.”