Quarterly report on how well we're doing at enforcing our policies on the Facebook app and Instagram.
Report on how well we're helping people protect their intellectual property.
Report on government requests for people's data.
Report on when we restrict content that's reported to us as violating local law.
Report on intentional restrictions that limit people's ability to access the internet.
Quarterly report on what people see on Facebook, including the content that receives the widest distribution during the quarter.
Download current and past regulatory reports for Facebook and Instagram.
The Community Standards Enforcement Report measures how we are doing at enforcing our Community Standards on Facebook and our Community Guidelines on Instagram. Published quarterly, the report uses text, numbers and charts to detail how much violating content we took action on from one quarter to the next. It also highlights trends across content and policy areas, which can shift with external, real-world events.
A variety of teams at Meta contribute to the publication of the report each quarter, including Content Policy, Integrity and Global Operations. This is a partnership—fighting abuse on Facebook requires us to work in lockstep with policy, product and operations. Here are some reasons why the report is an important piece of our collective efforts toward transparency:
It tracks our progress.
It keeps us accountable.
It pushes us to improve more quickly.
It offers people visibility into how we enforce our policies.
It invites questions about our enforcement decisions.
It gives people access to the same data we use to measure our progress internally.
The Community Standards Enforcement Report is a type of data transparency reporting that allows people, both inside and outside of Meta, to review our enforcement decisions and judge our performance. This includes how well our enforcement technology can detect and take action on violating content, and the fairness and accuracy of review team decisions. The report also demonstrates how we continually invest in technology and people to reduce harm in our community and promote safety — all while balancing freedom of expression.
Together, these components keep us accountable for each policy area covered in the report. As we learn what matters to people and what works with each release, we can better understand how to adapt our efforts and where to improve.
The report measures several areas that together paint a picture of our enforcement efforts, with prevalence as our most important metric. They include:
Prevalence: An estimate of the percentage of times people see violating content on Facebook and Instagram.
Content actioned: The number of pieces of content (like posts, photos, videos or comments) we take action on for violating our policies.
Proactive rate: The percentage of all content or accounts acted on that we found and flagged before people reported them to us.
Appealed content: The number of pieces of content that people appealed after we took action on them for violating our policies.
Restored content: The number of pieces of content we restored after originally taking action on them.
We are continually assessing our metrics to learn how we can improve the ways we measure in our Community Standards Enforcement Report.
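As a rough illustration of how the metrics above relate to one another, here is a minimal sketch in Python. All figures and names are invented for this example; Meta's actual sampling and measurement methodology is far more involved (prevalence, for instance, is estimated from sampled views, not a simple count).

```python
def prevalence(violating_views, total_views):
    """Estimated share of content views that contained violating content."""
    return violating_views / total_views

def proactive_rate(found_proactively, total_actioned):
    """Share of actioned content found and flagged before any user report."""
    return found_proactively / total_actioned

# Hypothetical quarter (made-up figures for illustration only)
total_views = 1_000_000
violating_views = 500        # sampled views of violating content
total_actioned = 20_000      # pieces of content acted on (content actioned)
found_proactively = 19_000   # flagged before people reported them
appealed = 1_200             # actions that people appealed
restored = 300               # actions later reversed (content restored)

print(f"Prevalence:     {prevalence(violating_views, total_views):.3%}")
print(f"Proactive rate: {proactive_rate(found_proactively, total_actioned):.1%}")
print(f"Appealed:       {appealed:,} pieces of content")
print(f"Restored:       {restored:,} pieces of content")
```

Note how the proactive rate is a share of content actioned, while prevalence is measured against what people actually see, which is why the two can move independently from quarter to quarter.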