False Positive Volume

UPDATED DEC 11, 2025
What is false positive volume
False positive volume measures the total number of non-violating content items that are incorrectly removed by our enforcement systems or human reviewers on Facebook and Instagram. Unlike enforcement precision, which is a percentage, false positive volume is an estimated count that quantifies the real-world impact of enforcement mistakes.
Another way to think about false positive volume is: how many times did we take an action to remove content that actually complied with our Community Standards? This metric helps us understand the scale of unintended consequences and the number of users or creators affected by enforcement errors.
How we measure false positive volume
False positive volume is estimated by combining the volume of actioned content with measured precision over the same period of time. Because overall platform precision is weighted toward the areas with the highest volume of actions, we avoid using it directly to calculate false positive volume. Instead, we measure precision for narrower subsets of the platform, and then combine each subset's precision with its corresponding enforcement volume to estimate false positive volume for that subset. The subset-level estimates are then aggregated back together to produce the overall platform false positive volume.
For example, the Community Standards Enforcement Report shows that we take enforcement actions on hundreds of millions of pieces of content on Facebook each quarter for spam, while fewer than 10 million pieces of content are actioned for violence and incitement. In this case, it would not be appropriate to apply an overall precision dominated by spam enforcement to the enforcement volume for violence and incitement. Instead, we measure precision for each harm independently and then apply the harm-level precision to calculate false positive volume by harm before summing to the final total value for Facebook.
In the event that a specific granularity does not have enough samples to reliably calculate precision, we leverage the precision at the next nearest level of aggregation possible.
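The approach above can be sketched in a few lines of code. This is an illustrative simplification, not our production pipeline: the harm names, volumes, precision values, and the sample-count threshold are all hypothetical, and each subset's false positive volume is estimated as volume × (1 − precision), falling back to a broader precision estimate when a subset has too few samples.

```python
# Hypothetical sketch of per-harm false positive volume estimation.
# All numbers and names below are illustrative, not real measurements.

# Per harm area: (actioned volume, measured precision, precision sample count)
harm_stats = {
    "spam": (450_000_000, 0.95, 12_000),
    "violence_incitement": (8_000_000, 0.90, 3_000),
    "bullying": (2_000_000, 0.88, 40),  # too few samples to trust precision
}

MIN_SAMPLES = 100          # assumed reliability threshold (hypothetical)
FALLBACK_PRECISION = 0.94  # assumed precision at the next aggregation level

def false_positive_volume(stats, min_samples=MIN_SAMPLES,
                          fallback_precision=FALLBACK_PRECISION):
    """Sum volume * (1 - precision) across harm areas, falling back to a
    broader precision estimate for harms with too few labeled samples."""
    total = 0.0
    for harm, (volume, precision, samples) in stats.items():
        p = precision if samples >= min_samples else fallback_precision
        total += volume * (1.0 - p)
    return total
```

With these made-up inputs, spam contributes the bulk of the total despite having the highest precision, which is exactly why a single platform-wide precision would misstate lower-volume harms.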
Why we measure false positive volume
We measure false positive volume to understand the real-world impact of enforcement mistakes on our community. False positives mean a person or creator was affected by incorrect actions, which can erode trust and create frustration.
Tracking this metric helps us:
  • Evaluate the effectiveness and fairness of our enforcement systems.
  • Identify when changes to automated systems or policies have led to more (or fewer) mistakes.
  • Prioritize improvements to reduce mistakes over time.
What causes false positive volume to change?
False positive volume can increase or decrease due to several factors within and outside of Meta’s control:
  • Changes in Enforcement Activity: If more content is actioned overall, even a stable enforcement precision will result in higher false positive volume.
  • Seasonal or Event-Driven Spikes: Major events or adversarial attacks can temporarily increase enforcement actions and, consequently, false positive volume.
  • Improvements in Detection: Better models or reviewer training can improve enforcement precision, resulting in fewer false positives for the same number of actions.
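The first factor above follows directly from the arithmetic: with precision held constant, false positive volume scales linearly with enforcement volume. A minimal illustration, using made-up numbers:

```python
# Illustrative only: false positive volume = volume * (1 - precision),
# so doubling enforcement volume at constant precision doubles the
# estimated false positive volume.
PRECISION = 0.95  # hypothetical constant enforcement precision

def estimated_false_positives(actioned_volume, precision=PRECISION):
    """Estimate false positives for a given actioned volume."""
    return actioned_volume * (1.0 - precision)
```

Conversely, the third factor raises precision at fixed volume, shrinking the (1 − precision) factor instead.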
Caveats
Since false positive volume is derived from our precision measurement, all precision caveats also apply to false positive volume.