In May 2023, a Facebook user posted a video showing people attempting to break into a cell holding a suspected gang member at a police station in Haiti. The video included audio of someone shouting "we're going to break the lock" and "they're already dead," as well as a call for lynching. A Trusted Partner reported the potentially violating video to Meta 11 days after it was posted. Meta conducted a real-world harm assessment, which led to the removal of the content eight days later. In its decision in the Haitian Police Station case, the Board found that the content violated Facebook's Violence and Incitement Community Standard but disagreed with Meta on the application of the newsworthiness allowance. The Board concluded that, given the delay between posting and enforcement, Meta should have applied the newsworthiness allowance and kept the content up. In response, the Board recommended that Meta assess the timeliness and effectiveness of its responses to content escalated through the Trusted Partner Program to address the risk of harm, particularly where Meta has limited proactive moderation tools or processes to identify and assess content.
What was the impact of Meta’s implementation of this recommendation?
In response to a recommendation from the Board in December 2023, Meta committed to sharing information on the timeliness and effectiveness of its responses to content reported through the Trusted Partner Program. The program fosters deeper collaboration with civil society organizations to enhance Meta's ability to identify and act on content that violates its policies, and it is a key part of Meta's efforts to improve its content policies, enforcement processes, and products to help keep users safe.
To highlight ongoing progress on implementing this recommendation, we are sharing metrics broken down at both the global and regional levels. We use the United Nations' M49 standard for geographic regional grouping, as defined below:
Americas: Latin America and the Caribbean, Northern America
Africa: Northern Africa, Sub-Saharan Africa
Asia: Central Asia, Eastern Asia, South-eastern Asia, Southern Asia, Western Asia
Europe: Eastern Europe, Northern Europe, Southern Europe, Western Europe
Oceania: Australia and New Zealand
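As an illustration, the regional grouping above can be expressed as a lookup table from M49 subregions to the five regions used in this report. The subregion names come from the list above; the aggregation function and any report counts passed to it are hypothetical, for illustration only.

```python
# Map UN M49 subregions to the five regions used in this report
# (grouping taken from the list above).
M49_REGIONS = {
    "Latin America and the Caribbean": "Americas",
    "Northern America": "Americas",
    "Northern Africa": "Africa",
    "Sub-Saharan Africa": "Africa",
    "Central Asia": "Asia",
    "Eastern Asia": "Asia",
    "South-eastern Asia": "Asia",
    "Southern Asia": "Asia",
    "Western Asia": "Asia",
    "Eastern Europe": "Europe",
    "Northern Europe": "Europe",
    "Southern Europe": "Europe",
    "Western Europe": "Europe",
    "Australia and New Zealand": "Oceania",
}

def aggregate_by_region(counts_by_subregion):
    """Roll hypothetical per-subregion report counts up to regional totals."""
    totals = {}
    for subregion, count in counts_by_subregion.items():
        region = M49_REGIONS[subregion]
        totals[region] = totals.get(region, 0) + count
    return totals
```

For example, `aggregate_by_region({"Northern Africa": 10, "Sub-Saharan Africa": 5})` yields `{"Africa": 15}`.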
To evaluate the Program's timeliness and effectiveness in responding to content reported by Trusted Partners, we use the following metrics:
Over the two-year period from Q2 2022 to Q2 2024, Meta made substantial improvements in its response time to content reported through the Trusted Partner Program. Globally, the program received over 11,800 content reports in Q2 2022, rising to over 49,200 in Q2 2024, a more than four-fold increase. At the regional level, the program also received an increased volume of reported content across all regions over the same period, as shown below:
The growing volume of Trusted Partner reports reflects the expansion of the program to cover new geographies and geopolitical dynamics that may give rise to harmful online content; notable examples include the Russian invasion of Ukraine, the conflict in Ethiopia, and the Israel-Hamas conflict. Significant investment in training, engagement, and operational improvements has also played a crucial role in increasing the volume of Trusted Partner reports. Between 2022 and 2024, Meta conducted structured engagements with Trusted Partners to strengthen their understanding of content policies and share best practices for reporting content. In parallel, Meta regularly consulted with Trusted Partners to gather qualitative insights on content trends and to enhance proactive detection and enforcement, for example by collecting hashtags and providing investigative signals for network takedowns. More details on the impact of this work are available in Meta's Annual Human Rights Reports for 2022 and 2023.
Globally, the percentage of cases resolved within 5 days of escalation increased from 69% in Q2 2022 to 81% in Q2 2024, reflecting a 12 percentage point improvement. At a regional level, there were increases in the percentage of cases resolved within 5 days of escalation over time, as shown below:
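The figure above is a percentage-point change, i.e. a simple difference between two resolution rates rather than a relative increase. A minimal sketch of both calculations (function name and sample inputs are ours; the 69% and 81% figures come from the text):

```python
def pct_resolved_within(days_to_resolve, threshold_days=5):
    """Share of cases resolved within the threshold, as a percentage."""
    within = sum(1 for d in days_to_resolve if d <= threshold_days)
    return 100.0 * within / len(days_to_resolve)

# Percentage-point improvement between two quarters (rates from the text):
rate_q2_2022 = 69.0
rate_q2_2024 = 81.0
improvement_pp = rate_q2_2024 - rate_q2_2022  # 12 percentage points
```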
Incremental improvements in review efficiency enable Meta to address high-risk content reported by Trusted Partners in a timely manner and to prevent harm to users and the communities they represent.
Globally, the median turnaround time for Trusted Partner reports decreased from 1.02 days in Q2 2022 to 0.87 days in Q2 2024, reflecting a 15% improvement in review efficiency. Year-on-year reductions in turnaround time, despite increases in report volume, demonstrate the impact of improved review efficiency as a result of training efforts, streamlined enforcement systems, and new tooling.
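The 15% figure above is the relative reduction in the median turnaround time between the two quarters. A short sketch using the medians quoted in the text (the per-report sample data is hypothetical):

```python
import statistics

def median_turnaround(days_per_report):
    """Median turnaround time, in days, across a set of reports."""
    return statistics.median(days_per_report)

# Relative improvement in review efficiency (medians from the text):
median_q2_2022 = 1.02
median_q2_2024 = 0.87
relative_improvement = (median_q2_2022 - median_q2_2024) / median_q2_2022
# (1.02 - 0.87) / 1.02 ≈ 0.147, i.e. roughly a 15% reduction
```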
Globally, the number of reports escalated for further policy review increased from over 40 in Q2 2022 to over 600 in Q2 2024, a 15-fold increase. This growth shows the value of the Trusted Partner Program in surfacing novel harms that are not adequately addressed by existing content policies or enforcement processes. In these cases, Trusted Partner reports can inform updates to policies and enforcement protocols so that future reports of the same nature can be processed more efficiently. The significant increase in such cases reflects the maturity of the program and Trusted Partners' growing ability to surface borderline and complex cases that help Meta improve its policies and enforcement processes.