2023-021-FB-MR
Today, the Oversight Board selected a case referred by Meta regarding a video posted to Facebook in which people urge vigilante violence against a person whom they suspect to be a member of a local gang. In the video, a group of people surround a locked cell containing a suspected gang member. One person yells "we're going to break the lock" and another yells "they're already dead." Lastly, a person yells a phrase referencing Bwa Kale, an anti-gang vigilante movement in Haiti. Over the last several months, gang violence in Haiti has risen considerably, and vigilante movements such as Bwa Kale are suspected of retaliating with extrajudicial violence.
Meta removed this content for violating our Violence and Incitement policy, as laid out in the Facebook Community Standards.
Meta referred this case to the board because we found it significant and difficult: it creates tension between our values of safety and voice.
Meta prohibits content containing “statements of intent to commit high-severity violence” or content that constitutes a “call to action” for people to commit high-severity violence against a targeted person. We define a “call to action” as “inviting or encouraging others to carry out harmful acts or to join the user or poster in executing harmful acts.”
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the Board’s website for the decision when they issue it.
We welcome the Oversight Board’s decision today on this case. The board overturned Meta's decision to take down the content. Meta will act to comply with the board's decision and reinstate the content within 7 days.
After reviewing the recommendation the board provided alongside its decision, we will update this post.
To address the risk of harm, particularly where Meta has no or limited proactive moderation tools, processes, or measures to identify and assess content, Meta should assess the timeliness and effectiveness of its responses to content escalated through the Trusted Partner program.
The Board will consider this recommendation implemented when Meta both shares the results of this assessment with the Board – including the distribution of average time to final resolution for escalations originating from Trusted Partners disaggregated by country, Meta's own internal goals for time to final resolution, and any corrective measures it is taking in case those targets are not met – and publishes a public-facing summary of its findings to demonstrate it has complied with this recommendation.
Our commitment: We will assess the timeliness and effectiveness of our responses to content escalated through the Trusted Partner channel on an ongoing basis and will continue to monitor and improve going forward. We will also work to provide the Oversight Board and Trusted Partners with details on this assessment by the end of Q4 2024 and continue to explore opportunities for additional transparency in the future.
Considerations: The Trusted Partner program is a key component of Meta’s efforts to help keep people safe on our platforms through improvements to our policies, enforcement processes, and products. At any given time, we have over 250 partners reporting content from over 90 countries. We have a designated vetting process which considers organizational independence, representation, and expertise when selecting Trusted Partners to enroll in our program. Once onboarded, we have established reporting channels which our Trusted Partners use to directly escalate content to relevant teams. These partners help inform our content moderation efforts through their regional expertise and by providing crucial feedback on our content policies and enforcement tools based on their real-world impacts.
We currently use several methods to track and measure the effectiveness of the Trusted Partner program. These include a series of metrics dashboards used to track the volume of incoming cases on a weekly and monthly basis, the rate at which those cases are reviewed and closed, and the average time, measured in days, to review each case. This data can be segmented by region, abuse area, and specific ongoing real-world events. We use these metrics to analyze our performance and to inform improvements where necessary. Our turnaround times may vary by region or abuse area depending on staffing and available resources, and may be affected by increased demand across regions and abuse areas. These performance metrics are routinely evaluated to identify opportunities for improvement. In addition, we work with Trusted Partners to make intake channels more efficient and to gather the insights needed to analyze context-specific policies. Cross-functional stakeholders also meet for regular Trusted Partner program reviews to identify challenges, develop mitigation strategies, and analyze trends from all civil society partners.
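The turnaround metric described above – average time to final resolution, disaggregated by country and compared against internal targets – can be sketched in a few lines. Everything here is an illustrative assumption (the record layout, country codes, and the 5-day target are invented for the example; they are not Meta's actual data model or goals):

```python
from collections import defaultdict
from datetime import date

# Hypothetical escalation records: (country, date opened, date resolved).
escalations = [
    ("HT", date(2023, 5, 1), date(2023, 5, 3)),
    ("HT", date(2023, 5, 2), date(2023, 5, 9)),
    ("BR", date(2023, 5, 1), date(2023, 5, 2)),
]

# Assumed internal target for time to final resolution, in days.
TARGET_DAYS = 5

def avg_resolution_days(records):
    """Average days from escalation to final resolution, per country."""
    days_by_country = defaultdict(list)
    for country, opened, resolved in records:
        days_by_country[country].append((resolved - opened).days)
    return {c: sum(d) / len(d) for c, d in days_by_country.items()}

by_country = avg_resolution_days(escalations)
# Countries whose average turnaround misses the assumed target.
missed = {c: avg for c, avg in by_country.items() if avg > TARGET_DAYS}
```

A real pipeline would add further segmentation (abuse area, ongoing events) and track the full distribution rather than only the mean, but the core aggregation is of this shape.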
Given the critical role of the Trusted Partner program, we regularly measure and evaluate its impact and our response to flags raised by Trusted Partners. These metrics and analyses allow us to evaluate the program’s efficiency. In response to this recommendation, we will also commit to exploring opportunities for additional transparency around these efforts. We will share updates on this progress in future public reports on the Oversight Board.