When we launched the Oversight Board, we committed to considering and transparently responding to all of the board’s recommendations.
Today, we’re publishing our first quarterly update, covering Q1 2021. It provides (1) information about the cases Meta has referred to the board and (2) an update on our progress in implementing the board’s recommendations. These quarterly updates are designed to provide regular check-ins on this long-term work while sharing more about how we approach these challenges. They are meant to hold us accountable to the board and the public.
In addition to giving users direct access to appeal content decisions to the board, we regularly and proactively identify some of the most significant and difficult content decisions we’ve made on our platform and ask the board to review them. While the board notes when cases have been referred by Meta, we haven’t previously disclosed details about referred cases that the board did not select.
We refer cases involving issues that are severe, large-scale, and/or important for public discourse. We also look for content decisions that raise questions about current policies or their enforcement, with strong arguments on both sides for either removing or leaving up the content under review. We discussed how we prioritize content decisions for referral to the board in our Newsroom.
Meta teams with expertise in our content policies, our enforcement processes, and the cultural context of regions around the world review the candidate cases and provide feedback on their significance and difficulty. We refer the most significant and difficult content decisions to the board, and the board has sole discretion to accept or decline those cases. As with appeals, the board’s decisions are binding. From November 2020 through March 31, 2021, we referred 26 content decisions to the board, and the board selected three: a case about supposed COVID-19 cures; a veiled threat based on religious beliefs; and the decision to indefinitely suspend former US President Donald Trump’s account.
In the first quarter of 2021, the board issued 18 recommendations across six cases. We are implementing 14 recommendations fully or in part, still assessing the feasibility of implementing three, and taking no action on one. The size and scope of the board’s recommendations go beyond the policy guidance we first anticipated when we set up the board, and several require multi-month or multi-year investments. The recommendations touch on how we enforce our policies, how we inform users of the actions we’ve taken and what they can do about them, and additional transparency reporting. We welcome these recommendations: the changes they have sparked make Meta more transparent with users and the public, more consistent in how we apply our policies, and more proportional in our enforcement.
For example, last quarter, in response to the board’s recommendations, we launched and are continuing to test new user experiences that are more specific about why we remove content. We’ve made our hate speech notifications more specific by using an additional classifier that predicts what kind of hate speech is in the content: violence, dehumanization, mocking hate crimes, visual comparison, inferiority, contempt, cursing, exclusion, and/or slurs. People using Facebook in English now receive more specific messaging when they violate our hate speech policy, and we’ll roll out these more specific notifications to other languages in the future. Also as a result of the board’s recommendations, we’re running tests to assess the impact of telling people whether automation was involved in an enforcement decision. And we’ve updated our Dangerous Organizations and Individuals policy, creating three tiers of content enforcement for different designations of severity and adding definitions of key terms.
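To make the notification change concrete, here is a minimal sketch of how subtype predictions might map to more specific removal notices. It assumes a hypothetical multi-label classifier has already tagged a post with zero or more of the subtypes listed above; the function, message wording, and fallback behavior are illustrative assumptions, not Meta’s actual implementation.

```python
# Illustrative sketch only: the subtype labels come from the post above, but the
# classifier, message wording, and routing logic are hypothetical.

# One tailored notice per predicted hate speech subtype.
SUBTYPE_MESSAGES = {
    "violence": "Your post was removed because it appears to call for violence.",
    "dehumanization": "Your post was removed because it compares people to animals or objects.",
    "mocking_hate_crimes": "Your post was removed because it mocks victims of hate crimes.",
    "visual_comparison": "Your post was removed because it contains a demeaning visual comparison.",
    "inferiority": "Your post was removed because it claims a group of people is inferior.",
    "contempt": "Your post was removed because it expresses contempt toward a protected group.",
    "cursing": "Your post was removed because it directs cursing at a protected group.",
    "exclusion": "Your post was removed because it calls for people to be excluded.",
    "slurs": "Your post was removed because it contains a slur.",
}

# Generic notice, used when no subtype is confidently predicted (or, per the
# post, for languages the subtype classifier does not yet cover).
GENERIC_MESSAGE = "Your post was removed for violating our hate speech policy."


def build_notification(predicted_subtypes: list[str]) -> str:
    """Return the most specific removal notice the classifier output supports."""
    for subtype in predicted_subtypes:
        if subtype in SUBTYPE_MESSAGES:
            return SUBTYPE_MESSAGES[subtype]
    return GENERIC_MESSAGE


# Example: a post tagged as dehumanizing speech gets the specific notice;
# an untagged post falls back to the generic one.
print(build_notification(["dehumanization"]))
print(build_notification([]))
```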
We hope our responses also add to the dialogue around the challenges of moderating content at scale by providing more insight into the tradeoffs involved. Where we disagree with a board recommendation, in whole or in part, or where implementation will take a long time, we explain why.
The board’s impact comes not only from its binding decisions and its recommendations on our policies and processes, but also from the public discourse surrounding the cases. We welcome feedback from the board and the public on how we implement the recommendations and how we can continue to improve.
See the full update for more information.