Meta’s First Quarterly Update on the Oversight Board

UPDATED 13 JUL 2022
When we launched the Oversight Board, we committed to considering and transparently responding to all of the board’s recommendations.
Today, we’re publishing our first quarterly update, covering Q1 2021, which provides (1) information about cases that Meta has referred to the board and (2) an update on our progress on implementing the board’s recommendations. These quarterly updates are designed to provide regular check-ins on the progress of this long-term work, while sharing more about how we approach these challenges. They are meant to hold us accountable to the board and the public.

Meta Referrals to the Board
In addition to providing users with direct access to appeal content decisions to the board, we regularly and proactively identify some of the most significant and difficult content decisions we’ve made on our platform and ask the board to review them. While the board notes when cases have been referred by Meta, we haven’t previously disclosed details about the cases we referred to the board that were not selected.
We refer cases involving issues that are severe, large-scale, and/or important for public discourse. Additionally, we look for content decisions that raise questions about current policies or their enforcement, with strong arguments on both sides for either removing or leaving up the content under review. We discussed how we prioritize content decisions for referral to the board in our Newsroom.
Meta teams with expertise on our content policies, our enforcement processes, and cultural context from regions around the world review the candidate cases and provide feedback on their significance and difficulty. We refer the most significant and difficult content decisions to the board, and the board has sole discretion to accept or decline those cases. As with appeals, the board’s decisions are binding. From November 2020 through March 31, 2021, we referred 26 content decisions to the board, and the board selected three: a case about supposed COVID-19 cures; a case involving a veiled threat based on religious beliefs; and a case about the decision to indefinitely suspend former US President Donald Trump’s account.

Our Progress on Non-binding Recommendations
In the first quarter of 2021, the board issued 18 recommendations in six cases. We are fully or partially implementing 14 recommendations, still assessing the feasibility of implementing three, and taking no action on one. The size and scope of the board’s recommendations go beyond the policy guidance we first anticipated when we set up the board, and several require multi-month or multi-year investments. The board’s recommendations touch on how we enforce our policies, how we inform users of actions we’ve taken and what they can do about them, and additional transparency reporting. We welcome these recommendations: the changes they have sparked make Meta more transparent with users and the public, more consistent in how we apply our policies, and more proportionate in our enforcement.
For example, last quarter, in response to the board’s recommendations, we launched and continue to test new user experiences that are more specific about why we remove content. We’ve made our hate speech notifications more specific by using an additional classifier that predicts what kind of hate speech is in the content: violence, dehumanization, mocking hate crimes, visual comparison, inferiority, contempt, cursing, exclusion, and/or slurs. People using Facebook in English now receive more specific messaging when they violate our hate speech policy, and we’ll roll out these more specific notifications to other languages in the future. And, as a result of the board’s recommendations, we’re running tests to assess the impact of telling people whether automation was involved in an enforcement decision. Additionally, we’ve updated our Dangerous Organizations and Individuals policy, creating three tiers of content enforcement for different designations of severity and adding definitions of key terms.
We hope our responses also add to the dialogue around the challenges of content moderation at scale by providing more insight into trade-offs. Where we disagree, in whole or in part, with a board recommendation, or where implementation will take a long time, we explain why.

Future Reporting
The board’s impact comes not only from its binding decisions and recommendations on our policies and processes, but also from the public discourse surrounding the cases. We welcome the board’s review of how we implement its recommendations, along with feedback from the board and the public on how we can continue to improve.
See the full update for more information.