by Louise Turner
Since the Oversight Board launched in 2020, I have witnessed the influence its decisions and recommendations have had on Meta, our products and our users. What started as a bold experiment has matured into a stable and impactful institution, expanding both the scope of its influence and the number of cases it decides. Our aim in sharing this report is not only to provide regular updates on the Board's impact and hold ourselves accountable, but also to showcase how a commitment to creating safe, positive experiences for users can be achieved through engagement with independent external accountability bodies.
As we approach the Board's fifth year, it continues to evolve: shaping more policy and product changes, addressing the most pressing global issues, and driving meaningful improvements across Meta.
Today, we are sharing our H1 2024 Bi-Annual Report on the Oversight Board. This is our 12th report on the Board (all past reports are available in the Transparency Center) and our first under the new bi-annual timeline, after Meta and the Board each decided to shift away from a quarterly reporting cadence. The change allows our teams to prioritize implementing the Board's recommendations and aligns transparency reporting with the half-yearly roadmaps our teams work towards. In the interim quarters, Meta will continue to provide confidential progress updates directly to the Board.
In today’s report, Meta is sharing some key developments that underscore the Board’s impact. Thanks to the Oversight Board’s recommendations, this past half we:
Began providing new information about AI-generated content, including new labels and an updated approach to related policies. This followed a series of recommendations from the Oversight Board where we agreed that providing transparency and additional context is an effective way to address these types of content at scale, while avoiding the risk of unnecessarily restricting speech.
Completed development of a new approach to retaining potential evidence of war crimes and serious violations of international human rights law.
Clarified our Coordinating Harm and Promoting Crime policy to better define content encouraging illegal participation in voting or census processes.
While showcasing the Board's influence on our operations is important, the Board has also pushed us to increase transparency about our work and its global impact. As part of this work, we are sharing recommendation impact assessments: analyses that demonstrate how each Board recommendation creates far-reaching effects beyond individual cases.
An example of this in practice is how one recommendation from the Two Buttons Meme Case Decision led to the creation of a new pathway for users to provide additional context in appeal submissions for any policy. This helps content reviewers understand when policy exceptions may apply, and enables users to better advocate for themselves in their appeals.
This work reaffirms the profound impact the Board’s partnership can have on our products and users. As we continue this journey, we look forward to sharing more on our progress. Engagement from external stakeholders, including the Board, and our commitment to innovation drive us to set new standards in digital governance.
The full H1 2024 Bi-Annual Update on the Oversight Board is available here.