Every day, teams at Meta make difficult decisions about what content should stay up and what should come down based on the Facebook Community Standards or Instagram Community Guidelines. But given the size of our community and the reach of our platform, we created the Oversight Board to bring accountability to those decisions.
In 2018, Meta CEO Mark Zuckerberg shared a blueprint outlining new and better ways for platforms like Facebook to remain accountable and to bring legitimacy to the rules that govern large communities on the internet.
With input from scholars and experts, Meta put this blueprint into action and devised an oversight board to provide an independent check on some of the most significant and difficult content decisions we make. The guiding idea behind the board was simple: we should not make so many important decisions about free expression and safety on our own.
Next, we undertook a global consultation process to better understand how to turn this vision into an institution. The consultation drew on experts from around the world, including academics, lawyers, designers and technologists, as well as members of the public. With this feedback, we built the structures and documents that serve as the foundation of the board’s governance, including drafting a charter, establishing an independent trust and developing the board’s bylaws.
In May 2020, the Oversight Board’s first 20 members were announced: an esteemed and thoughtful group whose members have worked in a variety of positions, including as professors, journalists and heads of state.
Following this announcement, members underwent training on our Community Standards and Community Guidelines, policy development processes, enforcement frameworks and the types of content decisions in scope for the board. Members also had a rigorous orientation around a new case management tool developed by Meta. This tool allows members to securely access and review pertinent case information from anywhere in the world.
The board began hearing cases in October 2020. Since then, it has issued a number of decisions and recommendations, which have already changed the way we moderate content for the billions of people on Facebook and Instagram.
Our goal has always been for the board to exist for years to come as an important piece of our broader strategy for content moderation. We’re committed to supporting the board as it develops as an institution and expands to 40 members. This includes continually examining the board’s scope and working to bring additional types of content outlined in the bylaws into that scope, a process we’ve already begun.
We hope the board can serve as a model for the future of content governance across our industry, as it continues to provide invaluable input into how we make some of the most consequential decisions regarding freedom of expression across the globe.