2024-038-FB-UA, 2024-039-FB-UA, 2024-040-FB-UA
Today, July 11, 2024, the Oversight Board selected a case bundle appealed by Facebook users regarding three pieces of content. Each piece of content contains a video depicting the moment of a terrorist attack on visible victims at a concert venue in Moscow, with a caption that condemns the attack or expresses support for the victims.
In each instance, Meta took down this content for violating our Dangerous Organizations and Individuals policy, as laid out in the Facebook Community Standards.
Under our Dangerous Organizations and Individuals policy, “we do not allow content that glorifies, supports, or represents events that Meta designates as violating violent events.” Meta internally designated the Moscow attack as a violating violent event (a terrorist attack) on March 22, 2024. As a result, we remove “any third party imagery depicting the moment of the attack on visible victims,” even if shared to raise awareness, neutrally discuss, or condemn the attack.
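To make the enforcement rule concrete, here is a minimal, hypothetical sketch of the logic described above. The data model, function, and event identifier are illustrative assumptions, not Meta's actual systems; the sketch only encodes the rule as stated in this post: once an event is designated as a violating violent event, third-party imagery depicting the moment of the attack on visible victims is removed regardless of whether the caption condemns, neutrally discusses, or raises awareness of the attack.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical registry of events designated as "violating violent events".
# Per this post, the Moscow attack was designated on March 22, 2024.
DESIGNATED_VIOLATING_VIOLENT_EVENTS = {
    "moscow_concert_attack_2024": date(2024, 3, 22),
}

@dataclass
class Post:
    event_id: str                   # event the imagery depicts (hypothetical field)
    depicts_moment_of_attack: bool
    shows_visible_victims: bool
    caption_intent: str             # e.g. "condemn", "raise_awareness", "neutral"

def doi_policy_decision(post: Post) -> str:
    """Apply the rule as described in this post: remove third-party imagery
    depicting the moment of a designated attack on visible victims."""
    designated = post.event_id in DESIGNATED_VIOLATING_VIOLENT_EVENTS
    if designated and post.depicts_moment_of_attack and post.shows_visible_victims:
        return "remove"  # caption_intent is deliberately not consulted
    return "allow"

# A condemning caption does not change the outcome under this rule:
example = Post("moscow_concert_attack_2024", True, True, "condemn")
assert doi_policy_decision(example) == "remove"
```

Note that the caption's intent never enters the decision; that is the point of contention in this case bundle, since all three posts condemned the attack or supported the victims.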
We will implement the Board's decision once it has finished deliberating, and will update this post accordingly. Please see the Board's website for the decision when they issue it.
We welcome the Oversight Board's decision today, November 19, 2024, on this case. The Board overturned Meta's decisions to remove all three pieces of content. Meta will act to comply with the Board's decision and reinstate the content to Facebook, with warning screens, within 7 days.
After reviewing the Board's recommendations, we will update this post with our initial responses.