2023-043-FB-UA
Today, December 7, 2023, the Oversight Board selected a case to be considered under expedited review, a process by which the board issues accelerated content decisions within 30 days in exceptional circumstances.
The case was appealed by a Facebook user concerning a video that appears to show several individuals being abducted by Hamas. The video is accompanied by a caption that decries the October 7th, 2023 attack on Israel and asks people to watch the video to “gain a deeper understanding of the horror that Israel woke up to” and share the video “with the world so that everyone can witness the tragedy.”
Upon initial review, Meta took down this content for violating our policy on Dangerous Organizations and Individuals, as laid out in the Facebook Community Standards. However, taking into account policy guidance updated since the original decision, we determined that the content should now be allowed because it was shared in a condemnatory, awareness-raising context.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision once it is issued.
We welcome the Oversight Board’s decision today on this case. Both expression and safety are important to us and to the people who use our services. The board overturned Meta’s original decision to take this content down but approved of the subsequent decision to restore the content with a warning screen. Meta previously reinstated this content, so no further action will be taken on it.
As explained in our Help Center, some categories of content are not eligible for recommendations, and the board disagreed with Meta barring the content in this case from recommendation surfaces. There will be no further updates to this case, as the board did not make any recommendations as part of its decision.
For more information on Meta’s ongoing efforts regarding the Israel-Hamas War, please see our Newsroom post.