2023-018-FB-MR
Today, the Oversight Board selected a case referred by Meta regarding a video posted to Facebook of communal violence in the Indian state of Odisha. The video depicts an individual on a balcony throwing an object at participants of a religious procession, and members of the procession retaliating by throwing stones back. The accompanying audio contains multiple calls to “beat” or “hit.” In the weeks before the video was posted, numerous violent clashes between Hindus and Muslims were reported in a number of states across India.
Meta removed this content from Facebook for violating our Violence and Incitement policy, as laid out in the Facebook Community Standards.
Meta referred this case to the board because we found it significant and difficult, as it creates tension between our values of safety and voice.
Meta prohibits content containing “threats that could lead to death (and other forms of high-severity violence)...targeting people or places.” Although a target is not expressly identified, it is clear from the context that the intended target is the individual on the balcony who threw the object at the procession. This is consistent with our policy, which prohibits content calling for high-severity violence even when the target is not named but is instead depicted. Furthermore, while Meta recognizes there may be value in attempts to alert others to impending violence and current events, Meta determined that the content does not fall under any policy exception or allowance.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision when it is issued.
We welcome the Oversight Board’s decision today on this case. The board upheld Meta’s decision to remove the content, so we will take no further action related to this case or the content.
Because there are no new recommendations associated with this case, no further updates will be made to this page. Refer to our Quarterly Updates on the Oversight Board to track the implementation progress of recommendations from relevant previous decisions.