JUN 12, 2023
2022-002-FB-FBR
Today, the Oversight Board selected a case referred by Meta regarding a graphic video posted on a user’s Facebook profile page which shows a victim of violence following a military coup in Sudan in October 2021.
Upon initial review, Meta took down this content for violating our policy on Violent and Graphic Content, as laid out in the Facebook Community Standards. Upon further review, however, we determined that we had removed this content in error, and we applied the newsworthiness allowance to reinstate the post with a warning screen.
Meta referred this case to the board because we found it significant and difficult: it creates tension between our values of safety and voice.
Meta does not allow posting “videos of people or dead bodies in non-medical settings if they depict dismemberment.” However, we must balance the risk of harm associated with sharing such graphic content against the public interest value of documenting human rights violations. Additionally, our newsworthiness allowance permits otherwise violating content to remain on our platforms if keeping it visible is in the public interest.
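To make that balance concrete, the enforcement decision described above can be read as a simple rule: violating graphic content is removed unless a newsworthiness allowance applies, in which case it stays up behind a warning screen. The Python sketch below is purely illustrative; every function and field name is an assumption for this example and does not reflect Meta's internal systems.

```python
from dataclasses import dataclass
from enum import Enum

class Action(Enum):
    ALLOW = "allow"
    REMOVE = "remove"
    ALLOW_WITH_WARNING = "allow_with_warning_screen"

@dataclass
class Post:
    # All fields are hypothetical and exist only for this illustration.
    depicts_graphic_violence: bool  # e.g., dismemberment in a non-medical setting
    public_interest_value: float    # reviewer-assessed, 0.0 to 1.0
    risk_of_harm: float             # reviewer-assessed, 0.0 to 1.0

def enforce(post: Post) -> Action:
    """Sketch of the remove-versus-newsworthiness balance described above."""
    if not post.depicts_graphic_violence:
        return Action.ALLOW
    # Newsworthiness allowance: keep otherwise-violating content visible when
    # its public interest value outweighs the risk of harm, behind a warning
    # screen so viewers can choose whether to see it.
    if post.public_interest_value > post.risk_of_harm:
        return Action.ALLOW_WITH_WARNING
    return Action.REMOVE
```

In practice this weighing is done by human reviewers on a case-by-case basis, as described below; the sketch only shows the shape of the decision, not how the inputs are produced.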
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision when they issue it.
We welcome the Oversight Board’s decision today on this case. Meta previously reinstated this content because it was removed in error and did not violate our policies, so no further action will be taken on it.
After reviewing the recommendations the board provided alongside its decision, we will update this page.
Meta should amend the Violent and Graphic Content Community Standard to allow videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. This content should be allowed with a warning screen so that people are aware that content may be disturbing. The Board will consider this recommendation implemented when Meta updates the Community Standard.
Our commitment: Given how closely linked the work on Sudan Graphic Video recommendations #1 and #2 is, we will respond to and conduct implementation work on these recommendations concurrently, and progress updates on them will be bundled. In response to these recommendations, we plan to conduct a policy development process to determine whether we should allow all graphic videos of people or dead bodies on our platforms with a warning screen when shared for the purpose of raising awareness of or documenting human rights abuses. This policy development will include assessing what criteria we should consider to identify such content.
Considerations: Under our Violent and Graphic Content Community Standard, we generally remove videos of people or dead bodies in a non-medical setting where they are particularly graphic (for example, showing dismemberment or visible internal organs). Viewing this type of graphic content can potentially be harmful for people on our platforms, but there are instances where people post such videos to document or raise awareness of human rights abuses. We have typically handled these situations on a case-by-case basis because the assessments are nuanced and involve a careful balance between our values of privacy, safety, and voice.
It will be difficult to implement a general allowance for this type of graphic content given the scale at which we operate, but we recognize the importance of raising awareness of and documenting human rights abuses and are committed to assessing the feasibility of introducing this change. In order to effectively weigh tradeoffs and considerations in this space, we will conduct a robust policy development process to inform any potential policy and enforcement changes.
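As a purely hypothetical illustration of what "criteria to identify such content" could look like at scale, the sketch below scores a video against a weighted checklist of documentary-intent signals. Every signal name and weight is invented for this example; the actual criteria would be the output of the policy development process described above.

```python
# Hypothetical signals a policy development process might weigh when deciding
# whether a graphic video was shared to document human rights abuses.
# Every name and weight below is invented for this sketch.
AWARENESS_SIGNALS = {
    "caption_condemns_violence": 2,      # the poster's framing of the video
    "identifies_time_and_place": 1,      # documentary value
    "posted_by_news_or_ngo_account": 2,  # source context
    "event_matches_known_crisis": 1,     # e.g., the October 2021 coup in Sudan
}

def raises_awareness(signals: dict[str, bool], threshold: int = 3) -> bool:
    """Return True when enough weighted signals suggest the video was shared
    to document or raise awareness of human rights abuses."""
    score = sum(
        weight
        for name, weight in AWARENESS_SIGNALS.items()
        if signals.get(name, False)
    )
    return score >= threshold
```

A checklist like this trades nuance for consistency, which is exactly the tension noted above between case-by-case review and enforcement at scale.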
Next steps: We plan to kick off this policy development process soon and will provide updates in a future Quarterly Report.
Meta should undertake a policy development process that develops criteria to identify videos of people or dead bodies when shared for the purpose of raising awareness of or documenting human rights abuses. The Board will consider this recommendation implemented when Meta publishes the findings of the policy development process, including information on the process and criteria for identifying this content at scale.
Our commitment: Given how closely linked the work on Sudan Graphic Video recommendations #1 and #2 is, we will respond to and conduct implementation work on these recommendations concurrently, and progress updates on them will be bundled. Please see our response to Sudan Graphic Video recommendation #1 for our commitment, considerations, and next steps.
Considerations: Please see our response to Sudan Graphic Video recommendation #1.
Next steps: Please see our response to Sudan Graphic Video recommendation #1.
Meta should make explicit in its description of the newsworthiness allowance all the actions it may take (for example, restoration with a warning screen) based on this policy. The Board will consider this recommendation implemented when Meta updates the policy.
Our commitment: In response to this recommendation, we will update our “Approach to Newsworthy Content” page in the Transparency Center to outline the actions we may take as part of our newsworthiness allowance.
Considerations: We are committed to sharing more information about our newsworthiness policy, and by the end of this month, August 2022, we will update our “Approach to Newsworthy Content” page in the Transparency Center to include details about the actions we may take when content is considered potentially newsworthy. We will note on this page that, in some cases, this may include allowing content on the platform but applying a warning screen in accordance with our policies. We will also include new details on newsworthiness allowances, including the number of allowances issued over the past year and the number of those allowances issued for content shared by politicians.
Next steps: The board issued similar recommendations in recommendation #11 in the Former President Trump's Suspension case and recommendation #2 in the Post Depicting Protests in Colombia While Using a Slur case. We will update our explanation of our newsworthiness allowance this month and subsequently consider these recommendations complete.
To ensure users understand the rules, Meta should notify users when it takes action on their content based on the newsworthiness allowance, including the restoration of content or the application of a warning screen. The user notification may link to the Transparency Center explanation of the newsworthiness allowance. The Board will consider this implemented when Meta rolls out this updated notification to users in all markets and demonstrates, through enforcement data, that users are receiving this notification.
Our commitment: We are continuing to evaluate ways to inform people when content that violates our policies is left on our platforms due to a newsworthiness allowance. We are working to build the tooling and processes needed to collect this data consistently and accurately so that we can label all the content we deem newsworthy, and we aim to implement more user notifications by the end of the year.
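One way to read the tooling work just described: each newsworthiness allowance would need to be captured as a structured record so the affected person can be notified and the allowance counted in transparency reporting. The sketch below shows one possible shape for such a record and notification; the schema, field names, and link target are assumptions for illustration, not Meta's actual systems.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative link target; the notification could point to the
# "Approach to Newsworthy Content" explanation in the Transparency Center.
LEARN_MORE_URL = "https://transparency.fb.com/features/approach-to-newsworthy-content/"

@dataclass
class NewsworthinessAllowance:
    # Hypothetical record; all field names are illustrative only.
    content_id: str
    policy_violated: str   # e.g., "Violent and Graphic Content"
    action_taken: str      # e.g., "restored_with_warning_screen"
    issued_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def build_user_notification(allowance: NewsworthinessAllowance) -> dict:
    """Assemble the notification the recommendation asks for: tell the person
    what action was taken and link to the newsworthiness explanation."""
    return {
        "content_id": allowance.content_id,
        "message": (
            "Your content was kept on the platform under our newsworthiness "
            f"allowance ({allowance.action_taken})."
        ),
        "learn_more_url": LEARN_MORE_URL,
    }
```

Recording the allowance and generating the notification from the same record is what would let enforcement data demonstrate, as the board asks, that users are actually receiving these notices.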
Considerations: The board issued a similar recommendation (recommendation #4) in the Post Depicting Protests in Colombia While Using a Slur case. As we shared in our Q1 2022 Quarterly Update on that recommendation, we have updated the introduction to the Community Standards in our Transparency Center to link to more information about our approach to newsworthiness. As shared in response to recommendation #3, we will soon expand that update with additional detail on the “Approach to Newsworthy Content” page in the Transparency Center, including a description of the actions we may take as part of our newsworthiness allowance. We are also continuing to evaluate ways to inform people when content assessed as violating our policies is left on our platforms because it is considered newsworthy. In line with our findings, we aim to implement some additional notifications to people on our platforms by the end of the year. We will continue to report on our progress in the next Quarterly Update.
Due to the similarity between the two recommendations, we will be tracking all future progress on this recommendation under Post Depicting Protests in Colombia While Using a Slur recommendation #4.
Next steps: We will share future progress on this recommendation under Post Depicting Protests in Colombia While Using a Slur recommendation #4 and report on it in the next Quarterly Update.