Footage of Terrorist Attack in Moscow Bundle

UPDATED

JAN 17, 2025

2024-038-FB-UA, 2024-039-FB-UA, 2024-040-FB-UA

Today, July 11, 2024, the Oversight Board selected a case bundle appealed by Facebook users regarding three pieces of content. Each piece of content contains a video that depicts the moment of a terrorist attack on visible victims at a concert venue in Moscow with a caption that condemns the attack or expresses support for the victims.

In each instance, Meta took down this content for violating our Dangerous Organizations and Individuals policy, as laid out in the Facebook Community Standards.

Under our Dangerous Organizations and Individuals policy, “we do not allow content that glorifies, supports, or represents events that Meta designates as violating violent events.” Meta internally designated the Moscow attack as a violating violent event (a terrorist attack) on March 22, 2024. As a result, we remove “any third party imagery depicting the moment of the attack on visible victims,” even if shared to raise awareness, neutrally discuss, or condemn the attack.

We will implement the Board’s decision once it has finished deliberating, and will update this post accordingly. Please see the Board's website for the decision when they issue it.

Case decision

We welcome the Oversight Board’s decision today, November 19, 2024, on this case. The Board overturned Meta’s decisions to remove all three pieces of content. Meta will act to comply with the Board's decision and reinstate the content to Facebook, with warning screens, within 7 days.

After conducting a review of the recommendations provided by the Board, we will update this post with initial responses to those recommendations.

Recommendations

Recommendation 1 (Assessing Feasibility)

To ensure its Dangerous Organizations and Individuals Community Standard is tailored to advance its aims, Meta should allow, with a “Mark as Disturbing” warning screen, third-party imagery of a designated event showing the moment of attacks on visible but not personally identifiable victims when shared in news reporting, condemnation and awareness-raising contexts.

The Board will consider this recommendation implemented when Meta updates the public-facing Dangerous Organizations and Individuals Community Standard in accordance with the above.

Our commitment: We will assess the feasibility of introducing a “Mark as Disturbing” warning screen option for third-party imagery of a designated event depicting the moment of attack when it is shared in the context of news reporting, condemnation, or awareness raising and does not include personally identifiable victims. This will require assessing both the technical feasibility of implementing this option at scale and its potential impact on our ability to respond quickly in moments of crisis.

Considerations: As part of our Dangerous Organizations and Individuals Community Standard, we define Violating Violent Events (VVEs) as an attempt or an intentional act of high-severity violence by a non-state actor against civilian targets outside the context of armed conflict or war. We designate these events, such as terrorist events or multiple-victim violence, when we determine the required signals are met and the totality of the circumstances surrounding the event warrant event designation enforcement. Upon designation, we prohibit all References, Glorification, Support, or Representation of the event or its perpetrators, and prohibit sharing certain kinds of imagery associated with the attack.

We recently conducted policy development on our approach to VVEs, which included a Policy Forum discussion that the Board attended. That policy development included consultation with global experts, research, and discussions with the internal teams that respond to these events in order to align on changes to our previous approach to violating events. We also reviewed our commitments with the Global Internet Forum to Counter Terrorism and considered how all of our Community Standards can proactively address and respond to violent incidents by removing content before it goes viral or encourages copycat behavior. At the same time, we weighed the importance of expression and of adopting proportionate penalties for sharing content that intends to condemn or raise awareness about these events. In instances where victims may be visible, we also considered our Community Standards value of dignity.

During our Policy Forum, we evaluated an option to allow third-party content with a Mark as Disturbing screen. This option raised concerns about the possibility of the content being repurposed by adversarial actors to glorify attacks or attackers, or to normalize acts of violence. However, we acknowledge the Board’s recommendation to further consider these tradeoffs, and as we note in our response to recommendation 2, we have implemented several changes to the VVE definition following our Policy Forum.

We will assess further approaches to violating events that balance voice, safety, and dignity in the aftermath of these events. Given the recency of our policy development on violating events, the complexity of adding a Mark as Disturbing option to a Community Standards area that does not use this enforcement option at scale, and other key considerations, we expect this assessment will take time to complete. Due to the scope and complexity of this work, we expect to be able to provide a more detailed update on the status of this recommendation in 2026. We will share updates in future reports to the Oversight Board.

Recommendation 2 (Implementing Fully)

To ensure clarity, Meta should include a rule under the “We remove” section of the Dangerous Organizations and Individuals Community Standard and move the explanation of how Meta treats content depicting designated events out of the policy rationale section and into this section.

The Board will consider this recommendation implemented when Meta updates the public-facing Dangerous Organizations and Individuals Community Standard moving the rule on footage of designated events to the “We remove” section of the policy.

Our commitment: We plan to update our Community Standards with further details explaining our approach to Violating Violent Events and consider this recommendation implemented in full later this year.

Considerations: This year we plan to update our Community Standards with our definition of Violating Violent Events (VVEs). As noted above, we define a VVE as an attempt or an intentional act of high-severity violence by a non-state actor against civilian targets outside the context of armed conflict or war. This external update, together with updates to our internal approach to VVEs, was the result of extensive policy development and a Policy Forum discussion earlier in the year. Our policy development focused on the treatment of imagery from a violating event, resulting in updates to our overall approach to content in the aftermath of these events. Once this change is implemented, we will provide an update in a future report to the Board.