Footage of Terrorist Attack in Moscow Bundle

Last updated: January 17, 2025
2024-038-FB-UA, 2024-039-FB-UA, 2024-040-FB-UA
Today, July 11, 2024, the Oversight Board selected a case bundle appealed by Facebook users regarding three pieces of content. Each piece of content contains a video that depicts the moment of a terrorist attack on visible victims at a concert venue in Moscow with a caption that condemns the attack or expresses support for the victims.
In each instance, Meta took down this content for violating our Dangerous Organizations and Individuals policy, as laid out in the Facebook Community Standards.
Under our Dangerous Organizations and Individuals policy, “we do not allow content that glorifies, supports, or represents events that Meta designates as violating violent events.” Meta internally designated the Moscow attack as a violating violent event (a terrorist attack) on March 22, 2024. As a result, we remove “any third party imagery depicting the moment of the attack on visible victims,” even if it is shared to raise awareness, neutrally discuss, or condemn the attack.
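Read as enforcement logic, the rule above is a conjunction of conditions that deliberately ignores the caption’s intent. The Python sketch below is a hypothetical illustration of that logic only; the `Post` type, its fields, and `current_policy_action` are names invented here and do not correspond to any actual Meta system.

```python
from dataclasses import dataclass

# Hypothetical model for illustration; names and fields are assumptions,
# not Meta's internal representation.
@dataclass
class Post:
    depicts_moment_of_attack: bool  # third-party imagery of the attack itself
    victims_visible: bool           # victims can be seen in the footage
    caption_intent: str             # e.g. "condemn", "raise_awareness", "neutral"

def current_policy_action(post: Post, event_is_designated: bool) -> str:
    """Once Meta designates an event as a violating violent event, imagery
    depicting the moment of the attack on visible victims is removed,
    regardless of whether the caption condemns the attack or raises awareness."""
    if event_is_designated and post.depicts_moment_of_attack and post.victims_visible:
        return "remove"  # caption_intent is deliberately not consulted
    return "allow"
```

Note that `caption_intent` never enters the decision; that insensitivity to context is precisely what the Board’s decision and Recommendation 1 below address.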
We will implement the board’s decision once it has finished deliberating, and will update this post accordingly. Please see the Board’s website for the decision once it is issued.
Read the board’s case selection summary

Case decision
We welcome the Oversight Board’s decision today, November 19, 2024, on this case. The Board overturned Meta’s decisions to remove all three pieces of content. Meta will act to comply with the Board’s decision and will reinstate the content to Facebook with warning screens within 7 days.
After reviewing the Board’s recommendations, we will update this post with our initial responses to those recommendations.
Read the board’s case decision

Recommendations
On February 25, 2025, Meta responded to the board’s recommendations for this case. We are still assessing the feasibility of Recommendation 1 and are implementing Recommendation 2 in full.

Recommendation 1 (Assessing Feasibility)
To ensure its Dangerous Organizations and Individuals Community Standard is tailored to advance its aims, Meta should allow, with a “Mark as Disturbing” warning screen, third-party imagery of a designated event showing the moment of attacks on visible but not personally identifiable victims when shared in news reporting, condemnation and awareness-raising contexts.
The Board will consider this recommendation implemented when Meta updates the public-facing Dangerous Organizations and Individuals Community Standard in accordance with the above.
Our commitment: We will assess the feasibility of introducing a “Mark as Disturbing” warning screen option when third-party imagery of a designated event depicting the moment of attack is shared in the context of news reporting, condemnation, or awareness raising and does not include personally identifiable victims. This will require an assessment of the technical feasibility of implementing this option at scale, as well as an assessment of the potential impact of this option on our ability to respond quickly in moments of crisis.
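To make the proposed change concrete, here is a minimal sketch, continuing the hypothetical illustration above, of how Recommendation 1 would alter that decision logic: removal stays the default, but a “Mark as Disturbing” warning screen applies when victims are visible yet not personally identifiable and the sharing context is news reporting, condemnation, or awareness raising. All identifiers are again assumptions made for illustration, not Meta’s implementation.

```python
# Hypothetical sketch of Recommendation 1; names are illustrative assumptions.
PERMITTED_CONTEXTS = {"news_reporting", "condemnation", "awareness_raising"}

def proposed_policy_action(depicts_moment_of_attack: bool,
                           victims_visible: bool,
                           event_is_designated: bool,
                           personally_identifiable: bool,
                           context: str) -> str:
    """Removal remains the default, but a warning screen replaces removal
    when victims are not personally identifiable and the sharing context
    is one the Board considers permissible."""
    if not (event_is_designated and depicts_moment_of_attack and victims_visible):
        return "allow"
    if not personally_identifiable and context in PERMITTED_CONTEXTS:
        return "apply_mark_as_disturbing_screen"
    return "remove"
```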
Considerations: Over the past several years, we’ve invested in improving the experience for people when we remove their content, and we have teams dedicated to continuing that work. As part of this work, we updated our notifications to tell people which Community Standard a post was taken down under (for example, Hate Speech or Adult Nudity and Sexual Activity), but we agree with the board that we would like to provide more context.
As part of our Dangerous Organizations and Individuals Community Standard, we define Violating Violent Events (VVEs) as an attempt or an intentional act of high-severity violence by a non-state actor against civilian targets outside the context of armed conflict or war. We designate these events, such as terrorist events or multiple-victim violence, when we determine the required signals are met and the totality of the circumstances surrounding the event warrant event designation enforcement. Upon designation, we prohibit all References, Glorification, Support, or Representation of the event or its perpetrators, and prohibit sharing certain kinds of imagery associated with the attack.
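The VVE definition above amounts to a four-part test, with designation itself remaining a separate, case-by-case judgment about signals and circumstances. A minimal sketch, assuming invented field names:

```python
from dataclasses import dataclass

# Hypothetical encoding of the public VVE definition; all names are
# assumptions made for illustration only.
@dataclass
class Event:
    high_severity_violence: bool   # attempted or intentional act
    non_state_perpetrator: bool    # carried out by a non-state actor
    civilian_targets: bool         # directed at civilian targets
    within_armed_conflict: bool    # occurred inside armed conflict or war

def meets_vve_definition(event: Event) -> bool:
    """True when the event matches the definitional criteria. Actual
    designation additionally weighs required signals and the totality of
    the circumstances, which this sketch does not model."""
    return (event.high_severity_violence
            and event.non_state_perpetrator
            and event.civilian_targets
            and not event.within_armed_conflict)
```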
We recently conducted policy development on our approach to VVEs, which included a Policy Forum discussion that the Board attended. This policy development included consultation with global experts, research, and discussions with the internal teams that respond to these events, in order to align on changes to our previous approach to violating events. We also reviewed our commitments to the Global Internet Forum to Counter Terrorism and considered how all of our Community Standards can proactively address and respond to violent incidents by removing content before it can go viral or encourage copycat behavior. At the same time, we weighed the importance of expression and of adopting proportionate penalties for sharing content that intends to condemn or raise awareness of these events. In instances where victims may be visible, we also considered our Community Standards value of dignity.
During our Policy Forum, we evaluated an option to allow this third-party content behind a Mark as Disturbing screen. This option raised concerns about the possibility of the content being repurposed by adversarial actors to glorify attacks or attackers, or to normalize acts of violence. However, we acknowledge the Board’s recommendation to further consider these tradeoffs, and as we note in our response to Recommendation 2, we have implemented several changes to the VVE definition following our Policy Forum.
We will assess further approaches to violating events that balance voice, safety, and dignity in the aftermath of these events. Given the recency of our policy development on violating events, the complexity of adding a Mark as Disturbing option to a Community Standards area that does not use this enforcement option at scale, and other key considerations, we expect this assessment will take time to complete. Due to the scope and complexity of this work, we expect to be able to provide a more detailed update on the status of this recommendation in 2026. We will share updates in future reports to the Oversight Board.

Recommendation 2 (Implementing Fully)
To ensure clarity, Meta should include a rule under the “We remove” section of the Dangerous Organizations and Individuals Community Standard and move the explanation of how Meta treats content depicting designated events out of the policy rationale section and into this section.
The Board will consider this recommendation implemented when Meta updates the public-facing Dangerous Organizations and Individuals Community Standard moving the rule on footage of designated events to the “We remove” section of the policy.
Our commitment: We plan to update our Community Standards with further details explaining our approach to Violating Violent Events, and we will consider this recommendation implemented in full later this year.
Considerations: This year, we plan to update our Community Standards with our definition of Violating Violent Events (VVEs). As noted above, we define a VVE as an attempt or an intentional act of high-severity violence by a non-state actor against civilian targets outside the context of armed conflict or war. This external update, along with the updates to our internal approach to VVEs, was the result of extensive policy development and a Policy Forum discussion earlier in the year. That policy development focused on the treatment of imagery from a violating event, resulting in updates to our overall approach to content in the aftermath of these events. Once this change is implemented, we will provide an update in a future report to the Board.