
Case on a comment related to the Armenian people and the Armenian Genocide

Updated June 12, 2023
2021-005-FB-UA
On March 2, 2021, the Oversight Board selected a case appealed by someone on Facebook regarding a comment with a meme depicting Turkey having to choose between “The Armenian Genocide is a lie” and “The Armenians were terrorists who deserved it.”
Meta took down this content for violating our policy on hate speech, as laid out in the Facebook Community Standards. We do not allow hate speech on Facebook, even in the context of satire, because it creates an environment of intimidation and exclusion, and in some cases, may promote real-world violence.
Read the board’s case selection summary

Case decision
We welcome the Oversight Board's decision today on this case. Meta has acted to comply with the board’s decision immediately, and this content has been reinstated.
In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly. For more information, please see our Newsroom post about how we will implement the board’s decisions. We will update this post again once any further action is taken on other identical content with parallel context.
After reviewing the recommendations the board provided alongside its decision, we will update this post.
Read the board’s case decision

Recommendations
On June 17, 2021, Meta responded to the board’s recommendations for this case. We are committing to take action on each of them.

Recommendation 1 (assessing feasibility)
Meta should make technical arrangements to ensure that notice to users refers to the Community Standard enforced by the company.
Our commitment: We agree that providing people with accurate information about why we have taken down their content is important. We are assessing how best to do so.
Considerations: Our content moderators assess a piece of content against all of our Community Standards. If content is found to be violating our Community Standards, content moderators will select which policy was violated, but at this time can only select one violation type, even if the content violates multiple Community Standards. On appeal, if a content moderator finds that the content should instead be marked for violating a different Community Standard, the reviewer assigns a new violation to reflect the correct one.
We need to explore the benefit to user experience that could come from informing users of multiple violations and multiple appeal opportunities resulting from a single piece of content. Additionally, changing the technical ability, process, and training for how content moderators select policy violations for a piece of content, and the appeals that may follow, creates new operational complexity that we need to evaluate.
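To make the constraint described above concrete, the following is a minimal sketch of the difference between a decision record that can store only one violation label and one that could store several. The class names, policy labels, and methods are illustrative assumptions for this sketch, not Meta’s actual data model or review tooling.

```python
# Illustrative sketch only: names and structure are hypothetical,
# not Meta's actual systems.
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Policy(Enum):
    HATE_SPEECH = "hate_speech"
    BULLYING = "bullying_and_harassment"
    VIOLENCE_INCITEMENT = "violence_and_incitement"


@dataclass
class SingleLabelDecision:
    """Current behavior as described: one violation type per decision."""
    content_id: str
    violated_policy: Policy  # only a single label can be recorded


@dataclass
class MultiLabelDecision:
    """Hypothetical alternative the assessment would need to evaluate."""
    content_id: str
    violated_policies: List[Policy] = field(default_factory=list)

    def add_violation(self, policy: Policy) -> None:
        # Recording every applicable standard would let user notices and
        # appeals reference each policy, at the cost of extra review work.
        if policy not in self.violated_policies:
            self.violated_policies.append(policy)


# Usage: a post that breaches two standards can carry only one label today.
current = SingleLabelDecision("post-123", Policy.HATE_SPEECH)
proposed = MultiLabelDecision("post-123")
proposed.add_violation(Policy.HATE_SPEECH)
proposed.add_violation(Policy.BULLYING)
```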
Next steps: We plan to complete our assessment and update on our progress by the end of the year.

Recommendation 2 (implementing fully)
Meta should include the satire exception, which is currently not communicated to users, in the public language of the Hate Speech Community Standard.
Our commitment: We’ll add information to the Community Standards that makes it clear where we consider satire as part of our assessment of context-specific decisions.
Considerations: This change will allow teams to consider satire when assessing potential Hate Speech violations.
Next steps: We plan to complete this update by the end of this year.

Recommendation 3 (assessing feasibility)
Meta should make sure that it has adequate procedures in place to assess satirical content and relevant context properly, including by providing content moderators with additional resources.
Our commitment: We commit to providing regional and escalations teams with the ability to evaluate content for satire through a new satire framework. We are also assessing how to apply this review at scale.
Considerations: As stated in our response to recommendation 2, we will add information to the Community Standards that makes it clear where we consider satire as part of our assessment of context-specific decisions. This work will include implementing a new satire framework, which our teams will use for evaluating potential satire exceptions. Regional teams will be able to provide satire assessments, as well as escalate pieces of content to specialized teams for an additional review when necessary.
We previously began developing a framework for assessing humor and satire and are prioritizing completing it based on the board’s recommendation. This work included over 20 engagements with academic experts, journalists, comedians, representatives of satirical publications, and advocates for freedom of expression. Stakeholders noted that humor and satire are highly subjective across people and cultures, underscoring the importance of human review by individuals with cultural context. Stakeholders also told us that “intent is key,” though it can be tough to assess. Further, true satire does not “punch down”: the target of humorous or satirical content is often an indicator of intent. And if content is simply derogatory, not layered, complex, or subversive, it is not satire. Indeed, humor can be an effective mode of communicating hateful ideas.
Given the context-specific nature of satire, we are not immediately able to scale this kind of assessment or additional consultation to our content moderators. We need time to assess the potential tradeoffs of identifying and escalating more content that may qualify for our satire exception: doing so could compete with escalations for our highest-severity policies, increase the volume of content that is escalated, and slow review times for our content moderators.
Next steps: We are completing a new satire framework that regional and escalations teams will use to evaluate content for satire. We are assessing how to apply this review at scale. We plan to complete our assessment and update on our progress by the end of the year.

Recommendation 4 (assessing feasibility)
Facebook should let users indicate in their appeal that their content falls into one of the exceptions to the Hate Speech policy.
Our commitment: We will continue working on our appeal process to allow users to provide more specific information about their appeal, including that they believe it qualifies under one of the policy exceptions. As a result of this recommendation, we are evaluating how best to provide people with the ability to indicate that their content falls into one of the exceptions of the Hate Speech policy.
Considerations: We’re continuously working to improve our appeal process, both for the benefit of user experience and for accuracy of enforcement. We have been looking into ways to give people the ability to provide additional context with their appeal. Based on the board’s recommendation, we will now explore including the ability to identify a specific policy exception in the Community Standards that a user believes applies to their content.
There are operational challenges associated with increasing the amount of information our teams review as part of the appeals process. We need time to assess whether the additional context adds to the accuracy of review and quality of user experience. We also need to consider the extent to which additional context may slow down review time, limiting the number of appeals we can review at scale.
Next steps: We plan to complete our assessment and update on our progress by the end of the year.

Recommendation 5 (assessing feasibility)
Meta should ensure appeals based on policy exceptions are prioritized for human review.
Our commitment: We need time to assess whether to include policy exceptions, as well as other user-provided context, as factors in how we prioritize appeals for human review.
Considerations: We currently prioritize appeals for human review based on a variety of factors, including how recently the post was taken down, how large the audience of the post is, and how confident we are that the initial decision was right. We are continuously working to improve our strategies for responding to appeals quickly and accurately. As described in our response to recommendation 4, we are exploring ways to give people the ability to provide additional information with their appeal, which we may be able to use to improve the quality of our appeals process.
Prioritizing a certain appeal may mean that it gets reviewed more quickly, but does not necessarily affect the accuracy of that review. We need time to analyze how changes to our appeals process and additional user-provided context affect both speed and accuracy of our scaled review.
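To illustrate how factors like these could combine into a review queue, here is a minimal sketch assuming a simple weighted-scoring approach. The factor names, weights, and the policy-exception flag are hypothetical assumptions for this sketch, not Meta’s actual prioritization logic.

```python
# Hypothetical weighted scoring for appeal prioritization; weights and
# fields are illustrative only.
from dataclasses import dataclass


@dataclass
class Appeal:
    appeal_id: str
    hours_since_takedown: float   # how recently the post was taken down
    audience_size: int            # how large the audience of the post is
    decision_confidence: float    # 0.0-1.0 confidence in the original call
    cites_policy_exception: bool  # user flagged an exception (recs. 4 and 5)


def priority_score(appeal: Appeal) -> float:
    """Higher score means the appeal is reviewed sooner."""
    recency = 1.0 / (1.0 + appeal.hours_since_takedown / 24.0)
    reach = min(appeal.audience_size / 100_000, 1.0)
    uncertainty = 1.0 - appeal.decision_confidence
    exception_boost = 0.5 if appeal.cites_policy_exception else 0.0
    return 2.0 * recency + 1.5 * reach + 3.0 * uncertainty + exception_boost


# Usage: sort the queue so the most urgent appeals surface first.
queue = [
    Appeal("a1", hours_since_takedown=2, audience_size=50_000,
           decision_confidence=0.9, cites_policy_exception=False),
    Appeal("a2", hours_since_takedown=48, audience_size=500,
           decision_confidence=0.4, cites_policy_exception=True),
]
queue.sort(key=priority_score, reverse=True)
```

As the paragraph above notes, ranking an appeal higher only changes how quickly it is seen; any scoring of this kind would still need to be evaluated for its effect on review accuracy.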
Next steps: We plan to complete our assessment and update on our progress by the end of the year.