JUN 12, 2023
2022-012-IG-MR
Today, the Oversight Board selected a case referred by Meta concerning our decision to leave up a video depicting a woman in India being harassed by a group of men. The text accompanying the Instagram post states that a “tribal woman” was sexually assaulted and harassed by a group of men in public. The Instagram account on which the video was posted describes its goal as “sharing stories from a Dalit’s desk.”
Meta initially took the content down for violating our policy against Adult Sexual Exploitation. Upon further review, however, we decided that the post was newsworthy and reinstated it for users over 18 with a warning screen.
Meta referred this case to the board because we found it significant and difficult: it creates tension between our values of safety and voice.
Meta usually removes content “that depicts, threatens or promotes sexual violence, sexual assault or sexual exploitation.” This case, however, demonstrates the challenge of striking the appropriate balance between allowing content that condemns sexual exploitation and preventing the harm of leaving visual depictions of sexual harassment on our platforms.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision once it is issued.
We welcome the Oversight Board’s decision today to uphold Meta’s original decision in this case. Meta had previously reinstated this content after determining that the post was newsworthy, so no further action will be taken on it.
After reviewing the recommendations the board provided alongside its decision, we will update this page.
Meta should include an exception to the Adult Sexual Exploitation Community Standard for depictions of non-consensual sexual touching, where, based on a contextual analysis, Meta judges that the content is shared to raise awareness, the victim is not identifiable, the content does not involve nudity and is not shared in a sensationalized context, thus entailing minimal risks of harm for the victim. This exception should be applied at escalation only. The Board will consider this recommendation implemented when the text of the Adult Sexual Exploitation Community Standard has been changed.
Our commitment: We will update the Adult Sexual Exploitation policy in our Community Standards to allow depictions of non-consensual sexual touching, with a warning screen, where the content is shared to raise awareness, the victim is not identifiable, the content does not involve overt nudity or explicit sexual activity, and it is not shared in a sensationalized context. This will be context-specific and applied on escalation only.
Considerations: Our Adult Sexual Exploitation policy balances Meta’s values of safety and voice, and it takes into account the dignity of victims and survivors of adult sexual exploitation. Under this policy, we consider the criteria outlined in this recommendation when deciding whether to apply newsworthy allowances on escalation. In response to the Board’s recommendation, we will now publicly add a context-specific, escalation-only allowance to our Adult Sexual Exploitation policy. It will allow depictions of non-consensual sexual touching where the content is shared to raise awareness in a non-sensationalized context, the victims are not identifiable, and the content does not include overt nudity or explicit sexual activity. We will apply a warning screen to this content so that users can make an informed choice about whether to view it. We will share updates on our progress in future Quarterly Updates.
Meta should update its internal guidance to at-scale reviewers on when to escalate content reviewed under the Adult Sexual Exploitation Community Standard, including guidance to escalate content depicting non-consensual sexual touching, in line with the above policy exception. The Board will consider this recommendation implemented when Meta shares with the Board the updated guidance to at-scale reviewers.
Our commitment: We will update the internal operational guidelines our at-scale reviewers use to escalate content, reflecting the Adult Sexual Exploitation policy updates we develop in response to Recommendation #1.
Considerations: As with all updates to our Community Standards, it will take time to develop policy language and finalize our approach to Recommendation #1. We will work with our enforcement teams to understand how changes to our Adult Sexual Exploitation policy can be operationalized, and we will reflect those changes in the internal operational guidelines we provide to our at-scale reviewers. We will share updates on our progress in future Quarterly Updates.