In 2022, the Oversight Board selected a case referred by Meta concerning a video on Instagram depicting a woman in India being sexually assaulted by a group of men. The Instagram account described itself as a platform for Dalit perspectives in India. As part of its decision, the Board recommended that Meta include an escalation-only policy in the Adult Sexual Exploitation Community Standard to allow depictions of non-consensual sexual touching where the content meets certain criteria that entail minimal risks for victims. These criteria include: Meta judges that the content is shared to raise awareness, the victim is not identifiable, the content does not involve nudity or explicit sexual activity, and the content is not shared in a sensationalized context.
Meta referred the case to the Oversight Board due to the complexity of striking the appropriate balance between allowing users to freely express condemnation of sexual exploitation and addressing the harm in allowing graphic visual depictions of sexual harassment to remain on our platforms. By considering the context and intent behind user-generated content, we create a safer and more inclusive online environment that empowers users to express themselves freely.
What was the impact of implementing this recommendation?
In May 2023, following the Board's recommendation and an internal assessment, Meta updated our Adult Sexual Exploitation policy to allow content depicting non-consensual sexual touching, with a warning screen and age restriction, when the content meets the context-specific policy criteria.
To illustrate the Board's impact, we gathered data from our enforcement teams related to this new policy in combination with our existing policy on written descriptions of this form of harassment. Over a three-month period from December 1, 2024, through February 28, 2025, we identified over 15,000 pieces of content across Facebook and Instagram where users raised awareness regarding sexual harassment or abuse in line with the allowable criteria from our Adult Sexual Exploitation policy. This includes content that would previously have been taken down for violating policy but, given the awareness-raising context, remains on the platform behind a warning screen that gives users the option to view more details. Of this total, 76% of the content was from Facebook and 24% was from Instagram.
This data illustrates the impact of Meta's implementation of the Board's recommendation on our users across platforms. Through the responsible sharing of such sensitive content, people are able to raise awareness about critical issues like sexual harassment and abuse while user safety is maintained.