The Oversight Board overturned Meta’s decision to remove a post on Instagram that featured images of nudity related to breast cancer. After the Board selected this case on Breast Cancer Symptoms & Nudity, Meta restored the content. Meta’s automated systems originally removed the post for violating the company’s Community Standard on Adult Nudity and Sexual Activity. The Board found that the post was allowed under a policy exception for “breast cancer awareness,” and that Meta’s automated moderation in this case raised important human rights concerns.
The Board issued several recommendations, including that Meta improve automated detection of images with text overlay so that posts raising awareness of breast cancer symptoms are not wrongly flagged for review. In response, Meta committed to refining these systems by continuing to invest in improving our computer vision signals, sampling more training data for our machine learning, and leveraging manual review when we are less confident in the accuracy of our automation.
What was the impact of Meta’s implementation of this recommendation?
In response to the Board’s recommendation in January 2021, Meta committed to improving text-overlay detection to ensure that posts raising awareness of breast cancer symptoms are not removed through over-enforcement of our Adult Nudity and Sexual Activity (ANSA) Community Standard. Meta’s implementation team enhanced Instagram’s techniques for identifying breast cancer context in content via text and deployed them in July 2021. Those enhancements have been in place since then, and in one 30-day period alone (February 25 to March 27, 2023), they contributed to an additional 2,500 pieces of content being sent for human review that would previously have been removed.
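To make the routing described above concrete, the following is a minimal, purely illustrative sketch of how a confidence threshold and a text-overlay context signal could combine to send borderline posts to human review rather than removing them automatically. The function, keyword list, and thresholds are assumptions for illustration only and do not represent Meta’s actual systems.

```python
# Hypothetical sketch: when nudity classification is uncertain and text overlaid
# on the image suggests breast cancer awareness content, queue the post for
# human review instead of removing it automatically. All names, thresholds,
# and signals here are illustrative assumptions.
from dataclasses import dataclass

# Illustrative terms a text-overlay (OCR) signal might match.
AWARENESS_TERMS = {"breast cancer", "mammogram", "self-exam", "symptoms", "lump"}


@dataclass
class ModerationSignals:
    nudity_score: float   # confidence from a computer-vision nudity classifier (0-1)
    overlay_text: str     # text extracted from the image via OCR


def route_post(signals: ModerationSignals,
               remove_threshold: float = 0.95,
               review_threshold: float = 0.70) -> str:
    """Return one of 'allow', 'remove', or 'human_review'."""
    text = signals.overlay_text.lower()
    awareness_context = any(term in text for term in AWARENESS_TERMS)

    if signals.nudity_score < review_threshold:
        return "allow"
    # Posts with likely awareness context are escalated to reviewers rather than
    # auto-removed, reflecting the policy exception for breast cancer awareness.
    if awareness_context:
        return "human_review"
    # Only high-confidence cases with no exception signal are removed automatically.
    if signals.nudity_score >= remove_threshold:
        return "remove"
    return "human_review"


if __name__ == "__main__":
    post = ModerationSignals(nudity_score=0.91,
                             overlay_text="Know the symptoms: check for a lump")
    print(route_post(post))  # -> "human_review" rather than automatic removal
```

In this sketch, the awareness-context check runs before the automatic-removal branch, which mirrors the reported outcome: content that previously would have been removed is instead routed to human reviewers.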