Change log
Current version
What: News articles on Facebook that describe suicide in celebratory or promotional ways, as defined by guidelines from major health organizations (see https://reportingonsuicide.org). While we don't allow people to celebrate or promote self-harm or suicide, we do allow people to discuss suicide and self-injury. In some instances, we may restrict content to adults over the age of 18, place it behind a sensitivity screen, and provide resources so that people are aware the content may be upsetting.
Why: We want Facebook to be a space where people can share their experiences, raise awareness about these issues, and seek support from one another, but we also want to prevent people from celebrating or promoting self-harm or suicide. We work continuously with experts from around the world to strike a balance between these important goals, which are sometimes at odds. For example, when someone posts about self-harm, we want that person to be able to ask for help or share their path to recovery, but we must also consider the safety of the people who see the post, which may unintentionally trigger thoughts of self-harm or suicide in others. We don't want people to share content that promotes self-harm, but we also don't want to shame or trigger the person who posted it by removing their post.