In February 2023, Meta consulted the Board on whether to continue removing content that uses the Arabic term "shaheed" (شهيد) to reference a designated individual under the Dangerous Organizations and Individuals Community Standard. "Shaheed" is an honorific term with various meanings, including referring to those who die honorably or unexpectedly. In its policy advisory opinion, the Board found Meta's approach overly broad and restrictive of free expression. The Board recommended that Meta stop presuming that any use of "shaheed" in reference to a designated individual is violating, and instead remove such content only when it is accompanied by signals of violence or other policy violations.
What was the impact of Meta’s implementation of this recommendation?
After Meta changed its policy in response to this recommendation, the Board published an independent evaluation of Meta's approach to the Arabic term "shaheed" and how it had been affecting the free expression of millions of users. Using Meta Content Library data, the Board's data team identified a 10% increase in content containing the word "shaheed" on Facebook. This increase reflects the updated policy approach of allowing people to use the word "shaheed" when their content does not contain signals of violence and does not praise dangerous individuals or organizations, such as terrorists.
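The kind of before-and-after prevalence comparison described above can be sketched in a few lines. This is a hypothetical illustration only: the Meta Content Library's actual query interface is not shown here, so the sketch assumes posts have already been exported as `(date, text)` records, and the sample data and `keyword_prevalence` / `percent_change` helpers are invented for the example.

```python
from datetime import date

def keyword_prevalence(posts, keyword, start, end):
    """Share of posts dated in [start, end) whose text contains `keyword`
    (case-insensitive). `posts` is a list of (date, text) tuples."""
    window = [text for d, text in posts if start <= d < end]
    if not window:
        return 0.0
    return sum(keyword.lower() in text.lower() for text in window) / len(window)

def percent_change(before, after):
    """Relative change, in percent, from `before` to `after`."""
    return (after - before) / before * 100.0

# Illustrative synthetic records only -- not real Content Library data.
posts = [
    (date(2024, 1, 5),  "remembering a shaheed"),
    (date(2024, 1, 12), "unrelated news"),
    (date(2024, 1, 20), "sports update"),
    (date(2024, 1, 28), "weather report"),
    (date(2024, 3, 3),  "honoring the shaheed"),
    (date(2024, 3, 9),  "shaheed mentioned here"),
    (date(2024, 3, 15), "unrelated news"),
    (date(2024, 3, 22), "sports update"),
]

before = keyword_prevalence(posts, "shaheed", date(2024, 1, 1), date(2024, 2, 1))
after = keyword_prevalence(posts, "shaheed", date(2024, 3, 1), date(2024, 4, 1))
print(percent_change(before, after))  # 0.25 -> 0.50 is a 100.0% increase
```

Comparing the *share* of posts rather than raw counts guards against the overall posting volume shifting between the two windows.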