How we reduce problematic content in News Feed.
MAY 18, 2023
If content on Facebook doesn't violate the Facebook Community Standards, but might still be problematic or otherwise low-quality, Meta may reduce its distribution, consistent with user controls. This is one element of our broader "remove, reduce, inform" strategy that we've used since 2016.
Our goals for reducing problematic content are based on the needs of our community. This work is focused on various types of problematic content in Feed, including reducing problematic comments on public posts from Pages and people. Here’s how we approach this process, along with some categories of problematic content we reduce.
Through Feed Preferences settings, people can increase the degree to which we demote much of this content so they see less of it in their Feed, turn many of these demotions off entirely, or simply keep our current demotions in place.
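The page does not describe how these demotions are implemented. As a rough illustration only, the sketch below shows one way a per-category score multiplier could honor a person's preference setting; every name and number in it (DemotionLevel, BASE_DEMOTIONS, apply_demotions, the multipliers) is a hypothetical assumption, not Meta's actual ranking logic.

```python
# Hypothetical sketch: scaling a post's ranking score down per flagged category,
# while honoring a per-user Feed Preferences setting for that category.
from enum import Enum


class DemotionLevel(Enum):
    OFF = 0        # user turned this demotion off entirely
    DEFAULT = 1    # keep the current (default) demotion strength
    STRONGER = 2   # user asked to see even less of this content


# Assumed default score multipliers per problematic-content category.
BASE_DEMOTIONS = {
    "clickbait": 0.5,
    "engagement_bait": 0.5,
    "low_quality_link": 0.6,
}


def apply_demotions(score: float, categories: list[str],
                    prefs: dict[str, DemotionLevel]) -> float:
    """Scale a post's score down for each flagged category, per user preference."""
    for category in categories:
        base = BASE_DEMOTIONS.get(category)
        if base is None:
            continue
        level = prefs.get(category, DemotionLevel.DEFAULT)
        if level is DemotionLevel.OFF:
            continue                       # this demotion is disabled by the user
        multiplier = base if level is DemotionLevel.DEFAULT else base ** 2
        score *= multiplier                # stronger preference -> smaller multiplier
    return score


# Example: a clickbait post for a user who asked for a stronger demotion.
print(apply_demotions(10.0, ["clickbait"],
                      {"clickbait": DemotionLevel.STRONGER}))  # 2.5
```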
We often ask people about their experiences on Facebook, and we use surveys to better inform what we build. We consider people’s responses, as well as signals such as what they like, dislike, comment on or share. Based on this feedback, here are some types of problematic content we address (one detection heuristic is sketched after this list):
Low-quality content, such as clickbait and engagement bait.
Links to websites that are covered with ads, slow to load or broken.
Low-quality comments that are repeatedly copied and pasted.
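As an illustration of the last item, here is a minimal sketch of how repeatedly copied-and-pasted comments might be flagged. The normalization step and the repeat threshold are assumptions made for the example; they are not the detection logic Meta actually uses.

```python
# Hypothetical sketch: flag comment texts that are repeatedly copied and pasted.
import re
from collections import Counter


def normalize(comment: str) -> str:
    """Lowercase and collapse whitespace so trivial edits still match."""
    return re.sub(r"\s+", " ", comment.strip().lower())


def copied_comments(comments: list[str], min_repeats: int = 3) -> set[str]:
    """Return normalized comment texts that appear at least `min_repeats` times."""
    counts = Counter(normalize(c) for c in comments)
    return {text for text, n in counts.items() if n >= min_repeats}


comments = ["Check my page!!", "check my page!!", "Great post", "Check  my page!!"]
print(copied_comments(comments))  # {'check my page!!'}
```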
We want people to have interesting, new material to engage with, so we work to set incentives that encourage the creation of high-quality content. For example, we disincentivize:
Content with limited originality that is principally repurposed from other sources.
Low-quality videos that abuse video or live video formats.
There is some content that individual people may want to see, but that others may find problematic, so we make it harder to encounter. This includes:
Content from creators who repeatedly violate our policies.
To help people discover new content and communities, we make personalized recommendations for people on Facebook and Instagram. Our Recommendations Guidelines serve as our baseline standards for the types of content we recommend.
Facebook may recommend content, accounts and entities—such as Pages, Groups or Events—that people don’t already follow. This includes:
Pages You May Like.
Groups You Should Join.
“Suggested For You” posts.
People You May Know.
Instagram may recommend accounts or content that people don’t already follow. This includes:
“Suggested For You” posts.
“Suggested For You” accounts.
Content on Reels and Explore.
Our Recommendations Guidelines are another important tool for managing problematic content on Facebook and Instagram. Because recommended content doesn’t come from accounts or entities people choose to follow, we work to avoid making recommendations that may be low-quality, objectionable, sensitive or inappropriate for younger viewers.
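To illustrate how baseline standards can gate recommendations, the sketch below filters hypothetical candidates against a set of guideline flags before ranking them by predicted relevance. The Candidate fields, flag names and scores are assumptions for this example; the Recommendations Guidelines themselves are published as policy, not code.

```python
# Hypothetical sketch: drop recommendation candidates that fail baseline
# guideline checks, then rank the rest by predicted relevance.
from dataclasses import dataclass, field


@dataclass
class Candidate:
    entity_id: str
    score: float                                   # predicted relevance for this person
    flags: set[str] = field(default_factory=set)   # e.g. {"sensitive", "low_quality"}


# Categories we avoid recommending, per the guidelines described above.
BLOCKED_FLAGS = {"low_quality", "objectionable", "sensitive", "not_age_appropriate"}


def eligible_recommendations(candidates: list[Candidate]) -> list[Candidate]:
    """Keep only candidates with no blocked flags, ranked by score (descending)."""
    kept = [c for c in candidates if not (c.flags & BLOCKED_FLAGS)]
    return sorted(kept, key=lambda c: c.score, reverse=True)


candidates = [
    Candidate("group:hiking", 0.9),
    Candidate("page:clickfarm", 0.95, {"low_quality"}),
]
print([c.entity_id for c in eligible_recommendations(candidates)])  # ['group:hiking']
```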