Discriminatory Practices

Policy details

Ads must not discriminate or encourage discrimination against people based on personal attributes such as race, ethnicity, color, national origin, religion, age, sex, sexual orientation, gender identity, family status, disability, medical or genetic condition.

Meta prohibits advertisers from using our ads products to discriminate against people. This means that advertisers may not (1) use our audience selection tools to (a) wrongfully target specific groups of people for advertising (see advertising policy on Targeting), or (b) wrongfully exclude specific groups of people from seeing their ads; or (2) include discriminatory content in their ads. Advertisers are also required to comply with applicable laws that prohibit discrimination (see advertising policy on Illegal Products or Services). These include laws that prohibit discriminating against groups of people in connection with, for example, offers of housing, employment, and credit.

Any United States advertiser, or any advertiser targeting the United States, Canada or certain parts of Europe, that is running credit, housing or employment ads must self-identify a Special Ad Category, as it becomes available, and run such ads with approved targeting options.

Additional information and resources on United States non-discrimination laws:

U.S. Department of Housing and Urban Development
U.S. Equal Employment Opportunity Commission
Consumer Financial Protection Bureau
American Civil Liberties Union
The Leadership Conference on Civil and Human Rights
Department of Justice – Civil Rights Division
National Fair Housing Alliance

Disclaimer: This guide is not a substitute for legal advice. Consult a legal professional for specific advice about your situation.



Reporting

1. Universal entry point

An option to report is available across our platform, whether it's on a post, comment, story, message, profile or something else.

2. Get started

We help people report things that they don't think should be on our platform.

3. Select a problem

We ask people to tell us more about what's wrong. This helps us send the report to the right place.

4. Check your report

Make sure the details are correct before you click Submit. It's important that the problem selected truly reflects what was posted.

5. Report submitted

After these steps, we submit the report. We also lay out what people should expect next.

6. More options

We remove things if they go against our Community Standards, but you can also Unfollow, Block or Unfriend to avoid seeing posts in the future.

Post-report communication

1. Update via notifications

After we've reviewed the report, we'll send the reporting user a notification.

2. More detail in the Support Inbox

We'll share more details about our review decision in the Support Inbox. We'll notify people that this information is there and send them a link to it.

3. Appeal option

If people think we got the decision wrong, they can request another review.

4. Post-appeal communication

We'll send a final response after we've re-reviewed the content, again to the Support Inbox.

Takedown experience

1. Immediate notification

When someone posts something that doesn't follow our rules, we'll tell them.

2. Additional context

We'll also address common misperceptions and explain why we made the decision to enforce.

3. Policy explanation

We'll give people easy-to-understand explanations about the relevant rule.

4. Option for review

If people disagree with the decision, they can ask for another review and provide more information.

5. Final decision

We set expectations about what will happen after the review has been submitted.

Warning screens

1. Warning screens in context

We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information

In this example, we explain why we've covered the photo, with additional context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.