Policy details

Change log

Current version: Jul 19, 2024

Misinformation

Ads must comply with the Community Standard on Misinformation.

Meta prohibits ads that include content debunked by third-party fact checkers. Advertisers that repeatedly post information deemed to be false may have restrictions placed on their ability to advertise across Meta technologies. Find out more about our fact-checking program.

Meta also prohibits ads that include misinformation that violates our Community Standards.

Overview

Advertisers can’t run ads that include content debunked by third-party fact checkers or that violates our Community Standards. At Meta, we’re committed to fighting the spread of misinformation across our technologies. While it’s impossible to eliminate misinformation from the internet entirely, we’re using research, teams and technologies to tackle it in the most effective and comprehensive way possible.

Guidelines

In addition to the requirements in our Community Standard on Misinformation, ads cannot contain content debunked by third-party fact checkers.

Reporting

1. Universal entry point
We have an option to report, whether it’s on a post, comment, story, message, profile or something else.

2. Get started
We help people report things that they don’t think should be on our platform.

3. Select a problem
We ask people to tell us more about what’s wrong. This helps us send the report to the right place.

4. Check your report
Make sure the details are correct before you click Submit. It’s important that the problem selected truly reflects what was posted.

5. Report submitted
After these steps, we submit the report. We also lay out what people should expect next.

6. More options
We remove things if they go against our Community Standards, but you can also Unfollow, Block or Unfriend to avoid seeing posts in the future.

Post-report communication

1. Update via notifications
After we’ve reviewed the report, we’ll send the reporting user a notification.

2. More detail in the Support Inbox
We’ll share more details about our review decision in the Support Inbox. We’ll notify people that this information is there and send them a link to it.

3. Appeal option
If people think we got the decision wrong, they can request another review.

4. Post-appeal communication
After we’ve re-reviewed the content, we’ll send a final response, again to the Support Inbox.
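
Taken together, the reporting and post-report steps describe a simple lifecycle: a report is submitted, reviewed and decided, optionally appealed, and then closed with a final response. As a rough illustration only (the class, state names and methods below are hypothetical, not Meta’s actual systems or APIs), that flow can be sketched as a small state machine:

```python
# Hypothetical sketch of the report lifecycle described above; illustrative
# only, not Meta's actual systems or APIs.
from enum import Enum, auto


class ReportState(Enum):
    SUBMITTED = auto()  # report filed via the universal entry point
    DECIDED = auto()    # review finished; reporter notified
    APPEALED = auto()   # reporter requested another review
    FINAL = auto()      # post-appeal response sent to the Support Inbox


class Report:
    """Tracks a single report through review, notification and appeal."""

    def __init__(self, problem: str):
        self.problem = problem
        self.state = ReportState.SUBMITTED

    def decide(self, decision: str) -> None:
        # Post-report steps 1-2: notify the reporter and share the
        # detailed explanation in the Support Inbox.
        assert self.state is ReportState.SUBMITTED
        self.decision = decision
        self.state = ReportState.DECIDED

    def appeal(self) -> None:
        # Post-report step 3: the reporter can request another review.
        assert self.state is ReportState.DECIDED
        self.state = ReportState.APPEALED

    def finalize(self, decision: str) -> None:
        # Post-report step 4: a final response after re-review closes it.
        assert self.state is ReportState.APPEALED
        self.decision = decision
        self.state = ReportState.FINAL


report = Report(problem="misinformation")
report.decide("no violation found")
report.appeal()
report.finalize("content removed")
```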

Takedown experience

1. Immediate notification
When someone posts something that doesn’t follow our rules, we’ll tell them.

2. Additional context
We’ll also address common misperceptions and explain why we made the decision to enforce.

3. Policy explanation
We’ll give people easy-to-understand explanations about the relevant rule.

4. Option for review
If people disagree with the decision, they can ask for another review and provide more information.

5. Final decision
We set expectations about what will happen after the review has been submitted.

Warning screens

1. Warning screens in context
We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information
In this example, we explain why we’ve covered the photo, with added context from independent fact-checkers.

Enforcement

We have the same policies around the world, for everyone on Facebook.

Review teams

Our global team of over 15,000 reviewers works every day to keep people on Facebook safe.

Stakeholder engagement

Outside experts, academics, NGOs and policymakers help inform the Facebook Community Standards.