Our approach to misinformation

UPDATED 7 APR 2025
To address misinformation, we use a combination of enforcement technology and human review, including the community of people on our platforms, local trusted partners and independent fact-checkers. In the United States, we have a community-based programme called Community Notes, which we will continue to improve over the course of the year before expanding it to other countries. In certain countries outside the United States, we work with independent third-party fact-checkers.
Our strategy for misinformation: remove, reduce, inform
Remove
We value both free expression and keeping people safe, so we remove misinformation from Meta technologies in limited cases:
  • When misinformation is likely to directly contribute to the risk of imminent physical harm. For example, we remove certain false claims about health during public emergencies, as well as false claims about vaccines that leading health organisations have debunked. We do this to help keep people safe.
  • When misinformation has the potential to interfere with or suppress voting. We do this because it undermines expression and voice.
Reduce
Outside the United States, when one of our fact-checking partners rates something as false, we may reduce the content's distribution in Feed and other places so that fewer people see it. When Facebook Pages, groups, profiles, websites and Instagram accounts repeatedly share such content, we may reduce the distribution of all of their posts in Feed and remove them from the recommendations that we show people.
Inform
In the United States, when content is potentially misleading or confusing, the community can add more context if they agree such information is helpful. Outside the United States, we apply notices to fact-checked posts and send notifications to the people who posted the content. That way, people can see what the fact-checkers concluded and decide for themselves what to read, trust or share. (We no longer partner with US-based fact-checking organisations or show fact-checking labels to people in the United States.) We also partner with organisations around the world to promote news literacy.
Community Notes
In the United States, Community Notes contributors can write and submit notes on posts that they think are potentially misleading or confusing. A community note may include background information, a tip or an insight that people might find useful. For a note to be published on a post, contributors who normally disagree, based on how they've rated notes in the past, must agree that the note is helpful. Notes will not be added to content when there is no such agreement or when people agree that a note is not helpful.
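The agreement rule above can be illustrated with a deliberately simplified sketch. This is a hypothetical illustration only, not the actual Community Notes ranking system: the group labels, thresholds and function names below are invented for the example, and the real system uses far more sophisticated modelling of contributors' past ratings.

```python
# Hypothetical, highly simplified sketch of the cross-perspective
# agreement idea: a note is published only when contributors from
# groups that have historically rated notes differently both find
# it helpful. All names and thresholds here are illustrative.

def note_is_published(ratings):
    """ratings: list of (contributor_group, is_helpful) pairs, where
    contributor_group labels contributors by how they have tended to
    rate notes in the past (e.g. 'A' vs 'B')."""
    groups = {"A", "B"}
    helpful_by_group = {}
    for g in groups:
        votes = [helpful for grp, helpful in ratings if grp == g]
        if not votes:
            # Without ratings from both perspectives, no agreement
            # can be established, so the note is not published.
            return False
        helpful_by_group[g] = sum(votes) / len(votes) > 0.5
    # Publish only when both groups rate the note as helpful.
    return all(helpful_by_group.values())

# Both groups agree the note is helpful -> published
print(note_is_published([("A", True), ("A", True), ("B", True)]))    # True
# Only one group finds it helpful -> not published
print(note_is_published([("A", True), ("B", False), ("B", False)]))  # False
```

The key design point the sketch captures is that a simple majority is not enough: helpfulness must bridge contributors with differing rating histories before a note appears on a post.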
Our fact-checking programme
Outside the United States, Meta partners with third-party fact-checking organisations to address misinformation. This fact-checking programme focuses on identifying and addressing viral misinformation, particularly hoaxes with no clear basis in fact.
Our fact-checking partners prioritise provably false claims that are timely, trending and consequential in the countries and languages that they cover. They don't prioritise claims that are inconsequential or contain only minor inaccuracies.
The programme is also not intended to interfere with individual expression, opinions and debate, clearly satirical or humorous content or business disputes.
Fact-checkers
We work with more than 90 certified third-party fact-checking organisations covering more than 60 languages around the world. These fact-checkers are certified through the non-partisan International Fact-Checking Network (IFCN) or, in Europe, the European Fact-Checking Standards Network (EFCSN).