Our strategy for misinformation: remove, reduce, inform
We value both free expression and keeping people safe, so we remove misinformation from Meta technologies only in limited cases:
- When misinformation is likely to directly contribute to the risk of imminent physical harm. For example, we remove certain false health claims during public emergencies, as well as claims about vaccines that leading health organizations have debunked. We do this to promote safety.
- When misinformation has the potential to interfere with or suppress voting. We do this because it undermines expression and voice.
Reducing the distribution of problematic content
Outside the United States, when one of our fact-checking partners rates something as false, we may reduce the content’s distribution in Feed and other places so fewer people see it. When Facebook Pages, Groups, Profiles, websites, and Instagram accounts repeatedly share such content, we may reduce the distribution of all of their posts in Feed and remove them from the recommendations we show people.
Providing context on sensitive or misleading content
In the United States, when content is potentially misleading or confusing, the community can add more context if they agree such information is helpful. Outside the United States, we apply notices to fact-checked posts and send notifications to the people who posted them. That way, people can see what the fact-checkers concluded and decide for themselves what to read, trust or share. (We no longer partner with US-based fact-checking organizations or show fact-checking labels to people in the United States.) We also partner with organizations around the world to promote news literacy.
Community Notes
In the United States, Community Notes contributors can write and submit notes on posts they think are potentially misleading or confusing. A community note may include background information, a tip or an insight people might find useful. For a note to be published on a post, contributors who normally disagree, based on how they’ve rated notes in the past, have to agree that the note is helpful. Notes will not be added to content when there is no agreement or when people agree a note is not helpful.
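To make that agreement requirement concrete, here is a minimal sketch of how a cross-perspective check could work, assuming contributors have already been grouped by their past rating behavior. This is an illustration only, not Meta's published algorithm: the 0.6 threshold, the helpful/not-helpful rating scale and the cluster labels are all invented for the example.

```python
from collections import defaultdict

def should_publish(ratings, contributor_cluster, helpful_threshold=0.6):
    """Return True if contributors who normally disagree agree the note is helpful.

    ratings: dict mapping contributor_id -> True (helpful) or False (not helpful)
    contributor_cluster: dict mapping contributor_id -> a cluster label, where
        different clusters stand in for contributors who have historically
        disagreed based on how they've rated notes in the past (assumption).
    """
    votes_by_cluster = defaultdict(list)
    for contributor, is_helpful in ratings.items():
        votes_by_cluster[contributor_cluster[contributor]].append(is_helpful)

    # The note needs ratings from at least two clusters that normally disagree...
    if len(votes_by_cluster) < 2:
        return False

    # ...and a majority of raters in every cluster must find it helpful.
    return all(
        sum(votes) / len(votes) >= helpful_threshold
        for votes in votes_by_cluster.values()
    )

# Example: raters from two clusters that usually disagree both rate the note
# helpful, so it would be published under these assumed rules.
ratings = {"a1": True, "a2": True, "b1": True, "b2": False, "b3": True}
clusters = {"a1": "A", "a2": "A", "b1": "B", "b2": "B", "b3": "B"}
print(should_publish(ratings, clusters))  # True
```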
Our fact-checking program
Outside of the United States, Meta partners with third-party fact-checking organizations to address misinformation. This fact-checking program focuses on identifying and addressing viral misinformation, particularly hoaxes with no clear basis in fact.
Our fact-checking partners prioritize provably false claims that are timely, trending and consequential in the countries and languages they cover. They don’t prioritize claims that are inconsequential or contain only minor inaccuracies. The program is also not intended to interfere with individual expression, opinions and debate; clearly satirical or humorous content; or business disputes.
Content fact-checkers prioritize