Providing context on sensitive or misleading content

UPDATED MAR 13, 2025

One way Meta promotes a safe, authentic community is by informing people that content might be sensitive or misleading, even if it doesn’t explicitly violate the Community Standards. In these cases, we include additional context about the content to help people decide what to read, trust or share.

How we provide context on content

By providing people with specific and relevant context when they come across a flagged post, we can help them be more informed about what they see and read. Here are some ways we provide context on relevant pieces of content that may be sensitive, misleading, or confusing:

Warning screens on sensitive content on Facebook, Instagram and Threads

Our goal is to protect people from viewing potentially sensitive content.

Facebook

People value the ability to discuss important and often difficult issues online, but they also have different sensitivities to certain kinds of content. Therefore, we include a warning screen over potentially sensitive content on Facebook, such as:

  • Violent or graphic imagery.

  • Posts that contain descriptions of bullying or harassment, if shared to raise awareness.

  • Some forms of nudity.

  • Posts related to suicide or suicide attempts.

[Example images: warning screens]

1. Warning screens in context: We cover certain content in News Feed and other surfaces, so people can choose whether to see it.

2. More information: In this example, we explain why we’ve covered the photo, using added context from independent fact-checkers.

Instagram

To help people avoid coming across content they’d rather not see, we limit the visibility of certain posts that are flagged by people on Instagram for containing sensitive or graphic material. Photos and videos containing such content will appear with a warning screen to inform people about the content before they view it. This warning screen appears when viewing a post in feed or on someone's profile.

Verified badges on Facebook, Instagram, Messenger and Threads

Our goal is to help people feel confident about the content and accounts they interact with.

To combat impersonation and help people avoid scammers who pretend to be high-profile people, Meta provides verified badges on Pages and profiles to indicate a verified account. This means we’ve confirmed the authentic presence of the public figure, celebrity or global brand that the account represents.

Notification screens on outdated articles on the Facebook app

Our goal is to make it easier for people to identify content that’s timely, reliable and most valuable to them.

To give people more context about a news article before they share it on Facebook, Meta includes a notification screen if the article is more than 90 days old. People can then choose to continue sharing the article if they wish. This notification helps people understand how old a given news article is and where it came from.

To ensure we don’t slow the spread of credible information, especially in the health space, content posted by government health authorities and recognized global health organizations does not have this notification screen.
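The rule described above can be sketched as a simple check. This is an illustrative sketch only, not Meta’s implementation: the 90-day threshold comes from the text, but the function name, the `publisher_type` parameter and the exemption labels are hypothetical stand-ins for an internally maintained exemption list.

```python
from datetime import datetime, timedelta

# Hypothetical labels for the exempt sources named in the text
# (government health authorities, recognized global health organizations);
# the real exemption list is maintained internally by Meta.
EXEMPT_PUBLISHER_TYPES = {"government_health_authority", "global_health_org"}

OUTDATED_THRESHOLD = timedelta(days=90)  # articles older than this get the screen


def should_show_notification(published_at: datetime,
                             publisher_type: str,
                             now: datetime) -> bool:
    """Return True if a share attempt should trigger the
    outdated-article notification screen."""
    if publisher_type in EXEMPT_PUBLISHER_TYPES:
        return False  # credible health sources never get the screen
    return (now - published_at) > OUTDATED_THRESHOLD
```

For example, a 120-day-old article from an ordinary news outlet would trigger the screen, while the same article from a recognized global health organization would not.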

Community notes on potentially misleading or confusing content

Our goal is to draw on a broad range of voices that exist on our platform to decide what content is potentially misleading or confusing and could benefit from additional information.

In the United States, Community Notes contributors can write and submit a note on posts that they think are potentially misleading or confusing. A note may include background information, a tip or an insight people might find useful. For a note to be published on a post, contributors who normally disagree, based on how they’ve rated notes in the past, must agree that the note is helpful. Notes will not be added to content when there is no such agreement, or when contributors agree that a note is not helpful.
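The cross-viewpoint agreement idea can be illustrated with a toy sketch. Everything here is an assumption for illustration: we pretend each rater is already assigned to one of two viewpoint clusters ("A"/"B") based on past ratings, and we use arbitrary 70%/30% thresholds. Meta’s actual scoring model is more sophisticated and is not public in this form.

```python
def note_status(ratings):
    """ratings: list of (cluster, helpful) pairs, e.g. ("A", True).

    A note is published only when raters from *both* viewpoint clusters
    largely find it helpful, rejected when both clusters largely find it
    unhelpful, and left unpublished otherwise (no cross-viewpoint agreement).
    """
    def helpful_share(cluster):
        votes = [helpful for c, helpful in ratings if c == cluster]
        return sum(votes) / len(votes) if votes else None

    a, b = helpful_share("A"), helpful_share("B")
    if a is None or b is None:
        return "needs_more_ratings"   # one viewpoint hasn't weighed in yet
    if a >= 0.7 and b >= 0.7:
        return "published"            # agreement across viewpoints: helpful
    if a <= 0.3 and b <= 0.3:
        return "rejected"             # agreement that the note isn't helpful
    return "needs_more_ratings"       # the two viewpoints disagree
```

Under this toy model, a note rated helpful only by one cluster stays unpublished, which is the "no agreement" outcome described above.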

Nearly anyone can sign up today for the opportunity to become a Community Notes contributor. Though this program is only available in the United States right now, our intention is ultimately to roll out this approach to users all over the world.
