Helping Teens See Age-Appropriate Content

UPDATED

NOV 18, 2024

We want teens to have safe, positive experiences on Facebook and Instagram, which includes helping them explore their interests while making sure they’re seeing content that’s appropriate for their age. We prevent teens from seeing content that’s sensitive or mature in three main ways. We remove content completely when it breaks our rules, hide sensitive or mature content from teens, and avoid recommending an even broader set of content.


We’re continuously evolving our approach to help make sure we’re providing teens with safe, age-appropriate experiences, and to incorporate the best possible research and expert advice to bring parents peace of mind.

Multiple Layers of Protection

❌ Content We Remove for Everyone

Our Community Standards outline the kinds of content we don’t allow on Facebook and Instagram. We remove this content completely for everyone - including teens - whenever we become aware of it. When we remove content that breaks these rules, we may also apply a strike to the account that shared it, and we disable accounts that repeatedly or severely violate our policies. These policies are designed to protect everyone in our community, including teens.

⚠️ Additional Content We Hide for Teens

Some content may be appropriate for adults but too mature for teens under 18. We’ve worked with experts and conducted research across countries to understand what types of content are inappropriate for teens to see, and we hide this content from them. This means, while adults still have access to this content, teens under 18 won’t be able to see or interact with it, even if it’s shared by an account they follow.

🛡️ Content We Avoid Recommending

In places where we suggest content, like Instagram’s Reels or Explore, we avoid recommending content that may be sensitive or objectionable, such as sexually suggestive content. We do this because we think there should be stricter standards when showing people content from accounts they haven’t chosen to follow. Our recommendation policies apply to everyone and, in some cases, are more restrictive for teens.

Here’s how we apply these protections

As an example, let’s look at how these different layers of protection apply to our policies on nudity and sexual activity.

❌ We remove images and videos containing nudity or explicit sexual activity, including when generated by AI. We make exceptions in some cases for medical, educational, artistic, and public interest content.

⚠️ For teens, we go further, hiding images and videos that don’t contain explicit nudity or sexual activity but could be considered sexually suggestive because of a sexual pose or see-through clothing. Teens can’t see this content even when posted by someone they know.

🛡️ We also avoid recommending that content to adults, and there are certain types of content - such as some implicitly sexual content - that we also don’t recommend to teens.

These layers combine to help make sure teens are seeing content that’s appropriate for them.

Read about each layer of protection in more detail

Here’s more information about each layer of protection, with some examples of how they apply to different policies (though this isn’t a comprehensive list).

❌ Content We Remove for Everyone

The types of content we completely remove include, but are not limited to:

Abuse and Exploitation

We don’t allow content that abuses or exploits people, like posts encouraging non-consensual sexual activity, or sharing or asking for child abuse material. For example, we would remove:

❌ A post offering prostitution or asking someone to send them pornography.

❌ A photo of someone in an intimate or sexual act, shared without their consent.

❌ A post offering or asking for child sexual abuse material or imagery of nude children.

Suicide, Self-Injury and Eating Disorders

We don’t allow content that encourages, glorifies or promotes suicide, self-injury or eating disorders, while still giving people space to talk about their own experiences and seek support. For example, we would remove:

❌ A post that speaks positively about suicide, self-injury, or eating disorders.

❌ A Story that shows graphic self-injury.

❌ A comment mocking someone for having an eating disorder.

Hate Speech, Bullying and Harassment

We don’t allow content that could create an environment of intimidation or exclusion, including bullying and harassment. For example, we would remove:

❌ A post using dehumanizing speech against people based on their race, religion, ethnicity, gender identity, or sexual orientation.

❌ A comment mocking the victims of sexual assault.

❌ A post that threatens to release personally identifiable information (like a passport number) or incites harassment towards someone.

Restricted Goods and Services

We don’t allow buying, selling, or trading certain restricted goods and services on our platforms. We also don’t allow people to promote certain types of substances or provide instructions on how to use them. For example, we would remove:

❌ A Story offering to buy, sell, or trade various types of drugs.

❌ A post looking to buy, sell, or trade tobacco, nicotine, or alcohol.

❌ A comment offering 3D-printed gun parts.

We make some exceptions for legitimate businesses that legally offer certain kinds of restricted goods.

Threatening and Graphic Content

We don’t allow extreme graphic content, or content that may pose a threat to personal safety, such as threats of violence against people. For example, we would remove:

❌ A post threatening to kill or kidnap another person, or one encouraging others to commit violence.

❌ A graphic video showing a person being maimed or severely burned.

We make certain exceptions for graphic content in medical contexts or when shared to raise awareness. Where we do allow graphic content, we typically cover it with a warning screen to let people know the content may be sensitive before they click on it.

For full details on the content we remove, see our Community Standards.

⚠️ Additional Content We Hide for Teens

We’ve worked with youth experts around the world to understand what types of content may be appropriate for adults, but too mature for teens under 18 - and we hide this content from teens. This means, while adults will still have access to this content, teens won’t be able to see or interact with it in their Feed or Stories, even if it’s shared by an account they follow.

The types of content we allow for adults but hide from teens include, but are not limited to:

Suicide, Self-injury and Eating Disorders

We hide certain suicide and self-harm related content to protect teens from potentially distressing or sensitive material. For example, we would age-restrict:

⚠️ A post where someone is describing their own personal experiences with suicide, self-injury or eating disorders, except in the context of recovery.

⚠️ A photo or video showing people in a hospital engaging in euthanasia or assisted suicide.

Restricted Goods and Services

We hide content from teens that could influence them to engage in activities that are potentially harmful. For example, we would age-restrict:

⚠️ A post offering to sell tobacco, nicotine products, alcohol, or firearms when shared by a legitimate business.

⚠️ A Story encouraging people to take psychedelic drugs or cannabis products.

Threatening and Graphic Content

We hide most graphic and disturbing imagery from teens, even if we’d allow it behind a warning screen for adults. For example, we would age-restrict:

⚠️ A photo of a severely burned person, which we’d cover with a warning screen for adults.

⚠️ A photo or video of shootings, explosions, or deadly car crashes.

For full details on the content we age-restrict, see our Community Standards.

🛡️ Content We Avoid Recommending

We make recommendations in places like Instagram’s Reels or Explore to help people discover new content they may be interested in. We have guidelines about the kind of content that can be recommended, and avoid making recommendations that could be low-quality, objectionable, or particularly sensitive—even when the content isn’t severe enough to remove. This is because we think there should be stricter standards when showing people content from accounts they haven’t chosen to follow.

A lot of the content we avoid recommending to adults is already hidden completely from teens, but we go further for teens and avoid recommending additional types of content - like photos or videos that may be seen as implicitly sexual.

We also recognize that people have different levels of sensitivity, and they may want control over the kinds of content they’re recommended. Our recommendation controls – known as “Sensitive Content Control” on Instagram and “Manage Defaults” on Facebook – allow people to choose how much sensitive content they see in their content recommendations. With Instagram Teen Accounts, teens under 18 are defaulted into the strictest setting of our Sensitive Content Control so that they’re even less likely to be recommended sensitive content – and teens under 16 can’t change this setting without a parent’s permission. Facebook’s ‘Manage Defaults’ works similarly: teens are defaulted into the strictest setting and can adjust which types of content are recommended to them.

How to Report Violating Content

We work hard to identify content that breaks our rules, and we find most of the content we remove proactively using our technology before it’s reported to us. If you see something we missed, please help make our platforms a safer place by reporting it on either Facebook or Instagram. All reports are anonymous.

Additional Resources

Our Community Standards contain full details on the kinds of content we remove and age-restrict for each policy area.

We’ve also developed technology that proactively identifies potentially suspicious adults, such as an adult who has been repeatedly blocked or reported by teens, or who repeatedly searches for violating content. We won’t recommend suspicious adults to teens, and we won’t recommend teens to suspicious adults. You can read more about this here.

You can read more about the ways we help keep our community safe on our Safety Center, and the ways we support teens and families on our Family Center.