JUN 12, 2023
2021-016-FB-FBR
Today, the Oversight Board selected a case referred by Meta regarding a Facebook post by a Swedish journalist describing sexual violence against minors.
Upon initial review, after the post was reported in 2019, we found the content to be non-violating and left it up. However, when the content was flagged again in 2021, further review determined that the post did in fact violate Facebook’s Child Sexual Exploitation policy, and it was removed.
Facebook does not allow content that violates its policy on Child Sexual Exploitation, Abuse and Nudity. Under this policy, Meta removes content that, among other things, “shows children in a sexualized context.” Meta explained in its referral to the board that the post violated this policy because it “describes how the attacker viewed the minor in sexually explicit terms.”
Meta referred this case to the board because we found it significant and difficult, as it creates tension between our values of voice and safety.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision once it is issued.
We welcome the Oversight Board’s decision today on this case. Meta has acted immediately to comply with the board’s decision, and this content has been reinstated.
In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly. For more information, please see our Newsroom post about how we implement the board’s decisions.
Once we have reviewed the recommendations the board provided alongside its decision, we will update this post.
Meta should define graphic depiction and sexualization in the Child Sexual Exploitation, Abuse and Nudity Community Standard. Meta should make clear that not all explicit language constitutes graphic depiction or sexualization, and explain the difference between legal, clinical or medical terms and graphic content. Meta should also clarify how it distinguishes child sexual exploitation from reporting on child sexual exploitation. The Board will consider the recommendation implemented when language defining the key terms and this distinction has been added to the Community Standards.
Our commitment: We will develop and publish definitions of “graphic depiction” and “sexualization” within our Child Sexual Exploitation, Abuse and Nudity Community Standard. The definitions will capture the distinction between violating content and non-violating content that may include descriptions in legal, clinical or medical contexts. We will also develop and publish clarifying guidelines for “depiction” and “reporting” of child sexual exploitation.
Considerations: As described in our Community Standards, we currently apply warning labels when a news agency posts imagery that may depict child nudity in the context of famine, genocide, war crimes or crimes against humanity. We will further clarify that while reporting on child sexual exploitation is permitted, graphic written descriptions are not. Based on the board’s recommendation, we will develop and publish clarifying guidelines for “depiction” and “reporting” of child sexual exploitation.
Next steps: We will begin developing definitions in response to this recommendation and aim to publish them before the end of 2022. We will provide updates on this process in future Quarterly Updates.
Meta should undergo a policy development process, including a discussion in the Policy Forum, to determine whether and how to incorporate a prohibition on the functional identification of child victims of sexual violence into its Community Standards. This process should include stakeholder and expert engagement on functional identification and the rights of the child. The Board will consider this recommendation implemented when Meta publishes the minutes of the Product Policy Forum where this is discussed.
Our commitment: We will conduct a policy development process on the functional identification of child victims of sexual violence.
Considerations: At the board’s recommendation, we will initiate a policy development process with the goal of ultimately bringing options for implementing this change to the Policy Forum. Consistent with past meetings of the Policy Forum, we will draw on input from a range of external and internal perspectives and areas of expertise to inform options for a potential policy change on the functional identification of child victims of sexual violence. The policy development process may include conducting new research, consulting internal and external experts in areas such as freedom of expression and safety, and working with teams throughout Meta to understand the feasibility of implementing options. Following a discussion at the Policy Forum, we will post a summary of the proceedings in our Transparency Center.
Next steps: We are currently initiating the policy development process, with the goal of bringing the topic to a Policy Forum. We will provide further updates on this process in a future Quarterly Update.