JUN 12, 2023
2021-013-IG-UA
Today, the Oversight Board selected a case appealed by an Instagram user believed to be based in Brazil regarding a picture of a dark brown liquid in a jar and two bottles, described as Ayahuasca. Ayahuasca is a plant-based brew with psychoactive properties that has spiritual and ceremonial uses in some South American countries. In the text accompanying the picture, the user discusses the benefits of Ayahuasca.
Facebook took down this content for violating our policy on regulated goods, as laid out in the Instagram Community Guidelines and Facebook Community Standards. We do not allow content related to “non-medical drugs” that “admits to personal use without acknowledgment of or reference to recovery, treatment, or other assistance to combat usage” or “coordinates or promotes (by which we mean speaks positively about, encourages the use of, or provides instructions to use or make) non-medical drugs.”
We will implement the board’s decision once it has finished deliberating and will update this post accordingly. Please see the board’s website for the decision when they issue it.
We welcome the Oversight Board’s decision today on this case. Meta has acted to comply with the board’s decision immediately, and this content has been reinstated.
In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly. For more information, please see our Newsroom post about how we implement the board’s decisions.
After conducting a review of the recommendations provided by the board in addition to their decision, we will update this post.
On January 7, 2022, Meta responded to the board’s recommendations for this case. We are fully implementing one recommendation, implementing one in part, and assessing the feasibility of another.
The board reiterates its recommendation that Meta should explain to users that it enforces the Facebook Community Standards on Instagram, with several specific exceptions. While Meta may take other actions to comply with the recommendations, the board recommends Meta update the Introduction to the Instagram Community Guidelines (“The Short” Community Guidelines) within 90 days to inform users that if content is considered violating on Facebook, it is also considered violating on Instagram, as stated in the company’s Transparency Center, with some exceptions.
Our commitment: We are publishing updates to the Instagram Community Guidelines so they match the Facebook Community Standards in all of the shared policy areas. In the small number of instances where the policies differ, we will make it clear.
Considerations: In almost all areas, Facebook and Instagram share the same policies. In a small number of instances, the policies differ between the two apps because of the ways the products differ. For example, on Facebook our policies require accounts to use the name the person goes by in everyday life. On Instagram, people can use names on their accounts for other purposes, such as for a pet dog, a hobby, or a small business.
We do not believe adding a short explanation to the Community Guidelines introduction will fully address the board’s recommendation, and may lead to further confusion. Instead, we are working to update the Instagram Community Guidelines so that they are consistent with the Facebook Community Standards in all of the shared policy areas. Where the policies do differ, we will make these differences clear. We will publish the updated Instagram Community Guidelines in the coming months.
Next steps: Our work to update the Instagram Community Guidelines is underway. We expect to publish these changes in the coming months, and will provide a status report in a future Quarterly Update.
The board reiterates its recommendation that Meta should explain to users precisely what rule in a content policy they have violated.
Our commitment: We will continue to identify and implement the best ways of providing transparency to people when we enforce our policies, in line with our previous commitments to similar board recommendations.
Considerations: The board’s recommendations about giving more specific notice to people when they violate our policies are part of our broader efforts to improve people’s experiences with content moderation on our platforms. We’re building improvements in a number of areas, including consistency across our products, increased transparency in our messaging and greater visibility into our processes. We’re initially investing in the technology required to help create consistency, to ensure that people always receive the intended product experience.
As we reported in our most recent Quarterly Update, the board’s recommendation in the Armenians in Azerbaijan case (2021-005-FB-UA) led to the launch of more specific messaging in English on Facebook when someone violates our hate speech policy. As we’ve previously described, people using Facebook in English now receive more specific messaging when they violate our Hate Speech Community Standards. We’ve begun testing versions of this messaging on Facebook in Arabic, Spanish and Portuguese, and are working to expand to Instagram as well. We have also begun experimenting with this type of specific messaging in instances of bullying & harassment policy violations, and are continuing to review other potential opportunities to help improve experiences.
Next steps: We will provide updates on this continued work in future Quarterly Updates, alongside our updates on the recommendation from the Armenians in Azerbaijan case.
The board recommends that Meta modify the Instagram Community Guidelines and Facebook Regulated Goods Community Standard to allow positive discussion of traditional and religious uses of non-medical drugs where there is historic evidence of such use. The board also recommends that Meta make public all allowances, including existing allowances.
Our commitment: We are assessing the feasibility of this recommendation through our standard policy development process, including as a discussion in the Policy Forum. We are continuing to consider options for communicating policy allowances.
Considerations: Our restricted goods and services policy aims to deter potentially harmful activities and promote safety, while still allowing some discussion to advocate for changes to certain regulations of goods and services. This policy applies globally to people from various countries and cultural backgrounds. Before making any potential changes to this policy, we must weigh a recognition that certain substances may have cultural or religious significance for a community against the potential for harmful use. In order to understand how to best strike this balance on our platforms, we are planning to conduct robust policy development.
The policy development process includes a number of steps designed to create an informed set of options for consideration. As with previous policy development, we’ll first consult a range of external experts on this topic from a variety of backgrounds, regions, and perspectives. Additionally, we will conduct research on this topic to inform potential policy changes. We will meet with relevant teams within Meta to solicit their feedback and input. For example, internal teams provide insight on the feasibility of enforcing our policies at scale for people around the globe, both by human review and by automation. Then, subject matter experts will present a recommendation at the Policy Forum for discussion.
You can learn more about how the Policy Forum works, and read the meeting minutes from prior Policy Forums, here.
In addition, we are exploring ways of communicating policy allowances more transparently.
Next steps: We are beginning the policy development process, and plan to provide updates on the status and progress of this development in future Quarterly Updates.