2023-010-IG-MR
Today, the Oversight Board selected a case referred by Meta regarding an Instagram post describing a user’s experience with ketamine administered by medical professionals to treat anxiety and depression. The user praises their experience, describing the altered mental state the substance induced and expressing their support for the use of psychedelics in treating mental health conditions.
Meta determined that this content did not violate our policy on Restricted Goods and Services, as laid out in our Instagram Community Guidelines and Facebook Community Standards, and left the content up.
Meta referred this case to the board because we found it significant and difficult, as it creates tension between our values of safety, voice and dignity.
While Meta typically does not allow content that “admits to personal use” or “promotes” non-medical drugs (the definition for which includes substances “used to achieve a high” or altered mental state), we allow broader discussion of pharmaceutical drugs, which are defined as “drugs that require a prescription or medical professionals to administer.” These two definitions in our policies can conflict when a drug is legally administered by medical professionals to treat mental illness and an altered mental state can itself be a goal of treatment. In this instance, we determined that the admission and promotion of ketamine use as a medically administered pharmaceutical is allowed because it is in line with our overall policy of promoting discussion about medical treatment.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision when they issue it.
We welcome the Oversight Board’s decision today on this case. The board overturned Meta’s original decision to maintain the content on Instagram. Meta will act to comply with the board's decision and remove the content within 7 days.
After reviewing the recommendations the board provided alongside its decision, we will update this page.
Meta should clarify the meaning of the "paid partnership" labels in its Transparency Centre and Instagram's Help Centre. That includes explaining the role of business partners in the approval of "paid partnership" labels. The Board will consider this recommendation implemented when Meta's Branded Content policies have been updated to reflect these clarifications.
Our commitment: We provide specific instructions on tagging brand partners within our “paid partnership” label along with ways in which brand partners can approve such labels in the Facebook Business Center and Instagram Help Center. To increase its accessibility, we will update our Transparency Center to include this information as well.
Considerations: Currently, we provide general information about our Branded Content policies in the Transparency Center and provide further details about these policies (including specific instructions on how and when to add the “paid partnership” label to content and tag brands in such labels) in our Instagram Help Center and Facebook Business Center.
We are reviewing the existing spaces where we share this information and will ultimately provide information about our Branded Content policies in our Transparency Center. As part of this, we will include specific instructions on tagging brand partners in the “paid partnership” label as well as the ways in which brand partners can approve such labels.
Meta should clarify in the language of the Restricted Goods and Services Community Standard that content that "admits to using or promotes the use of pharmaceutical drugs" is allowed, even where that use may result in a "high" in the context of "supervised medical setting". Meta should also define what a "supervised medical setting" is and explain under the Restricted Goods and Services Community Standard that medical supervision can be demonstrated by indicators such as a direct mention of a medical diagnosis, a reference to the health service provider's license or to medical staff. The Board will consider this recommendation implemented when Meta's Restricted Goods and Services Community Standard has been updated to reflect these clarifications.
Our commitment: Our Community Standards generally focus on content we may remove as potentially harmful, including certain discussions of non-medical or pharmaceutical drugs. We allow broader discussion of pharmaceutical drugs, which we define as drugs that require a prescription or medical professionals to administer, because these types of discussions are often important to protect users' voice, their ability to discuss their health challenges, and their ability to share information that may help others. We do not require users to indicate in their posts that the use of these drugs occurs in a "supervised medical setting." Requiring this level of detail could result in the removal of content where there is no indication the pharmaceutical drug has been misused, and would unnecessarily require users to share personal health information about themselves or others. If the overall content suggests that a pharmaceutical drug has been misused to achieve a high, we may treat it as a non-medical drug and remove the content.
Considerations: Our Restricted Goods and Services policy aims to encourage safety while allowing discussion about these goods and services, and we share details about instances where we may remove harmful content or restrict the visibility of content for minors in our Transparency Center. Our Community Standards define pharmaceutical drugs as “drugs that require a prescription or medical professionals to administer.” However, we recognize that pharmaceutical drugs can be used outside of their intended purposes, and incorporate this understanding into our definition of non-medical drugs: “drugs or substances that are not being used for an intended medical purpose or are used to achieve a high.” We do not allow admission of personal use, coordination, or promotion of non-medical drugs, including the misuse of pharmaceutical drugs. We consider a number of contextual factors to identify this type of content.
Our policy includes details about when we would remove content that attempts to buy, sell, trade or ask for pharmaceutical drugs in certain contexts. We allow users to discuss their use of pharmaceuticals given that these medications have legitimate uses for health conditions, and we do not focus our definition of pharmaceuticals on whether a “high” is a potential side effect of use.
While we define “pharmaceutical drugs” as “drugs that require a prescription or medical professionals to administer,” we are concerned that requiring users to include specific language about a “supervised medical setting” in order to post content about these drugs that we would otherwise allow would unduly restrict speech on our platforms. Such a requirement could result in the removal of legitimate speech by limiting people’s discussions of pharmaceutical drugs, even where there is no evidence of misuse, to instances where they provide details about a provider’s license, indicate the presence of medical staff, or discuss their medical diagnosis. For example, if someone regularly posts updates about a health condition on Facebook to share with friends and family and wishes to discuss a new treatment, but does not include details about medical supervision in their post, this policy change would require us to remove that content. We also believe that some may interpret the phrase “supervised medical setting” to exclude taking prescribed pharmaceutical drugs at home or in other places outside a medical office. We do not believe this is the intent of the Board’s decision. For these reasons, we believe that applying this recommendation would result in unnecessary and disproportionate removal of important and helpful speech. We will have no further updates on this recommendation.
Meta should improve its review process to ensure that content created as part of a "paid partnership" is properly reviewed against all applicable policies (i.e. Community Standards and Branded Content policies), given that Meta does not currently review all branded content under the Branded Content policies. In particular, Meta should establish a pathway for at-scale content reviewers to route content potentially violating the Branded Content policies to Meta's specialist teams or automated systems that are able and trained to apply Meta's Branded Content policies when implicated. The Board will consider this implemented when Meta shares its improved review routing logic, showing how it allows for all relevant platform/content policies to be applied when there is a high likelihood of potential violation of any of the aforementioned policies.
Our commitment: As of August 2023, Meta launched infrastructure that sends all Instagram content with a “paid partnership” label for review against our Branded Content policies, regardless of whether the tagged brand partner has approved the use of the “paid partnership” label. Similarly, as of May 2023, Meta launched infrastructure that sends all Facebook content with a “paid partnership” label for review against our Branded Content policies. We now consider this recommendation complete and will have no further updates.
Considerations: Prior to the August 2023 launch, only “paid partnership” labeled Instagram content where the tagged brand partner approved the use of the label was sent for review against our Branded Content policies. In the ketamine case, the content was not sent for review against our Branded Content policies because the “paid partnership” label on the content had not been approved by the tagged brand partner.
Prior to the May 2023 launch, only a sample of “paid partnership” labeled Facebook content was sent for review against our Branded Content policies.
Content sent for review against our Branded Content policies is prioritized based on its total number of unique views and on machine learning automation that determines whether the content (in the post’s text or media) potentially violates the Branded Content policies. Thus, content that likely violates our Branded Content policies or has a wide reach is escalated for human review.
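As an illustration only, the prioritization described above can be thought of along the lines of the following minimal Python sketch. The field names, threshold and ordering are assumptions made for readability, not a description of Meta's actual systems.

    from dataclasses import dataclass

    @dataclass
    class LabeledPost:
        post_id: str
        unique_views: int        # total unique views of the post
        violation_score: float   # assumed ML estimate that the post violates the Branded Content policies

    def review_priority(post: LabeledPost, score_threshold: float = 0.8):
        """Rank posts so that likely-violating or widely viewed content is reviewed first."""
        likely_violating = post.violation_score >= score_threshold
        # Sort key: likely violations first, then by reach, then by model score.
        return (not likely_violating, -post.unique_views, -post.violation_score)

    def build_review_queue(labeled_posts):
        # Every post carrying a "paid partnership" label enters the queue,
        # regardless of whether the tagged brand partner approved the label.
        return sorted(labeled_posts, key=review_priority)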
Given that all Instagram and Facebook content with a “paid partnership” label is now being sent for review against our Branded Content policies, Meta considers this recommendation fully implemented, as this has established a pathway for all content with a “paid partnership” label to be routed to Meta’s specialized teams or automated systems trained to apply Meta’s Branded Content policies.
Meta should audit the enforcement of policy lines from its Branded Content policies ("we prohibit the promotion of the following [...] 4. Drugs and drug-related products, including illegal or recreational drugs") and Restricted Goods and Services Community Standard ("do not post content that attempts to buy, sell, trade, co-ordinate the trade of, donate, gift or asks for non-medical drugs"). The Board finds that Meta has clear and defensible approaches that impose strong restrictions on the paid promotion of drugs (under its Branded Content policies) and attempts to buy, sell or trade drugs (under its Restricted Goods and Services Community Standard). However, the Board finds some indication that these policies could be inconsistently enforced. To clarify whether this is indeed the case, Meta should engage in an audit of how its Branded Content policies and its Restricted Goods and Services Standard are being enforced with regard to pharmaceutical and non-medical drugs. It should then close any gaps in enforcement. The Board will consider this implemented when Meta has shared the methodology and results of this audit and disclosed how it will close any gaps in enforcement revealed by that audit.
Our commitment: We have completed the launch of infrastructure on Facebook and Instagram that sends all content with a “paid partnership” label for review against our Branded Content policies and, in doing so, improved the overall performance of Branded Content enforcement. In H1 2024, we will assess the feasibility of conducting a more extensive audit of the Branded Content policies alongside the Restricted Goods and Services policies.
Considerations: As discussed in Recommendation #3, in May and August 2023, we launched infrastructure that ensures all Instagram and Facebook content with a “paid partnership” label is sent for review against our Branded Content policies. Also in the first half of 2023, we launched improved machine learning automation that was shown to increase overall system performance for all Branded Content enforcement.
Our enforcement systems act on both proactively detected content and reported content that may violate our Restricted Goods and Services Community Standard. Dedicated Restricted Goods and Services classifiers action content automatically and, in scenarios where the classifiers are not confident, the content is sent for human review.
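A minimal sketch of this kind of confidence-based routing is below; the thresholds and function names are illustrative assumptions, not Meta's actual values or systems.

    def route_content(violation_score: float,
                      action_threshold: float = 0.95,
                      review_threshold: float = 0.60) -> str:
        """Route a post based on a classifier's estimated probability of violation."""
        if violation_score >= action_threshold:
            return "auto_action"    # classifier is confident: action the content automatically
        if violation_score >= review_threshold:
            return "human_review"   # classifier is not confident: escalate to a human reviewer
        return "no_action"          # no likely violation detected

    # Example: an uncertain score is escalated rather than actioned automatically.
    assert route_content(0.72) == "human_review"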
While we are still finalizing process changes following the May and August 2023 infrastructure launches, we believe these will close any gaps in enforcement between the Branded Content policies and the Restricted Goods and Services Community Standard because, now, content with a “paid partnership” label that is either proactively or reactively sent for review against our Community Standards will also be reviewed separately against our Branded Content policies.
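Purely as an illustrative sketch of that dual pathway (the function and queue names are hypothetical), the routing could look like:

    def enqueue_for_review(post, has_paid_partnership_label: bool,
                           community_standards_queue: list,
                           branded_content_queue: list) -> None:
        # Any post flagged proactively or via a report is reviewed against the Community Standards.
        community_standards_queue.append(post)
        # Posts with a "paid partnership" label are additionally reviewed,
        # separately, against the Branded Content policies.
        if has_paid_partnership_label:
            branded_content_queue.append(post)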
In H1 2024, we will assess the feasibility of conducting a more extensive audit of both the Branded Content and the Restricted Goods and Services policies to determine whether branded content and content found to be potentially violating the Restricted Goods and Services policies are enforced consistently and, if not, what we need to do to improve the consistency of our enforcement. We will provide an update in a future Quarterly Update.