Content Restrictions
How we assess reports of content violating local law
When governments believe content on Facebook, Instagram, or Threads goes against local law, they may ask us to restrict the content. We may also receive court orders to restrict content or reports alleging content is unlawful from non-government entities and members of the public. We review these requests in line with our commitments as a member of the Global Network Initiative and Corporate Human Rights Policy.

Frequently asked questions


What is a request to restrict content based on local law?

When government authorities believe that something on Facebook, Instagram, or Threads goes against local law, they may contact Meta and ask us to remove the content. Similarly, we may receive takedown requests from courts, or reports from members of the public alleging that content is unlawful.

How does Meta review reports of content alleged to violate local law?

We have a robust process for reviewing reports alleging that content on Facebook, Instagram, or Threads goes against local law.
When we receive a report, we first review it against the Community Standards. If we determine that the content goes against our policies, we remove it. If the content does not go against our policies, we conduct a careful legal review to confirm whether the report is valid, along with human rights due diligence, in line with our commitments as a member of the Global Network Initiative and our Corporate Human Rights Policy.
In cases where we believe that reports are not legally valid, are overly broad, or are inconsistent with international human rights standards, we may request clarification or take no action.
Where we do act against organic content on the basis of local law rather than our Community Standards, we restrict access to the content only in the jurisdiction where it is alleged to be unlawful and do not impose any other penalties or feature restrictions. We also notify the affected user.
When we act against ads or Commerce content (such as Marketplace posts), we remove the content globally pursuant to our Advertising Policies and Commerce Policies, respectively. These items are not currently included in this report.

What is the life cycle of a government takedown request?

The image below depicts the life cycle of a government takedown request:
[Diagram: life cycle of a government takedown request]
Note: We may be compelled to deviate from the life cycle outlined above during emergency situations. While uncommon, a country’s law may occasionally obligate us to automatically restrict access to content at scale in specific countries, based on local law requirements. In such cases, we will continue to look to our Global Network Initiative commitments and Corporate Human Rights Policy to guide our approach.
INCOMING NOTICE FROM REGULATOR OR RELEVANT GOVERNMENT AUTHORITY
When regulators and relevant government authorities believe content on Facebook, Instagram, or Threads goes against local law, they may ask us to restrict the content. We may also receive takedown requests to restrict content from courts.
CONTENT POLICY REVIEW
The content first passes to our teams for review under our policies (e.g., Community Standards, Advertising Policies). If the content goes against those policies, the appropriate action is taken (e.g., the content is removed or age-gated). If the content is removed globally, the process ends at this stage and there is no need to proceed to the Legal Review or Human Rights Due Diligence Assessment. A response is issued to the relevant authority to inform them of the action taken, and impacted users are notified.
LEGAL REVIEW
If the content does not go against our policies, it is passed to our Legal team for an independent legal review. The team considers whether the request is procedurally valid and whether the content goes against local law. The request may be rejected at this stage; if so, a response is issued to the government authority to inform them that no action was taken.
HUMAN RIGHTS DUE DILIGENCE ASSESSMENT
If the report is not rejected at the legal review stage, it passes for a human rights due diligence assessment. We review all requests in line with our company principles and our commitments as a member of the Global Network Initiative (GNI) and our Corporate Human Rights Policy. In line with our Corporate Human Rights Policy, we recognize the diversity of laws in the locations where we operate, and where people use our products. We strive to respect domestic laws. When faced with conflicts between domestic legal obligations and our human rights and transparency commitments, we will seek to honor the principles of internationally recognized human rights to the greatest extent possible. In these circumstances we seek to promote international human rights standards by engaging with governments, and by collaborating with other stakeholders and companies.
The due diligence assessment takes the form of a five-part test: Legality, Legitimacy, Necessity, Proportionality, and External Risks. After review, we may restrict the content’s availability in the country where it is alleged to be unlawful.
Legality:
In addition to the assessment above under local law, we consider the extent to which the applicable law respects international human rights standards. This includes whether the law respects the principles of the rule of law, provides sufficient legal certainty, and protects people against arbitrariness, including through sufficient procedural safeguards as appropriate.
Legitimacy:
We assess whether the purpose of the takedown order corresponds to one of the legitimate aims listed under international human rights law. The legitimate aims listed in international human rights law include “respect for the rights of others, protection of national security, public order, public health or morals”.
Necessity:
In essence, we ask whether the restriction on free expression can be justified. Any restriction on content must be necessary in a democratic society and compatible with human rights principles, our company principles, and our commitments as a member of the GNI. In applying the necessity test to content, we note it has consistently been held that under the right to freedom of expression “there is little scope for restrictions on political speech or on the debate of questions of public interest”.
As part of this process, we therefore carry out a public interest assessment to evaluate whether the content at issue forms part of the wider public agenda, relates to sensitive or volatile political events or issues, concerns public figures or parties, and/or has received major news coverage. We will also consider the reach, engagement, virality and age of the content. In order to determine what it means to restrict this content, we also consider the relevant political structure of government and press and information freedom. In light of these factors, we assess whether the takedown request appears justified under international human rights law and consider our enforcement options.
Proportionality:
According to international human rights law, any restriction on content must be implemented by the least restrictive means. We assess the request and our enforcement options to ensure any action taken on the content is done in the most narrow and specific way possible, taking into account the product, tooling capabilities, and any temporal nature of the legal obligation.
External risks:
During the course of the assessment, many internal teams collaborate to understand whether there are any external risks associated with the recommended course of action.
External risks can include, but are not limited to, salient human rights concerns, the risk of blocking or throttling of Meta’s technologies, penalties, regulatory actions, criminal proceedings/arrests of employees or users, safety, and offline harm risks to people.
TAKE ACTION AND REPLY
If we take action on the content based on a legal request, we will then consider whether any mitigation measures may be deployed, such as appealing the takedown request, providing real-time transparency via a case study in this Transparency Center, sending a copy of the order to Lumen, or taking steps to limit the impact of the request.
Where we do act against organic content on the basis of local law rather than our Community Standards, we restrict access to the content only in the jurisdiction where it is alleged to be unlawful and do not impose any other penalties or feature restrictions. A response is issued to the government authority and impacted user to inform them of the action taken.
When we act against ads or Commerce content (such as Marketplace posts), we remove the content globally pursuant to our Advertising Policies and Commerce Policies, respectively. These items are not currently included in this report.
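The stages above form a sequential decision process: content policy review first, then legal review, then the five-part human rights due diligence test, with each stage able to end the process. As a purely illustrative sketch (the data model, function names, and decision logic below are hypothetical assumptions for clarity, not Meta's actual systems), the flow can be written as:

```python
# Hypothetical sketch of the takedown-request review stages described
# above. Nothing here reflects Meta's real systems or code.
from dataclasses import dataclass

@dataclass
class Request:
    jurisdiction: str
    procedurally_valid: bool      # assessed during legal review

@dataclass
class Content:
    violates_platform_policy: bool  # e.g. Community Standards
    violates_local_law: bool        # assessed during legal review
    passes_due_diligence: bool      # legality, legitimacy, necessity,
                                    # proportionality, external risks

def review(content: Content, request: Request) -> str:
    # Stage 1: content policy review. A policy violation is removed
    # globally and the process ends here.
    if content.violates_platform_policy:
        return "removed_globally"
    # Stage 2: independent legal review of procedural validity and
    # whether the content actually goes against local law.
    if not request.procedurally_valid or not content.violates_local_law:
        return "rejected_no_action"
    # Stage 3: human rights due diligence (the five-part test). Only a
    # request that survives all five checks leads to a restriction, and
    # only in the jurisdiction where the content is alleged unlawful.
    if content.passes_due_diligence:
        return f"restricted_in_{request.jurisdiction}"
    return "no_action"
```

Note how the structure mirrors the text: a global removal at stage 1 short-circuits everything later, while a restriction at stage 3 is always scoped to a single jurisdiction.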
If you have received a notice from us in your Support Inbox regarding content that we have restricted based on local law and wish to appeal the restriction, please contact the regulator to check whether you can appeal.
Notice to users who encounter content restricted due to a government takedown request:
[Image: notice shown to users who encounter restricted content]
Notice to users whose content is restricted due to a government takedown request:
[Image: notice shown to users whose content is restricted]

Does Meta notify people when it restricts content based on local law?

Yes. We tell people when we restrict something they posted based on a report that the content goes against local law, and we also tell people when they try to view something that is restricted in their country on the basis of a government takedown request. In the majority of cases, the notification also informs people which government authority sent the takedown request resulting in the restriction. We provide this notice except in limited instances where we are explicitly prohibited by applicable law from doing so.

Do you include information about government reports of content that violates the Community Standards?

No. This report only includes content that was restricted in specific jurisdictions based on local law. When we receive reports of content that violates our policies, we remove it entirely from our platform, regardless of the source of the report. You can learn more about our enforcement of the Community Standards in our Community Standards Enforcement Report.

Does this report include content restrictions based on local law for all Meta products?

This report reflects content restricted on the basis of local law on Facebook, Instagram, and Threads.

How does Meta decide when to share copies of government takedown requests with the public?

Meta uses five principles to guide our approach to sharing copies of government takedown requests publicly, including via the Lumen Database:
ONE: Maximize Transparency
In the interest of informing public discourse, facilitating research and journalism, and enabling people to hold their governments accountable, we will endeavor to publish copies of government and court takedown requests received by Meta, except where doing so would contravene one of the other principles below. Each published order should be accompanied by a summary of the actions Meta took in response.
We will seek to prioritize publication based on the UN Guiding Principles on Business and Human Rights criteria of scale, severity, and remediability.
TWO: Safety First
We will not publish takedown requests or information contained within them if doing so is likely to compromise the safety of any person (including any person who uses our services, any member of the public, or any Meta employee). Whenever possible, we will aim to publish takedown requests containing information that may pose a safety risk after redaction of all such information.
THREE: Respect Privacy
Prior to sharing with the Lumen project or publishing copies of takedown requests, we will redact takedown requests to ensure that they do not contain information on either a Meta user or the individual reporter of the content (e.g. names, personal email addresses, street addresses, phone numbers, etc.) and that only public URLs are shared.
FOUR: Prioritize Public Interest
We plan to publish all takedown requests that meet the rest of these principles in a timely manner and will expedite the publication of those with the highest public interest. In doing so, we will consider whether the content at issue forms part of the wider public agenda, relates to sensitive or volatile political events or issues, is about or authored by political figures or parties, has received major news coverage, or is from an entity of public prominence (e.g., a news source or the verified page of a public figure). We will also consider the reach, engagement, virality, and age of the content.
FIVE: Respect Legal Obligations
We strive to respect the law in the countries where we operate and have published bi-annual Transparency Reports since 2013, because we aim to be open and proactive in the way we safeguard users’ privacy, security, and access to information online. In some instances, we may be legally prohibited from publishing takedown requests or certain information contained within them. In such cases, we will aim to publish as much information as possible about the order and its existence.
In line with our Corporate Human Rights Policy, when faced with conflicts between domestic legal obligations and our human rights and transparency commitments, we will seek to honor the principles of internationally recognized human rights to the greatest extent possible. In these circumstances we seek to promote international human rights standards by engaging with governments, and by collaborating with other stakeholders and companies.