Today, we’re publishing our second quarter reports for 2023, including the Oversight Board Quarterly Update, Widely Viewed Content Report, Community Standards Enforcement Report, and the Adversarial Threat Report. All of these reports are available in our Transparency Center.
Some report highlights include:
Adversarial Threat Report
In our Q2 Adversarial Threat Report, we’re sharing findings about five separate covert influence operations we took action against under our Coordinated Inauthentic Behavior policy. In addition to three networks that originated in Türkiye and Iran targeting audiences in Türkiye, we are publishing detailed threat research about two of the largest known cross-internet operations – one from China (known as Spamouflage) and one from Russia (known as Doppelganger in the security community). To enable further research by the open source community, we’re sharing threat indicators for each campaign to help our industry raise our collective defenses. More details here.
The Oversight Board Quarterly Update
This quarter, we completed work on 24 recommendations from the Oversight Board, implementing 19 of them in full, meaning we aligned fully with the board’s direction in each of those instances. This work continues to drive important changes to our policies, operations, and products.
In April, the board published its third Policy Advisory Opinion (PAO), specific to Meta’s treatment of harmful health misinformation in the context of COVID-19. As we announced in our response, we will take a more tailored, localized approach to our COVID-19 misinformation rules, consistent with the board’s guidance and our existing policies. Our COVID-19 misinformation rules will no longer be in effect globally, as the global public health emergency declaration that triggered those rules has been lifted. Overall, we are implementing or have already implemented 17 of the board’s 23 recommendations from this PAO in full or in part.
This quarter we also completed implementation of three Oversight Board recommendations that will allow people to better explain why they disagree with Meta’s content moderation decisions. We developed a new tool that gives people the opportunity to explain whether their content was intended to raise awareness, was satirical, or is regionally innocuous, all of which could potentially lead to different content moderation decisions. We expect it to be available to all Facebook and Instagram users worldwide by Q3 2023.
Other Integrity Updates:
In 2022, based on a recommendation from the Oversight Board, we published details on our newsworthiness allowance, including the total number of documented newsworthiness allowances and the number of those allowances issued for posts by politicians. From June 2022 to June 2023, we documented 69 newsworthiness allowances, of which nine (13%) were issued for posts by politicians. This updated data is now available in our Transparency Center.
Our Community Standards are a living set of guidelines, and it’s important that they keep pace with changes happening online and in the world, and that we constantly evaluate them to be sure we’re getting it right. We have heard feedback from experts, civil society groups, and users that our current Dangerous Organizations and Individuals policy all too often captures content such as news reporting, neutral discussion of current events, or even condemnation of terrorist and hate groups. That’s why we’re updating the Dangerous Organizations and Individuals portion of our Community Standards to allow for more social and political discourse, such as news reporting, discussion of human rights issues, and academic, neutral, or condemning discussion. Content that praises or supports these groups, or their violent actions or missions, is still prohibited.
In July, the academic journals Science and Nature published four landmark research papers to better understand the impact of Facebook and Instagram on key political attitudes and behaviors during the US 2020 election cycle. The papers mark the first time Meta, or any technology company, has opened itself up so transparently to comprehensive, peer-reviewed academic research into its impact during an election. The research published in these papers won’t settle every debate about social media and democracy, but we hope and expect it will advance society’s understanding of these issues.
As part of our ongoing work to provide young people with safe, positive online experiences, we’re providing more transparency into our efforts to find and report child exploitation to the National Center for Missing and Exploited Children (NCMEC). You can read more here.
The European Union’s Digital Services Act
Meta has long advocated for a harmonized regulatory regime that effectively protects people’s rights online, while continuing to enable innovation. This week the European Union’s Digital Services Act (DSA) began to fully apply in the EU to Facebook, Instagram and a number of other tech platforms and services. From early on, we’ve been supportive of the objectives of the DSA and we have introduced a number of measures to respond to these new rules, adapting and evolving the existing safety and integrity systems we have in place in many of the areas regulated by the DSA.