2023-001-FB-UA
Today, the Oversight Board selected a case appealed by a Facebook user regarding a video and caption that call into question the election of Brazilian President Luiz Inácio Lula da Silva and include calls to “siege” Brazil’s Congress as “the last alternative.” The video depicts a speech by a Brazilian general in which he calls for people to “hit the streets” and “go to the National Congress... [and the] Supreme Court,” followed by images including a fire raging in the Three Powers Plaza in Brasília (home to Brazil’s presidential offices, Congress, and Supreme Court) and an image with the words “we demand the source code,” a slogan used by protestors to question the reliability of Brazil’s electronic voting machines.
Upon initial review, we found the content to be non-violating and left it up. Upon further review, however, we determined the post had been left up in error: it did in fact violate our Violence and Incitement policy, given that Brazil had been designated a Temporary High-Risk Location as laid out in the Facebook Community Standards. We therefore removed the content.
We will implement the board’s decision once it has finished deliberating, and we will update this post accordingly. Please see the board’s website for the decision when they issue it.
We welcome the Oversight Board’s decision today on this case. The board overturned Meta’s original decision to leave this content up. Meta previously removed this content, so no further action will be taken on it.
In accordance with the bylaws, we will also initiate a review of identical content with parallel context. If we determine that we have the technical and operational capacity to take action on that content as well, we will do so promptly. For more information, please see our Newsroom post about how we implement the board’s decisions.
After reviewing the recommendations the board provided alongside its decision, we will update this page.
Meta should develop a framework for evaluating the company’s election integrity efforts. This includes creating and sharing metrics for successful election integrity efforts, including those related to Meta’s enforcement of its content policies and the company’s approach to ads. The Board will consider this recommendation implemented when Meta develops this framework (including a description of metrics and goals for those metrics), discloses it in the company’s Transparency Center, starts publishing country-specific reports, and publicly discloses any changes to its general election integrity efforts as a result of this evaluation.
Our commitment: While Meta already uses a variety of metrics and ongoing processes to monitor critical events in real time, we are committed to improving how we evaluate the success of our election integrity efforts and to increasing transparency about their impact.
Considerations: As shared in our 2022 Brazilian Election by the Numbers and 2022-Q2 Adversarial Threat reports, Meta has robust processes and systems in place to monitor ongoing critical events in targeted regions. Specifics on some of these processes (e.g., Integrity Product Operations Centers, or IPOCs) can be found in our response to recommendation #2 as well as in our Newsroom. Additionally, our Business Help Center outlines Meta’s political ads transparency tools and efforts, along with guidance on our ad review framework accompanied by review examples from around the world.
Our global enforcement teams deploy data monitoring as well as proactive and reactive crisis management tools during elections, and we continue to share publicly the tangible steps we take around elections. Our Global Operations support teams also conduct election post-mortems and archive the resulting insights to inform future election efforts. To this end, our teams have developed an internal process to monitor real-world data related to scheduled critical events, like elections, and unscheduled critical events, like protests, to guide how we scale our integrity responses.
As part of our ongoing efforts, our teams are assessing the feasibility of consolidating these inputs into core metrics that track our work throughout the lifecycle of expected critical events (before, during, and after), and exploring how much of this we might be able to share with the public. This might include exploring new baseline or additive metrics that build on existing frameworks. For example, as we continue to develop and improve our internal processes during critical events, we are evaluating how to better explain how these measures are deployed, monitored, and wound down. We expect to provide further updates in future Quarterly Reports.
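Purely as an illustration of what a lifecycle-based framework of metrics and goals could look like, the sketch below groups hypothetical metrics by phase of a critical event and flags goals that were missed. The class names, the example metric, and the assumption that lower observed values are better are inventions for this example; they do not describe Meta's internal systems.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Phase(Enum):
    """Lifecycle phases of a scheduled critical event such as an election."""
    BEFORE = "before"
    DURING = "during"
    AFTER = "after"


@dataclass
class Metric:
    """A hypothetical election-integrity metric with a stated goal.

    Lower observed values are assumed to be better (e.g. review latency).
    """
    name: str                         # e.g. "median hours to review flagged election content"
    unit: str                         # e.g. "hours"
    goal: float                       # target the metric is measured against
    observed: Optional[float] = None  # filled in as the event unfolds


@dataclass
class ElectionIntegrityFramework:
    """Country-specific metrics and goals, grouped by lifecycle phase."""
    country: str
    event: str
    metrics: dict[Phase, list[Metric]] = field(default_factory=dict)

    def add(self, phase: Phase, metric: Metric) -> None:
        self.metrics.setdefault(phase, []).append(metric)

    def unmet_goals(self, phase: Phase) -> list[Metric]:
        """Metrics in a phase whose observed value misses the stated goal."""
        return [m for m in self.metrics.get(phase, [])
                if m.observed is not None and m.observed > m.goal]


# Example: a country-specific report could enumerate unmet goals per phase.
framework = ElectionIntegrityFramework(country="BR", event="2022 general election")
framework.add(Phase.DURING, Metric("median hours to review flagged election content",
                                   "hours", goal=24.0, observed=30.0))
print(framework.unmet_goals(Phase.DURING))
```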
Meta should clarify in its Transparency Center that, in addition to the Crisis Policy Protocol, the company runs other protocols in its attempt to prevent and address potential risk of harm arising in electoral contexts or other high-risk events. In addition to naming and describing those protocols, the company should also outline their objective, what the points of contact between these different protocols are, and how they differ from each other. The Board will consider this recommendation implemented when Meta publishes the information in its Transparency Center.
Our commitment: We will share more information about our various election integrity processes, protocols, and systems and how they all interrelate here. We will also explore opportunities to highlight this information elsewhere on the Transparency Center.
Considerations: Meta has regional teams across the company that identify and mitigate potential risks around the world throughout the year. Depending on the risk level of a given election, those teams develop and implement dedicated mitigation plans months or even years before an election. We publicly describe our tools and systems for adapting to situations of heightened risk on our Transparency Center. This includes continuously monitoring impacts to our platforms and the people who use them, in terms of both expression and safety, and adjusting our measures in response to any spikes or changes in the signals we’re tracking.
As part of Meta’s elections preparation and response work, a number of teams, including our Human Rights, Civil Rights, Policy, Product, Engineering, and Operations teams, identify election-related content trends and incorporate them into our content risk mitigation strategies. Prior to an election, our risk assessment processes gather information from many sources to identify potentially harmful trends. These sources include public reports, recommendations from our trusted partners, ongoing observations of content trends, and assessments from our intelligence teams. The results, among other factors, help inform a number of product and policy mitigations, including designating places with upcoming high-risk elections as Temporary High-Risk Locations (“THRLs”), a designation used to identify markets in need of additional monitoring and support.
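As a simplified, hypothetical sketch of how signals from sources like those named above could be combined into a single risk score that informs, but does not decide, a THRL designation, the example below uses invented weights, thresholds, and function names; it is not a description of Meta's actual risk assessment tooling.

```python
from dataclasses import dataclass


@dataclass
class RiskSignal:
    """One input to a pre-election risk assessment."""
    source: str      # one of the SOURCE_WEIGHTS keys below
    severity: float  # analyst-assigned severity on a 0-1 scale (hypothetical)


# Hypothetical weights reflecting how much each source contributes to the score.
SOURCE_WEIGHTS = {
    "public_reports": 0.2,
    "trusted_partner_recommendations": 0.3,
    "content_trend_observations": 0.2,
    "intelligence_assessments": 0.3,
}


def assess_election_risk(signals: list[RiskSignal]) -> float:
    """Combine weighted signals into a single risk score between 0 and 1."""
    if not signals:
        return 0.0
    weighted = sum(SOURCE_WEIGHTS.get(s.source, 0.1) * s.severity for s in signals)
    total_weight = sum(SOURCE_WEIGHTS.get(s.source, 0.1) for s in signals)
    return weighted / total_weight


def flag_for_thrl_review(score: float, threshold: float = 0.7) -> bool:
    """Surface a market for human review of a possible THRL designation.

    The score is one input among other factors; it does not mechanically
    decide the designation.
    """
    return score >= threshold
```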
As described in our response to the case regarding a post calling for violence in Ethiopia (recommendation #2), Meta stands up Integrity Product Operations Centers (“IPOCs”), as needed, to bring together subject matter experts from across the company to respond in real time to potential problems and trends. For some planned events, such as elections in high-risk locations, Meta schedules IPOCs in advance. During a given election, we also monitor sources similar to those we use to conduct our risk assessments. These include, among other things, data analysis tools, inputs from trusted partners, and monitoring of traditional media, as well as trends in user reports and content flagged by our classifiers.
These IPOCs often work in conjunction with our Crisis Policy Protocol (“CPP”) to help us assess how to address content risks. The THRL designation process and the CPP designation review process are separate, though both draw on some of the same signals of heightened risk of violence or offline harm. While we may consider designating a THRL during a crisis, that designation does not depend on a CPP designation. We hope this response provides useful public clarity on the various protocols Meta deploys to prevent and address potential harms that may arise in electoral or other high-risk contexts. We will continue to explore additional opportunities to share public updates about this work and will provide further information in future Quarterly Updates.
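To make that relationship concrete, the deliberately simplified sketch below shows two independent review functions that read the same shared signals but reach their designations separately, so neither outcome is a precondition for the other. The thresholds, field names, and decision rules are illustrative only and do not reflect Meta's actual protocols.

```python
from dataclasses import dataclass


@dataclass
class SharedRiskSignals:
    """Signals of heightened risk of violence or offline harm, read by both reviews."""
    violence_indicators: float         # hypothetical 0-1 aggregate
    credible_offline_harm_reports: int


def review_thrl(signals: SharedRiskSignals) -> bool:
    """THRL review: flags a market for additional monitoring and support."""
    return (signals.violence_indicators >= 0.6
            or signals.credible_offline_harm_reports >= 25)


def review_cpp(signals: SharedRiskSignals) -> bool:
    """CPP review: assesses whether a crisis designation is warranted."""
    return signals.violence_indicators >= 0.8


def run_reviews(signals: SharedRiskSignals) -> dict:
    """Both reviews consume the same signals; neither outcome gates the other."""
    return {
        "thrl_designated": review_thrl(signals),
        "cpp_designated": review_cpp(signals),
    }
```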