Today, the Oversight Board accepted a policy advisory opinion referral from Meta regarding how we handle the sharing of private residential information.
Facebook does not allow users to post certain private information, including residential addresses; doing so violates our policy on Privacy Violations and Image Privacy Rights as laid out in our Community Standards.
Meta has requested guidance on this policy from the board because we find the issue both significant and difficult: it creates tension between our values of voice and safety. Access to residential addresses can be an important tool for journalism, civic activism and other public discourse. However, exposing this information without consent can also create a risk to an individual’s safety and infringe on privacy.
In this policy advisory opinion referral, Meta is asking for guidance on the following questions:
What information sources should render private information “publicly available”? (For instance, should we factor into our decision whether an image of a residence was already published by another publication?)
Should sources be excluded when they are not easily accessible or trustworthy (such as data aggregator websites, the dark web, or public records that cannot be digitally accessed from a remote location)?
If some sources should be excluded, how should Meta determine the type of sources that won’t be considered in making private information “publicly available”?
If an individual’s private information is simultaneously posted to multiple places, including Facebook, should Facebook continue to treat it as private information or treat it as publicly available information?
Should Facebook remove personal information despite its public availability, for example in news media, government records, or the dark web? That is, does the availability on Facebook of publicly available but personal information create a heightened safety risk that compels Facebook to remove the information, which may include removing news articles that publish such information or individual posts of publicly available government records?
Once the board has finished deliberating, we will consider and publicly respond to its recommendations within 30 days, and will update this post accordingly. Please see the board’s website for the recommendations when they issue them.
We welcome the Oversight Board’s decision today on this policy advisory opinion referral.
After conducting a review of the recommendations provided by the board, we will update this post.
Meta should remove the exception that allows the sharing of private residential information (both images that currently fulfill the Privacy Violations policy’s criteria for takedown and addresses) when considered “publicly available”. The Board will consider this implemented when Meta modifies its Internal Implementation Standards and its content policies.
Our commitment: We will remove the existing “publicly available” exception to the Privacy Violations policy, while also accounting for the proposed changes in the third recommendation of this policy advisory opinion (PAO).
Considerations: As the board notes in this recommendation, removing the exception for “publicly available” private residential information may limit the availability of this information on Facebook and Instagram when it is still publicly available elsewhere. However, we recognize that implementing this recommendation can strengthen privacy protections on our platforms. We will fully implement this recommendation, as well as the board’s recommendation that we allow the sharing of imagery that displays the external view of private residences in various scenarios, but not when there is a context of organizing protests against the resident (see recommendation 3, below).
Next steps: We will provide updates on the status of implementing these recommendations in future Quarterly Updates. We anticipate implementing this recommendation and recommendation 3 together, by the end of the year.
Meta should develop and publicize clear criteria for content reviewers to escalate for additional review of public interest content that potentially violates the Community Standards but may be eligible for the newsworthiness exception, as previously recommended in case decision 2021-010-FB-UA. The Board will consider this implemented when Meta publicly shares these escalation criteria.
Our commitment: Our teams will continue to assess potential newsworthy allowances by applying the balancing test we have published in the Transparency Center. This approach requires discretionary decision-making of the kind that could introduce bias or inconsistency if applied at scale. For this reason, it is only used by our specialized teams that review a limited set of the most difficult content decisions that have been escalated to them, often as a result of contextual circumstances. Determinations made this way may result in an allowance for a single piece of content, or in a broader allowance communicated through new guidance that we distribute to reviewers at scale.
Considerations: Our content reviewers must be able to act on our policies quickly, consistently and globally. In the Introduction to the Community Standards, we explain the distinction between scaled policies and policies where we require additional information and/or context to enforce. Scaled policies are written to give our reviewers clear guidance in order to minimize subjectivity in decision-making and promote consistent outcomes. By contrast, we enable specialized teams to take a more nuanced approach to the subset of our policies that require additional context.
Because of the subjective nature of newsworthiness allowances, we rely on specialized teams to conduct these newsworthiness balancing tests, reducing bias and subjectivity in the analysis. However, the training we provide to at-scale content reviewers makes clear that they can escalate content they identify as potentially newsworthy to our specialized teams for further assessment. Reviewers use their regional and cultural knowledge to assess whether the content may be in the public interest and should be evaluated under the balancing test laid out in the Transparency Center.
Once content is escalated for additional review, specialized teams with policy expertise and regional knowledge weigh the public interest value against the risk of harm of keeping the piece of content on the platform, in line with the balancing test we describe in the Transparency Center. Our approach to determining the potential newsworthiness of a piece of content is rooted in international human rights standards, and involves conducting a balancing test that weighs the public interest against the risk of harm. We outline the contextual factors we use in making this determination in the Transparency Center, including political issues and the extent to which there is a free press.
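As a purely illustrative sketch of how such a balancing test might be structured in code (the factor names, weights and threshold below are hypothetical assumptions, not our actual criteria or tooling):

```python
from dataclasses import dataclass

@dataclass
class NewsworthinessContext:
    """Hypothetical contextual factors for a newsworthiness balancing test."""
    public_interest_score: float  # e.g. relevance to elections or public debate
    harm_risk_score: float        # e.g. risk of violence or privacy harm
    free_press_limited: bool      # whether a free press is restricted in the region

def allow_as_newsworthy(ctx: NewsworthinessContext, threshold: float = 0.0) -> bool:
    """Weigh public interest against the risk of harm.

    Illustrative only: the real test is a contextual, human-led review by
    specialized teams, not a numeric formula.
    """
    public_interest = ctx.public_interest_score
    # Where a free press is limited, public speech on the platform may carry
    # greater public-interest value (one of the contextual factors we outline).
    if ctx.free_press_limited:
        public_interest *= 1.5  # hypothetical weighting
    return (public_interest - ctx.harm_risk_score) > threshold
```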
We believe this is the right way to balance consistency at scale on the one hand, with nuance in more sensitive, public-interest cases on the other.
In line with our commitment to recommendation #2 in the Post Depicting Protests in Colombia While Using a Slur case (2021-010-FB-UA), we will publish more information to the newsworthiness page of the Transparency Center later this year to provide additional information about our approach to determining newsworthiness.
Next steps: We will have no further updates on this recommendation.
Meta should allow the sharing of “imagery that displays the external view of private residences” when the property depicted is the focus of the news story, except when shared in the context of organizing protests against the resident. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will implement this recommendation fully, together with recommendation #1.
Considerations: As described in our response to recommendation #1, above, we will allow the sharing of imagery that displays the external view of private residences when the property is the focus of a news story, except when shared in the context of organizing protests against the resident.
Next steps: We will track future progress on this work as part of our updates to recommendation #1.
Meta should allow the publication of addresses and imagery of official residences provided to high-ranking government officials. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will allow for the organization of protests at publicly owned official residences on Facebook and Instagram, in cases where we can accurately identify these locations.
Considerations: Determining a definition of “high-ranking officials” and applying it in countries and communities around the world is complex, particularly for local governments. For example, in the United States, positions such as “mayor of a city” may or may not come with a publicly owned official residence. A city such as New York or Los Angeles may offer a residence, but in many other places, mayors and local officials live in private residences.
To ensure internal consistency, we will use our existing guidance from other policy areas to determine whether residences are “publicly owned official residences.” This guidance covers globally recognized official positions, such as ambassadors and heads of state, whose holders are more likely to reside in publicly owned official residences.
As the board noted in its decision, private residences may not have the same security measures in place as a public residence. In line with the board’s guidance, the implementation of this recommendation will not include private residences of government officials.
Next steps: We will provide updates in future Quarterly Updates on the implementation of this recommendation. To begin our implementation, we will modify our internal guidance and update our training materials for reviewers.
Meta should allow the resharing of private residential addresses when posted by the affected user themselves or when the user consented to its publication. Users should not be presumed to consent to private information posted by others. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will continue to allow people to share their own private residential addresses, consistent with our Privacy Violations policy. However, it is often impossible to know whether a resident has consented to another person sharing their private address, and, as the board suggests, we never assume that a resident has consented. Because of this, we will not adopt the resharing portion of this recommendation.
Considerations: Our existing policy allows people to share their own or another’s private residential addresses in certain situations, including the promotion of charitable causes, contacting business service providers or finding missing people, animals or objects. If we expanded this exception to all situations at scale, we would not be able to determine quickly or accurately whether the affected parties had consented to the information being shared, increasing the risk of privacy violations. In line with the board’s recommendation that we not presume that people consent to their private information being posted by others, we will leave this part of the policy unchanged.
Next steps: We will have no further updates on this recommendation.
Users should have a quick and effective mechanism to request the removal of private information posted by others. The Board will consider this implemented when Meta demonstrates in its transparency reports that user requests to remove their information are consistently and promptly actioned.
Our commitment: In response to this recommendation, we are running an experiment to determine whether changes to how people report content they believe violates our Privacy Violations policy affect the quality and volume of reports.
Considerations: As a result of this recommendation, we are testing new ways to make it easier for people to report content they think violates the Privacy Violations policy. Currently, people need to click through two menus and search for the term “Privacy Violation” before they can report a potential privacy violation. The experiment makes the “Privacy Violation” reporting option more prominent. A person will no longer need to search for it. We are interested in seeing whether this change impacts the quality and volume of reports under this policy. Regardless of whether a person selects “Privacy Violation” in their report, however, we will continue to evaluate every report against all of our Community Standards.
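The principle in that last sentence can be sketched as follows; the policy names and keyword triggers are hypothetical placeholders, not our actual classifiers or review tooling:

```python
# Hypothetical keyword triggers standing in for real reviewers and classifiers.
POLICY_TRIGGERS = {
    "Privacy Violations": ["home address"],
    "Bullying and Harassment": ["targeted insult"],
}

def evaluate_report(content: str, selected_category: str | None = None) -> list[str]:
    """Return every policy the reported content appears to violate.

    The category a person selects can make reporting easier, but it does not
    limit which Community Standards the content is evaluated against.
    """
    text = content.lower()
    return [
        policy
        for policy, triggers in POLICY_TRIGGERS.items()
        if any(trigger in text for trigger in triggers)
    ]
```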
We note that the board’s recommendation is not limited to private residential information. Beyond reporting the disclosure of residential addresses, there are additional ways for people to report privacy violations. These include reporting non-consensual intimate imagery under the Adult Sexual Exploitation policy in the Community Standards and using the Help Center to report other kinds of photos and videos that might violate their privacy.
Next steps: We expect to have results from the experiment later this month. We’ll determine next steps based on the findings, and anticipate sharing additional information in a future Quarterly Update.
Meta should clarify in the Privacy Violations policy when disclosing the city where a residence is located will suffice for the content to be removed, and when disclosing its neighborhood would be required. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will explore ways to clarify when identification of a city (versus identification of a neighborhood) is sufficient to satisfy one of the four required components for removal of imagery that displays the external view of private residences.
Considerations: We are considering a number of ways to implement this recommendation, acknowledging the difficulty of setting a globally applicable framework given the wide variation in types of localities. In many places, revealing the city in which a private residence is located, along with imagery of the external view, is significantly different from revealing the neighborhood. We may consider policy changes that require both a city and an additional identifier such as a postal code, neighborhood, street or unique landmark. However, those indicators could be (and likely are) different across regions. With this in mind, we’ll need to analyze various cases to understand what additional indicators are likely to appear and assess what is feasible for reviewers.
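One possible form such a rule could take, sketched with hypothetical identifier types (this illustrates an option still under assessment, not a final policy):

```python
# Hypothetical finer-grained identifiers that could be required alongside a city.
ADDITIONAL_IDENTIFIERS = {"postal_code", "neighborhood", "street", "unique_landmark"}

def location_sufficiently_specific(disclosed: set[str]) -> bool:
    """One candidate rule: a city alone does not satisfy the location component;
    the city plus at least one finer-grained identifier does."""
    return "city" in disclosed and bool(disclosed & ADDITIONAL_IDENTIFIERS)

# Example: {"city"} -> False; {"city", "postal_code"} -> True
```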
Next steps: We will provide updates on the status of our assessment in future Quarterly Updates, and hope to complete this work by the end of the year.
Meta should explain, in the text of Facebook’s Privacy Violations policy, its criteria for assessing whether the resident is sufficiently identified in the content. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will develop internal criteria for our content reviewers to assess if a resident has been sufficiently identified. We are working to publish more of the resources our content reviewers use so our enforcement protocols are more transparent.
Considerations: The board recommends that we clarify whether a person’s full or partial name needs to be shared alongside residential information, or whether other identifiers, such as a photo, sufficiently identify a resident. We will work to develop and publish criteria that more clearly outline what we consider to be identification of a resident, with the goal of protecting privacy and preventing potential harm.
Next steps: We will provide updates in future Quarterly Updates, and hope to complete this work by the end of the year.
The Board reiterates its recommendation that Meta should explain to users that it enforces the Facebook Community Standards on Instagram, with several specific exceptions. The Board recommends Meta update the introduction to the Instagram Community Guidelines within 90 days to inform users that if content is considered violating on Facebook, it is also considered violating on Instagram, as stated in the company’s Transparency Center, with some exceptions. The Board will consider this implemented when Meta modifies its content policies.
Our commitment: We will update the Instagram Community Guidelines to match the Facebook Community Standards and identify the small number of instances where the policies differ.
Considerations: As we explained in our response to recommendation #1 from the Post Discussing a Substance with Psychoactive Properties case (2021-013-IG-UA), we do not believe adding a short explanation of the minor differences between the two apps’ policies to the Community Guidelines introduction will fully address the board’s recommendation, and it may lead to further confusion. Instead, we are working to update the Instagram Community Guidelines so that they are consistent with the Facebook Community Standards in all of the shared policy areas. This more sustainable approach is also more time-intensive, so we cannot commit to publishing a change within 90 days. We are continuing to make progress on this work: to ensure accuracy and consistency, we are building the technical capability for periodic updates in one place to be automatically reflected in the other, while ensuring that the final product complies with regulatory requirements.
Next steps: We will share progress in a future Quarterly Update.
Meta should let users reporting content that may violate the Privacy Violations policy provide additional context about their claim. The Board will consider this implemented when Meta publishes information about its appeal processes that demonstrates users may provide this context in appeals.
Our commitment: By the end of the year, we will add to our product planning process a feasibility assessment of giving people the opportunity to provide additional context when reporting content.
Considerations: As outlined in our response to recommendation #6, above, we are testing changes to how people can report potential privacy violations. The results will help us understand how to best increase the efficiency and accuracy of potential violation reports. In the future, additional experiments may include the ability for people to provide more context to support their report as the board recommends. At the moment, however, this functionality is not included in the test.
There are operational challenges associated with increasing the amount of information our teams review when receiving feedback from people, which we discuss in the context of appeals in our response to recommendation #2 in the Depicting Indigenous Artwork and Discussing Residential Schools case (2021-012-FB-UA). We will assess whether the additional context improves the accuracy of review and the quality of the user experience. We will also consider the extent to which additional context may slow down review time, limiting the number of appeals we can review at scale.
We plan to assess the board’s recommendation during our regular process for assessing new review tools. The next opportunity to introduce the recommendation into this roadmap will occur later in 2022.
Next steps: In May, we will begin planning product work for the second half of 2022 and will incorporate this recommendation into our considerations. We will provide information about our assessment of this recommendation in a future Quarterly Update.
Meta should create a specific channel of communications for victims of doxing (available both for users and non-users). Additionally, Meta could provide financial support to organizations that already have hotlines in place. Meta should prioritize action when the impacted person references belonging to a group facing heightened risk to safety in the region where the private residence is located. The Board will consider this implemented when Meta creates the channel and publicly announces how to use it.
Our commitment: We are actively building new channels for users to get support. We are in the early stages of these efforts, and at this time we cannot commit to adding specific communication channels, such as one for doxing.
Considerations: We recognize that doxing is a breach of the right to privacy as defined in Article 17 of the ICCPR. We work with external experts, including a Safety Advisory Board, and gather feedback from our community to develop policies, tools and resources that promote a welcoming and safe environment for everyone. Responding to reports of doxing remains critical to us, and we will continue to do so through our existing reporting systems.
We are actively developing new customer support channels, which is a significant undertaking because we need to plan for long-term support, anticipate the ways in which issues may evolve, and protect against ways a channel might be abused. We expect that the development of these new systems will enable us to build new support channels tailored to specific issues more quickly in the future. But, because of these considerations, we cannot commit to building a doxing-specific channel at this time.
However, in addition to our existing reporting channels, Meta works with over 850 safety partners globally, including helplines and organizations that work with victims of harassment, online and offline, like the National Network to End Domestic Violence (US), Revenge Porn Helpline (UK), Digital Rights Foundation (Pakistan) and Center for Social Research (India). We sometimes provide financial, campaign and/or tech support for these partners’ projects. For example, we recently partnered with Revenge Porn Helpline to launch StopNCII.org, an emergency support service for victims of non-consensual intimate image abuse, along with over 50 global helplines and victim support organizations. We also support the work of Tech Matters (US) in building and expanding their open source technology platform for helplines to make support services more accessible to people in crisis.
Next steps: We’ll continue to share updates on our Privacy Matters page about our company-wide efforts to protect people’s privacy. We will have no further updates specific to this recommendation.
Meta should consider the violation of its Privacy Violations policy as “severe,” prompting temporary account suspension, in cases where the sharing of private residential information is clearly related to malicious action that created a risk of violence or harassment. The Board will consider this implemented when Meta updates its Transparency Center description of the strikes system to make clear that some Privacy Violations are severe and may result in account suspension.
Our commitment: We are exploring ways to incorporate elements of this recommendation into how we enforce violations of our Privacy Violations policy.
Considerations: In most cases, when someone violates the Community Standards, we apply a strike to the account. Whether we apply a strike depends on the severity of the content, the context in which it was shared and when it was posted. For some severe policy violations and repeated violations, we may apply additional restrictions. To assess the board’s recommendation, we will first need to determine how to identify when sharing private residential information “is clearly related to malicious action that created a risk of violence or harassment” and how we can log this type of information in our system.
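As a rough sketch of how severity-based enforcement along these lines could look (the malicious-context flag, thresholds and penalty labels are hypothetical placeholders, not our actual strike system):

```python
from dataclasses import dataclass

@dataclass
class Violation:
    policy: str
    malicious_context: bool  # hypothetical flag: shared to incite violence or harassment

def apply_enforcement(prior_strikes: int, violation: Violation) -> tuple[int, str]:
    """Apply a strike and pick a penalty; values here are illustrative only."""
    strikes = prior_strikes + 1
    if violation.policy == "Privacy Violations" and violation.malicious_context:
        # Treat as severe, in the spirit of the board's recommendation.
        return strikes, "temporary_account_suspension"
    if strikes >= 5:  # hypothetical repeat-violation threshold
        return strikes, "feature_restriction"
    return strikes, "content_removal_only"
```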
Next steps: We will continue to scope ways to implement these changes and provide an update in a future Quarterly Update.
Meta should give users an opportunity to remove or edit private information within their content following a removal for violation of the Privacy Violations policy. The Board will consider this implemented when Meta publishes information about its enforcement processes that demonstrates users are notified of specific policy violations when content is removed and granted a remedial window before the content is permanently deleted.
Our commitment: We will continue to develop tools that provide people with greater control over their experiences on Facebook and Instagram, including opportunities to avoid inadvertent Community Standards violations. As part of this ongoing work, we will conduct research on the value of tools that would allow people to edit a violating post after Meta removes it.
Considerations: To provide people with better information, we are working on an early warning system to notify people when content they intend to post may violate our policies. We believe that this system will achieve the spirit of this recommendation by providing people the opportunity to edit or not post potentially violating content.
In addition to notifying people about potential violations before they post, we will research the value of tools that let people edit a violating post after Meta removes it in order to avoid a penalty against their account. This experiment will explore giving people the opportunity to edit or remove the parts of a post that violated our Community Standards.
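A simplified sketch of such a pre-post warning flow (the stand-in classifier and outcome labels are hypothetical):

```python
RISKY_TERMS = ["home address"]  # stand-in for a real policy classifier

def may_violate(draft: str) -> bool:
    """Placeholder check for potentially violating content."""
    return any(term in draft.lower() for term in RISKY_TERMS)

def compose_flow(draft: str, edit_if_warned: bool = True) -> str:
    """Warn before posting; the person can revise the draft or post anyway."""
    if may_violate(draft) and edit_if_warned:
        return "returned_to_editor"  # person chose to edit after the warning
    return "published"
```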
Next steps: In May, we will begin planning product work for the second half of 2022 and will incorporate this recommendation into our considerations. We will provide an update on this work in a future Quarterly Update.
Meta should let users indicate in their appeals against content removal that their content falls into one of the exceptions to the Privacy Violations policy. The Board will consider this implemented when Meta publishes information about its appeal processes that demonstrates users may provide this information in appeals.
Our commitment: We’re continuously working to improve our appeal process, both for the benefit of user experience and for accuracy of enforcement. We will explore ways to allow people to provide additional context with their appeal, including the ability to identify a specific policy exception in the Community Standards that someone believes applies to their content.
Considerations: As we describe in our response to recommendation #10, there are considerable challenges to increasing the amount of information we receive with feedback from people, such as reports and appeals. We continue to plan potential product experiments in this space, and will track ongoing work as part of our updates to recommendation #4 in the Armenian people and the Armenian Genocide case (2021-005-FB-UA).
Next steps: In May, we will begin planning product work for the second half of 2022 and will incorporate this recommendation into our considerations. We will provide additional information in a future Quarterly Update.
Meta should publish quantitative data on the enforcement of the Privacy Violations policy in the company’s Community Standards Enforcement Report. The Board will consider this implemented when Meta’s transparency report includes Privacy Violations enforcement data.
Our commitment: Consistent with our previous commitments to similar board recommendations, we will continue to publish more data about our enforcement actions in the Community Standards Enforcement Report (CSER). However, given existing priorities, there is currently no plan to add Privacy Violations policy metrics to the report.
Considerations: We’re continuing to expand and identify the types of data to include in the CSER, including presenting prevalence data by location, automation versus manual enforcement and error rates. These areas follow from our commitments in recommendation #3 in the Punjabi Concern Over the RSS in India case (2021-003-FB-UA) and recommendation #6 in the Brazil Breast Cancer Symptoms and Nudity case (2020-004-IG-UA). We’re also adding new policy areas to the report. For example, Violence and Incitement metrics were added to the CSER in Q3 2021.
In determining which policy areas to add to the CSER, we consider factors like the quality of the data available in a given policy area and external stakeholder feedback on the most pressing policy challenges. Based on these factors, and given our existing commitments to expand the CSER as outlined above, Privacy Violations policy metrics will not be added to the report in the foreseeable future.
Next steps: We will have no further updates on this recommendation.
Meta should break down data in its transparency reports to indicate the amount of content removed following privacy-related government requests, even if taken down under the Privacy Violations policy and not under local privacy laws. The Board will consider this implemented when Meta’s transparency reporting includes all government requests that result in content removal for violating the Privacy Violations policy as a separate category.
Our commitment: In line with our previous commitments in other cases, we will publish more information about government requests for content removals. However, we do not plan to present this information by specific Community Standards policy areas.
Considerations: As we described in our Q4 2021 Quarterly Update on the Oversight Board and in our response to recommendation #11 in the Support of Abdullah Öcalan, founder of the PKK case (2021-006-IG-UA), we’re continuing to build out our internal systems to create the data needed for additional reporting on government requests for content removals, and will provide an update on our timeline in a future Quarterly Update. Because we are prioritizing substantial changes to our internal systems to enable this level of reporting, we will not present this data by policy area in the foreseeable future. For additional information about our approach to data transparency in specific policy areas, please see our response to recommendation #15.
Next steps: We will have no further updates on this recommendation.
Meta should provide users with more detail on the specific policy of the Privacy Violations Community Standard that their content was found to violate and implement it across all working languages of the company’s platforms. The Board will consider this implemented when Meta publishes information and data about user notifications.
Our commitment: We will continue to identify and implement the best ways of providing clear, specific messaging to people when we enforce our policies, in line with our previous commitments to similar board recommendations.
Considerations: Improving the effectiveness of our messaging to people on our technologies is one of the central goals of our integrity efforts. In line with our continued work in response to recommendation #1 in the Armenians in Azerbaijan case (2020-003-FB-UA), we have teams dedicated to improving experiences for people when we remove their content and continue to update our notifications to tell people why we removed a post.
As previously described, when a content reviewer determines that a post violates one of our policies, they often indicate the type of violation, but not always the specific policy line. Additionally, when we build technology to take automated action, it is often based on the policy area rather than an individual line in the policy because our automated systems are the most accurate at the policy level. Incorrect, yet specific messaging could create worse experiences than correct, broader messaging. We understand the benefit in additional detail and will continue to explore how best to provide additional information to people when we remove content that violates our Community Standards.
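The trade-off can be illustrated with a small sketch (the message strings and fields are hypothetical, not our actual notifications):

```python
def notification_message(policy_area: str, policy_line: str | None = None) -> str:
    """Prefer the specific policy line when review recorded one; otherwise fall
    back to correct, broader policy-area messaging."""
    if policy_line:
        return f"Your post was removed for violating our {policy_area} policy: {policy_line}."
    return f"Your post was removed for violating our {policy_area} policy."

# Automated enforcement typically supplies only the policy area, so the
# broader (but accurate) message is shown in those cases.
```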
We’ve kept this commitment as “Implementing in Part” because we’re building and testing these messages to see what potentially improves a person’s experience, but we don’t know yet if we’ll launch these changes to all policy lines of the Community Standards.
Next steps: We will track progress on this recommendation under our continued work following recommendation #1 in the Armenians in Azerbaijan case (2020-003-FB-UA). Because this work is gradual and often does not result in immediate, observable product changes, we provided the board an in-depth briefing on our ongoing implementation of these product recommendations in March 2022 and will continue to update the board on our progress.