To reduce the reporting burden on targets of Bullying and Harassment, Meta should allow users to designate connected accounts that can flag, on their behalf, potential Bullying and Harassment violations that require self-reporting.
The Board will consider this recommendation implemented when Meta makes these features available and easily accessible to all users via their account settings.
Commitment Statement: We will evaluate the feasibility of allowing people connected to a user to report, on that user's behalf, content that requires self-reporting, and we will look for opportunities to foster partnerships that expand the ability of designated entities to report potentially violating content, particularly on behalf of youth.
Considerations: Ensuring the safety of users on our platforms is a consistent priority, and we strive to iterate on and improve their ability to report or escalate content such as bullying and harassment. Our Bullying and Harassment policy applies certain protections to everyone, regardless of reporting context. However, for less severe tiers of the policy, we apply different protections to different individuals, such as adult public figures and private individuals. To allow discussion such as banter among friends or neutral commentary, we may require self-reporting because it provides context that helps us understand whether the person reporting the content feels bullied or harassed.
In response to this recommendation, we will assess whether we can leverage existing reporting tools while maintaining self-reporting as a key contextual signal for distinguishing content that an individual considers bullying or harassment from legitimate discussion and speech. Allowing others to report on a person's behalf is technically difficult given the way our review systems function at scale, and it may be subject to abuse, but we will explore options and provide an update on this work in a future report.
Beyond the context of self-reporting, we have also recently taken steps to prioritize certain reports more generally for review under our Community Standards. Earlier this year, following our launch of Instagram teen accounts, we introduced the School Partnership Program for Instagram, which partners directly with schools and teachers to address bullying. Through this program, reports submitted by school partners about content that may violate Instagram's Community Standards are prioritized for review. However, policy areas that require self-reporting will still need a match between the target and the reporter. Additionally, schools receive status updates on their reports and are notified as soon as Instagram takes action on a report. The program is currently open to middle and high schools in the US. As part of our standard process, we allow parents to request the removal of violating content on behalf of children under 13 years old.
We are committed to exploring additional opportunities to provide support in instances where users suddenly become public figures or highly visible on our platforms. This will require collaboration across our Product, Policy, Partnerships, and Operations teams to identify possible avenues for expansion.
We will provide updates on the status of this recommendation in future reports to the Board.