As part of our commitment to help create an environment where people can express themselves freely and safely, and following a September 2021 recommendation from the Oversight Board, we asked Business for Social Responsibility (BSR), an independent organization with expertise in human rights, to conduct a due diligence exercise into the impact of our policies and processes in Israel and Palestine during the May 2021 escalation. This included an examination of whether those policies and processes were applied without bias.
BSR identified a number of areas of “good practice” in our response. These included our efforts to prioritize measures to reduce the risk of the platform being used to encourage violence or harm, including quickly establishing a dedicated Special Operations Center to respond to activity across our apps in real time. This center was staffed with expert teams, including regional experts and native speakers of Arabic and Hebrew, who worked to remove content that violated our policies, while also making sure we addressed enforcement errors as soon as we became aware of them. BSR also found that our efforts to remove content were proportionate and in line with global human rights standards.
As well as these areas of good practice, BSR concluded that different viewpoints, nationalities, ethnicities and religions were well represented in the teams working on this at Meta. They found no evidence of intentional bias on any of these grounds among any of these employees. They also found no evidence that in developing or implementing any of our policies we sought to benefit or harm any particular community.
That said, BSR did raise important concerns about under-enforcement of violating content, including content inciting violence against Israelis and Jews on our platforms, as well as specific instances where it considered our policies and processes had an unintentional impact on Palestinian and Arab communities, primarily on their freedom of expression.
BSR made 21 specific recommendations as a result of its due diligence, covering our policies, how those policies are enforced, and our efforts to provide transparency to our users. We’ve carefully reviewed these recommendations to learn where and how we can improve. Our response details our commitment to implementing 10 of the recommendations, partly implementing 4, and assessing the feasibility of another 6. We will take no further action on the remaining one.
Meta commissioned Business for Social Responsibility (BSR) to independently assess our plans to expand end-to-end encryption by default to Messenger and Instagram DMs.
The assessment found that expanding end-to-end encryption enables the realization of a diverse range of human rights, and it recommended a range of integrity and safety measures to address unintended adverse human rights impacts.
The report includes 45 recommendations. Our response details our commitment to implementing 34 of them, partly implementing 4, assessing the feasibility of 6, and taking no further action on 1.
In 2019, Meta commissioned Business for Social Responsibility (BSR) to conduct an independent human rights review to inform the development of governance and operational structures for the Oversight Board.
BSR’s review of this novel remedy mechanism focused on identifying salient human rights impacts and made 34 recommendations to both Meta and the Board across 7 areas: harms and impacts, vulnerable groups, remedy, decision-making, informed consent, safety and integrity, and transparency. This review substantially informed our approach to enshrining human rights principles in the Oversight Board Charter and Bylaws.
A follow-up review and gap analysis by BSR, commissioned by the Oversight Board itself a year later, found good progress on 17 of the original recommendations and partial progress on 9, while 5 were not yet determined and 3 were not yet addressed.
In 2020, Meta commissioned Article One Advisors to conduct an independent HRIA on the role of our services in the Philippines, informed by qualitative and statistically significant quantitative research.
In 2020, Meta commissioned Article One Advisors to conduct an independent human rights impact assessment (HRIA) on the role of our services in the Philippines, informed by qualitative and statistically significant quantitative research.
The HRIA made 40 recommendations, covering mis/disinformation, online harassment, incitement to violence, online sexual exploitation, trafficking, extremist activity and corporate accountability. Meta has committed to implement 24 recommendations, partly implement 7, and assess the feasibility of another 9.
The HRIA identified a range of positive impacts, including the role of the platform in allowing human rights defenders, journalists and disadvantaged groups to exercise their freedom of expression, association and political participation rights; contributions to economic and cultural rights; and its role in improving emergency response. Article One also identified a number of salient risks and adverse impacts, including those associated with hate speech and rumors targeting religious minorities; bullying and harassment of human rights defenders and LGBTQ+ individuals; gender-based attacks; and child sexual exploitation.
In response to the HRIA’s 28 recommendations, Meta took steps to improve corporate-level accountability, evolve our Community Standards, invest in changes to platform architecture, address challenges to platform-level grievance mechanisms, conduct ongoing due diligence, increase transparency and use leverage to address root cause challenges.
The HRIA identified a range of positive impacts related to freedom of expression, association and opinion; economic and cultural rights; political participation; and support for vulnerable communities. Article One also identified a number of salient risks and adverse impacts, including those associated with mis/disinformation, government restrictions on freedom of expression, bullying and harassment, privacy, child sexual exploitation and gender-based attacks.
In response to the HRIA’s 37 recommendations, Meta took steps to improve corporate-level accountability, evolve our Community Standards, invest in changes to platform architecture, address challenges to platform-level grievance mechanisms, conduct ongoing due diligence, increase transparency and use leverage to address root cause challenges.