As part of our commitment to help create an environment where people can express themselves freely and safely, and following a recommendation from the Oversight Board in September 2021, we asked Business for Social Responsibility (BSR) — an independent organization with expertise in human rights — to conduct a due diligence exercise into the impact of our policies and processes in Israel and Palestine during the May 2021 escalation. This included an examination of whether those policies and processes were applied without bias.
BSR identified a number of areas of “good practice” in our response. These included the measures we prioritized to reduce the risk of the platform being used to encourage violence or harm, such as quickly establishing a dedicated Special Operations Center to respond to activity across our apps in real time. This center was staffed with expert teams, including regional specialists and native speakers of Arabic and Hebrew, who worked to remove content that violated our policies and to correct enforcement errors as soon as we became aware of them. BSR also found that our efforts to remove content were proportionate and in line with global human rights standards.
In addition to these areas of good practice, BSR concluded that different viewpoints, nationalities, ethnicities and religions were well represented in the teams working on this at Meta. They found no evidence of intentional bias on any of these grounds among any of these employees, nor any evidence that, in developing or implementing any of our policies, we sought to benefit or harm any particular community.
That said, BSR did raise important concerns. These included under-enforcement of violating content, such as content inciting violence against Israelis and Jews on our platforms, as well as specific instances where they considered our policies and processes had an unintentional adverse impact on Palestinian and Arab communities — primarily on their freedom of expression.
BSR made 21 specific recommendations as a result of its due diligence, covering our policies, how those policies are enforced, and our efforts to provide transparency to our users. We’ve carefully reviewed these recommendations to help us learn where and how we can improve. Our response details our commitment to implementing 10 of the recommendations and partly implementing four, while we assess the feasibility of another six. We will take no further action on the remaining one.