This week I joined 19 other leaders from civil society, academia and the media around the world as a member of the new Oversight Board for Facebook and Instagram.
This is an independent body that will make binding decisions on whether certain content on Facebook and Instagram should be removed or allowed to stay, taking into consideration freedom of expression and human rights principles.
From the outset, let me stress that the board is not part of Facebook’s operations. We are not Facebook employees. Rather, the board is set up to help improve the governance of social media companies for the benefit of the people. To put it bluntly, without independence, there can be no credible oversight.
As Facebook has grown and now exerts a huge and growing impact on the lives of its global community of 2.4 billion people and beyond, the task of content moderation has never seemed more important.
Facebook’s operation today transcends national borders. Almost no part of the world is untouched, directly or indirectly, by Facebook and Instagram. It has dramatically changed our lives and the way we work, communicate and interact with one another. It has given users the chance to connect and reconnect efficiently and effectively. And it has given voice to the voiceless, enabling people to take part in the decision-making processes and debates that affect their lives.
But as with other communication technologies, there are inevitable abuses. We have witnessed the proliferation of fake news, hoaxes, hate speech and incitements to violence in social media content. Some of these have directly and indirectly led to human rights violations. While Facebook may not be the source of these postings, it has inadvertently facilitated their spread. It cannot shirk its responsibility.
The Oversight Board does not release Facebook from its responsibility; it is designed to hold the company more accountable by adding a crucial oversight layer on top of Facebook’s existing processes on content moderation.
With multinational companies coming under pressure to ensure business practices that protect and respect people’s human rights, the way the Oversight Board was created and how it operates could be a model for other internet services to use in improving their governance.
The idea for the board was first broached by Facebook founder Mark Zuckerberg in late 2018. The Facebook team has since gone through lengthy and extensive consultations with various stakeholders in creating the board.
Facebook established a US$130 million trust for the Oversight Board, which funds all operations and cannot be revoked. A charter has been drawn up that governs the relationship between the Oversight Board and Facebook, including its independence.
This independence is essential for the board to have any credibility at all and to win the public’s trust. While Facebook helped pick the board’s initial four co-chairs, it limited itself to a supporting role in the selection of the other 16 members. Some of them have been critical of Facebook and would not have joined had independence not been guaranteed.
Diversity among members is essential for the board to reflect, as best as possible, Facebook’s global operation. The 20 members speak over 29 languages and represent various professional, cultural, political and religious backgrounds. Over time, we expect to grow the board to around 40 members.
In spite of this diversity, one thing unites the members: our shared commitment to protecting freedom of expression and human rights anywhere in the world.
When the board begins work later this year, we will hear cases referred to it by Facebook or raised directly by users seeking a ruling on whether content should be removed or allowed to stay. Any decision the board makes on content is binding on Facebook. The board will also issue content policy recommendations for Facebook to consider.
Independence, diversity and transparency will be the guiding principles as the board works to contribute to the strengthening of social media governance.