The social media giant Facebook knows it has a problem and thinks it has found a way that could lead to a solution in the longer term: it is hiring a Director of Human Rights Policy. This is only a single step in a long journey, but it does show the company recognises the need for a senior executive in charge of human rights.

But if Facebook, its investors, or its critics believe that this step in itself will make any fundamental change in the way Facebook operates, they are in for a major disappointment. For what Facebook needs is not someone brought in with a magic formula to transform the company into a human rights champion. It has to reassess what it is, how it does business, and how it values those who use it, given the money it makes by trading on their hopes, desires, and profiles.

What Facebook needs is not someone brought in with a magic formula to transform the company into a human rights champion.

The problems Facebook faces can overwhelm any corporation.

As I’ve reflected on in previous columns, its lax controls helped the Myanmar army perpetrate genocide against the Rohingya community, according to the UN and other experts, creating one of the worst refugee crises of all time. Its algorithms have been accused of promoting hate speech. Its network effect has been exploited by foreign entities attempting to influence elections in several democracies, including the United States and the United Kingdom. Its self-monitoring policies have failed to stop the bullying of women on its platform. It often acts too quickly, snuffing out voices that speak truth to power and challenge political and religious orthodoxies. And there are massive privacy implications.

This is only a partial list of the issues in the in-tray of the incoming Director of Human Rights Policy. It is an intimidating list, and not one that any individual can tackle on her (or his) own.

 

An Overflowing In-Tray

Developing a human rights policy that recognises and respects the rights of its users is the starting point. Having a team of individuals who can weigh competing rights – a woman blogger’s right to safety versus her critic’s right to free expression – is the next step. Appointing teams of people who apply common international standards to ensure free expression is the one after that.

Facebook is also going to need people, even if assisted by machines, to identify those who are gaming or manipulating the system to secure unfair advantages. Facebook sees itself as a platform where people meet, but it owns that space, and it sets the rules by which people operate within it. It has to enforce those rules more vigorously, bearing in mind human rights impacts. Those who manage markets ensure that no player cheats; Facebook has to apply similar rigour in dealing with those offering products, services, and ideas.

Facebook would need accountable executives who are experts themselves, who are credible to other experts, and who can inspire the organisation.

To do all this, Facebook will need help from outside – from human rights experts, lawyers, human rights defenders, academics, journalists, activists, privacy rights specialists, experts on culture, and those who understand the vulnerability of marginalised voices. This cannot be only a small committee of individuals (or institutions) with an advisory role. Facebook would also need to significantly enhance capacity within the organisation. It would need accountable executives who are experts themselves, who are credible to other experts, and who can inspire the organisation – vast as it is – to rally round a new mission: doing business as if the people who use its services, or are affected by them, matter.

 

Job Number One is a Culture Shift

This requires a real cultural change – and a rethink of the kind of business Facebook is. It means every Facebook employee has to consider the human rights consequences of his or her actions. That will take time, and it won’t be easy. Facebook will have to develop new systems of incentives, so that staff are not rewarded only for achieving financial or other quantitative targets, and so that those who turn down opportunities that could be financially lucrative but would have adverse human rights impacts are not penalised.

Every Facebook employee has to consider the human rights consequences of his or her actions.

Engineers in some extractive-industry companies know that acquiring access to land takes time, and that they can no longer rely on security forces to evict communities so that a project can be completed on schedule. Pharmaceutical companies have realised that testing new drugs on people requires the informed consent of participants, and that too takes time. Facebook will likewise have to set aside time to listen to the communities and individuals likely to be affected, and design products and services with their likely impacts in mind.

Facebook’s real estate is the platform where it aims to make money from the transactions users undertake. As stated earlier, it can – and does – set rules. Its challenge is to make sure those rules are consistent with international human rights standards. That means protecting the privacy of individual users and groups in environments where they may be at risk (sexual minorities, non-violent dissidents, or trade unionists, for example). Facebook can offer such groups safe space on its platform, and protect their anonymity.

Facebook employees need to understand this each time they sign a contract with an advertiser, each time they permit a group to come into being, each time they identify a user against whom there is an organised attack, each time a government or a powerful group makes a take-down request, each time there is an unusual spike in traffic. 

 

Playing Catch Up

To be sure, Facebook already does much of this: it is part of the Global Network Initiative and has its own discrete policies to address credible violence and hate speech. But those broad principles on freedom of expression, privacy, responsible decision-making, multi-stakeholder collaboration, and governance, accountability, and transparency need to be translated into detailed guidance on how Facebook can implement them meaningfully.

The new Human Rights Policy Director will need to look for early warning signs and red flags that demand attention.

Since Facebook knows the problems it must avoid, the new Human Rights Policy Director will need to reverse-engineer the process: look for the early warning signs and red flags that demand attention, and develop responses and policies to ensure that such incidents do not recur.

It won’t be easy. Facebook began as a fun activity for students in a campus dorm. Like many companies, it has lacked the expertise, capacity, authority, mandate, and skills to act in ways that respect human rights. It has been in the real world for a long time now, but this position could mark the start of the company playing catch-up.

 

 

Photo: Unsplash/@mrthetrain
