This week, a former employee testified before U.S. legislators that Facebook knew its apps and policies caused harm to users and consistently chose profits over public wellbeing. This is the latest in a string of events that cast the company in a negative light. From data misuse to misinformation, these complex issues have often ignited outrage and indignation without leading to consensus on the core problems or viable pathways for reform.
Full disclosure: Facebook has been a registered partner of the Global Partnership for Sustainable Development Data, which houses the Data Values Project (and where these authors are employed), since 2015. Facebook works with others in the data for development space, including the OECD and the World Bank, and has had a presence at UNGA. As individual practitioners and members of this network, we need to openly consider the ethical implications of our partners’ actions and reflect on how to be catalysts for change.
To start, here are some fundamental tensions that have hindered change and difficult questions that must be addressed to make progress:
Facebook’s business model is that of a private enterprise, but users perceive it as a public good. Facebook, Inc. is a publicly traded technology company that earned $84 billion last year, primarily from advertising. Yet advertisers, Facebook’s customers in the traditional sense, are vastly outnumbered by the company’s 2.8 billion monthly active users, most of whom live outside the U.S. and Europe. To these users, Facebook’s family of apps provides free (i.e., non-excludable) and unlimited (i.e., non-rivalrous) services, what economists call “public goods”: things like public parks or clean air that are usually provided or protected by governments.
This model creates a trade-off between maximizing profits through expanded user engagement and minimizing social harms. Its sheer number of users makes Facebook attractive to advertisers and enables the company to generate huge revenues. To increase engagement, the company must reach as far as possible into all aspects of people’s lives to make itself essential. Facebook’s leadership has consistently chosen to maximize profits, as evidenced by the company’s internal research showing that Instagram harms vulnerable teenagers. Given the wide reach and social impacts of these platforms, is classifying Facebook as part of the private sector adequate? What is a private company’s responsibility to the public in this situation? How should governments define what to regulate? And who can hold such an enterprise accountable?
Facebook also occupies the space of a public utility, providing critical infrastructure in many countries. In many low-resource settings, Facebook is the route to connectivity. The company quietly supplies internet access to between 100 million and 300 million people (for more on this, see this blog post from MIT’s Global Media Technologies and Cultures Lab). Facebook Connectivity and other initiatives provide physical and digital infrastructure, hardware, and software across the world. Timely data on these initiatives is not publicly available, particularly since the backlash to Free Basics in India.
Connectivity illustrates the tensions and trade-offs between profits and public benefits that run through all of Facebook’s activities. In this case, Facebook works to increase internet access in remote areas to expand its local presence, gain more users, and increase revenue. Yet this also meets an urgent need as nearly half the world lacks digital connectivity—a problem that the UN Secretary General has laid out as a global responsibility. The central question for governments is how to guarantee universal access at low cost. This is not an insoluble problem, but it’s one that too few governments are managing well.
The twentieth century offers plenty of examples of companies that prioritized profit over people’s wellbeing until citizens advocated for change and governments stepped in to regulate. Facebook and other large tech companies have clearly reached this stage. But should responsibility for regulating Big Tech lie solely with governments? As a community, we can no longer delay responding to issues that have come up again and again. Networks like the Global Partnership and initiatives like the Data Values Project are essential to finding common ground amid conflicting interests and charting a pathway toward change.