Companies hold more data than most governments these days, and how they use it can help or harm you. Yet whose job it is to protect your data remains up for debate in most of the world, so by default the responsibility falls to corporations.

This month, Google announced it would automatically delete some user location data, including visits to abortion clinics, domestic violence shelters, and other potentially sensitive locations. This decision came in the wake of the U.S. Supreme Court’s decision to overturn Roe v. Wade, which had guaranteed Americans access to abortion since 1973. Some are glad to see companies like Google stepping up. Others—including the company’s own employees—argue that there’s a lot more Google can and should be doing to protect people’s data rights and privacy. Why are we looking to large corporations to protect digital rights in the first place? As we debate whether this role belongs to corporations or governments, the rightful role of people like you and me is getting lost.

“Americans are treating tech companies like a substitute for effective representative government,” Shira Ovide argued recently in the New York Times. After all, most large tech companies hold more personal data than governments, insights from which can be used to address social welfare and public safety needs. Big tech can act on a scale that rivals the reach of many governments. Meanwhile, disillusioned citizens who lack confidence in governments to meet their needs are looking to corporations to fill the gap.

In Europe, lawmakers aren’t waiting for corporations to regulate themselves. Earlier this month, they approved sweeping tech regulations through the Digital Markets Act (DMA) and the Digital Services Act (DSA), which are intended to curb the dominance of “digital gatekeeper platforms” and require companies to do more to police illegal content online. Both regulations include financial penalties for non-compliance.

What’s the right balance between corporations and governments in protecting digital rights? We should be realistic about the limits of corporate responsibility. Corporations are fundamentally driven by the bottom line and are accountable to shareholders. They are naturally oriented to their customer bases rather than being guided by actions that lead to more equitable outcomes for society. In Europe, where governments are taking proactive steps to regulate corporations, there are also deep concerns about the resources and capacity available for enforcement. 

The distinction between responsibility and accountability is at the core of these debates. Corporations are increasingly expected to take on more responsibility for social causes, including protecting people’s privacy and digital rights. And, they’re responding because it’s what a significant proportion of customers want. This is a form of self-regulation in which companies themselves define the actions or measures they will put in place. While we should absolutely be pushing the limits of corporate responsibility, it’s not enough. Corporations can’t be expected to consistently self-regulate when doing so conflicts with their commercial interests.

Unlike responsibility, accountability is a two-way street. It requires checks and balances, watchdogs, and oversight mechanisms. It requires deliberation and feedback on the rules, regulations, and actions taken to protect people. The regulation and enforcement approach in Europe is pushing corporations from responsible to accountable practices.

One important finding from the Data Values Project’s year of public consultations was that people want to see all powerful actors—whether corporations, governments, or international organizations—held accountable when it comes to data, particularly where marginalized people and communities are concerned. The Data Values white paper, published today, argues that involving people in decisions about their data can strengthen accountability in data governance. Through increased participation, individuals, communities, or their representatives can weigh in on decisions about how much data should be collected, how their data will be used, and whether it can be sold or shared and under what circumstances. Formal laws and regulations are essential, but they rely on enforcement and penalties after harm takes place. Fostering public participation ensures that public and private institutions are accountable to people as data-related decisions are made.

Rather than just arguing about whether governments or corporations should do more to protect people’s privacy and data rights, we should be working on how to consistently bring people into the discussion by advocating for public participation in data-related decisions. 

The challenge is how to do this at scale. Many methods of citizen engagement and participation already exist, but they are often slow and onerous. Reimagining Data and Power argues for building informal participatory mechanisms that supplement formal laws, policies, and institutions and create more checks and balances to increase accountability. This requires creativity, working with representative organizations, and establishing committees or groups that can aggregate communities’ perspectives and represent them authentically. (For more on participatory mechanisms, check out the examples in this webinar on multi-stakeholder data governance.)

Asking companies to engage affected communities should also be part of this. They’ve perfected the art of market research, pioneering market surveys, focus groups, and user experience research. With those same tools, many large companies should be able to engage their consumer base on questions around data collection, sharing, and use for new products and services. Mastercard’s Data Responsibility Principles, including the need to give individuals the ability to control the use of their data, signal a step in this direction.

The more that companies do this through iterative engagement with their customers, the more that the two-way pressure of accountability will help burnish their reputation and limit the potential for harm and scandals. It would be naive to expect corporations to fully take on the equity challenges of society, but pressure can be exerted, consumers can be even more demanding, and there’s more big corporations can do.

In the data for good space, we’ve firmly adopted the language of responsible data production, sharing, and use. Perhaps it’s time to shift from responsibility to data accountability.