Since UN Secretary-General Ban Ki-moon called for a global effort to bring about a data revolution in sustainable development in 2014, it’s become increasingly clear that our personal data is not something abstract stored in cyberspace. On the contrary, current events have shown more than ever the extent to which our data reflect who we are and the ways data can be used both for and against us. Data holds power to improve lives, but it can also be turned against people. This is especially true for activists, members of minority communities, and persecuted people from diverse backgrounds.

Today, while data for development remains high on the international agenda, concerns about governance and power dynamics are rising around a range of issues, from tech monopolization, vendor lock-in, and exploitative business models to regulatory gaps and the unequal distribution of the benefits of the data revolution. Yet connections between these issues and sustainable development remain vague, confusing, and poorly understood. In the face of increasing evidence of the results of malicious data practices, it’s time for an honest conversation about the role of development practitioners in the complex web of global data governance. Do our actions always empower people and reduce inequities, or do they also contribute to them?

Event Summary

The Data Values Project aims to uncover and foster discussion on these issues and to shed light on data governance challenges related to sustainable development. Dissecting digital inequities and addressing data governance require a global approach. In this respect, bringing together digital rights activists and non-governmental/humanitarian sector practitioners is important to break down sector barriers and create a broad and much more powerful conversation about potential solutions. 

This online conversation—Dissecting Digital Power Inequities: reflections from digital rights experts for development practitioners—aimed to: 

  • Introduce members of the data for development community to ongoing work by digital rights activists.
  • Highlight current issues related to digital rights, data governance, and power inequity in the context of the data revolution for sustainable development.
  • Foster discussion and mutual learning among digital rights activists and data for development practitioners.

The following activists and practitioners participated in this event: 

  • Renata Avila, Human Rights Lawyer and Activist, Guatemala
  • Tom Orrell, Managing Director, DataReady, UK (Moderator)
  • Linda Raftree, Independent Consultant, USA 
  • ‘Gbenga Sesan, Founder, Paradigm Initiative, Nigeria
  • Dr. Linnet Taylor, Associate Professor, Tilburg Law School, The Netherlands

The following recommendations and themes emerged during the event as key for development practitioners to consider and use to shape conversations around digital inequities and data governance:

Recommendations 

  1. Acknowledge harmful business models to rethink data practices in development. Current data business models and practices—even in the context of development—can exacerbate the digital divide, trigger profound inequities, and produce new challenges to the goals of sustainable development. The development community must rethink its data practices and approaches to avoid reproducing harmful models and further disempowering marginalized communities. 
  2. Center conversations about data justice. The principles of data justice should guide discussions around individual and community privacy rights, data sovereignty, and access to key infrastructure. Dialogue is important but not sufficient: Regulations—especially at the national level—are important to protect people and lay the groundwork for a fairer data economy. Practitioners should incorporate principles of ‘data justice’ into the context of data for sustainable development, including issues of inclusiveness, data disaggregation, and more.
  3. Support fair data economies and responsible governance. Government actions impact public trust in the data economy. Governments strengthen trust by establishing rules and digital rights protections that give citizens control of their personal data and datasets that touch upon their lives. Governments erode trust when they take negative or repressive stances toward digital spaces, reduce individuals’ digital rights or limit freedom of expression online, as is currently happening in a number of countries around the world. The development community should monitor and highlight government abuses while sustaining country-level efforts to establish fair, rules-based data economies. 
  4. Put people at the heart of data design. Designing data systems for people and empowering citizens to control their personal data are two essential steps to addressing data inequities. People need access to the design stage of data collection, analysis, dissemination, and use. And they need the skills, knowledge, and agency to determine how their personal data is used. This requires investing in research, alternative models, and creative experimentation to empower historically excluded communities to develop and implement local solutions for their digital future.
  5. Level up technical capacity and data practices in development. The development and humanitarian sectors must step up their game when it comes to data practices. It’s time for data awareness to be mainstreamed within these sectors. Donors also play an important role: They should take a hard look at their own policies and practices around data and digitalization to assess the roles they play in structuring incentives. Donors should take the lead in discussing best practices, sharing lessons learned, and rewarding good data approaches.

Key themes

Understanding the digital divide

The digital divide encompasses two primary groups: people who are not (yet) connected to digital technologies or infrastructures and those who are intentionally disconnected by governments. Both groups include vulnerable people. There’s already strong global consensus and support for bridging the digital divide for those who lack access to infrastructure, but working with those who are intentionally disconnected is also crucial. One recent example of intentional disconnection is the ban on Twitter in Nigeria. Critical voices are integral to democratic societies but are often perceived as a threat. Governments tend to block access to digital spaces that are among the few remaining places for public discourse. The consequences can be tragic: muting such conversations creates a “black market for agitation,” as ‘Gbenga Sesan described it, and decreases public trust in both public and private-sector institutions.

Trust is essential to the data economy. While people can simply stop using the services of a private-sector provider they do not trust with their data, the same is not true for government-held data. When governments lose citizens’ trust, for example during a pandemic, it opens the door to disinformation and a larger breakdown in trust in government sources and operations.

Acknowledging power imbalances and power dynamics 

There's an ongoing clash between international human rights and the prevailing business models of Big Tech, which rely on extractive and non-competitive practices. These two frameworks overlap in development and are incompatible in their current forms. This scenario demonstrates the contradictions inherent in today’s data economy. The technological infrastructure we rely on is outside the control of countries and in the hands of a few private players who cannot be challenged or circumvented and who—despite their global reach—are subject to the national security laws of a select few countries. “If you offend one of these tech superpowers, you can suffer very serious consequences and be disconnected from the digital world,” ‘Gbenga Sesan explained. The current technological infrastructure reflects conceptions of surveillance capitalism and is inherently political.

Development practitioners need to expose these incongruences and experiment with new models to address these power imbalances. As Renata Avila said, “the key to unlock[ing] the power of innovation is allocating enough resources to pilot new approaches, especially within communities that do not receive resources for things like this very often. We give millions to startups to play and fail with very little accountability.” Yet the development sector invests little in innovative data practices while maintaining high expectations. We in development must prove we can create models that are different from the private sector’s and break our dependency on Big Tech.

Unpacking the concept of data justice

Visibility, autonomy, and upholding individual rights should be the starting point for conversations about data justice. Dr. Linnet Taylor has written extensively on data justice in her work, which focuses on the connections between surveillance and development. She identified these three principles that help frame data justice: 

  1. Visibility. Individuals and vulnerable communities should have the right to choose whether and how they are visible to the world through their data. Data justice includes issues of not only privacy but also infrastructure and data sovereignty. 
  2. Autonomy. No one should be required to use technology that is not beneficial to them. And, likewise, people should have access to beneficial technologies. 
  3. Responsibility for upholding data rights. Data justice calls for an approach to responsibility and accountability which is not individual but governmental. The current system, which places the burden of enforcing rights in the hands of individuals, is broken and should be replaced by a system in which data controllers are responsible for upholding rights. This shifts the focus from ‘rights holders’ to ‘duty bearers’ who are responsible for enforcing protections.

Consent is deeply problematic, and principles of protecting personal privacy, such as data minimization or purpose limitation, are rarely applied in the development sector. Non-governmental organizations are often more accountable to donors than to local citizens and apply data practices that would not be tolerated in the country where the NGO is headquartered. Protecting citizens effectively requires not only soft power but also data justice-focused legislation and government intervention.

Protecting the rights of children and marginalized groups

The consent mechanisms we use for adults do not apply to children, as they require levels of awareness about the consequences of data misuse and knowledge of one’s rights that children often do not possess. Data practitioners must explore other mechanisms to protect children’s rights. We must seek to balance children’s rights, including their right to participate in data collection and systems, against the potential benefits and harms of collecting and using their data.

It’s also important to consider the unintended negative effects of collecting and using data from people who are vulnerable or face greater risk of harm. A recent case in which UNHCR (inadvertently) shared data identifying Rohingya refugees with the Myanmar government demonstrates how people can be harmed despite the good intentions of the international community. This happens when we ignore people’s concerns and rights. It is vital to separate consent for a single use of someone’s data from consent for ongoing or multiple uses. We must return control of personal data to people from vulnerable communities, who are the true owners of their data and at the highest risk of personal harm.

Strengthening the roles of practitioners and donors 

It’s time to mainstream technology and digital innovation within humanitarian and development circles, as happened with gender in the 1990s. Organizations must develop their own points of view instead of relegating technology decisions to private sector providers. Tools and approaches offered by large tech companies in the development and humanitarian domains are inadequate when they simply repurpose methods from sectors such as law enforcement and surveillance. Development sector organizations must understand what in-house capacity is needed and choose the right data partnerships. 

Donors should also examine how they skew incentives, the implications of choosing implementing partners, and the negative externalities that result from digitally driven development. Donors should seek to reward good data practices and not shy away from organizations that rightly report data breaches and related concerns.

Enabling control of our personal data and visibility by putting people at the heart of design

People need to be at the center of technology design. People-centered design first involves enabling people to access the design rooms and processes of the applications, systems, and software that affect them, such as public sector digital services for schools, housing, and social welfare. Access to these spaces is concentrated within a handful of countries—mainly the U.S. and China—and among a few large companies. The global power imbalances that deny wider access to technology design need to change. This is critical to restoring trust in the data economy and empowering citizens.

Secondly, we must raise awareness and create space for people to shape how and for what purposes their personal data is used. We need to give people the chance to learn about the data life cycle, the benefits and harms of opting out of providing their personal data, and how data partnerships affect them. This requires investing in education and programs to build knowledge and trust. Along with governance models that put people at their center, awareness and agency can help people exert control over their data. Where this control cannot be exerted directly by citizens (e.g., in the case of children and other vulnerable populations), we must identify and work with legitimate intermediaries that can act on behalf of citizens and legitimize data use.

Finally, we need to allow space to research and experiment with alternative models. This also means investing in research connecting digital rights to innovative and creative solutions. The current level of investment from donors and the international community is insufficient. More should be done to allow communities to learn, play, and experiment with data and to develop solutions that work for them.

The online conversation, Dissecting Digital Power Inequalities, was recorded live on June 22. Tom Orrell, founder and managing director of DataReady, organized and led the discussion. His blog post on organizing this event is available here. Beverly Hatcher-Mbu of Development Gateway and Martina Barbero of the Global Partnership for Sustainable Development Data contributed to this summary.