[Image: Illustration of a human head made up of swatches of different bold colors.]

Digital technologies can be a force for good, but too often they emerge from sources reflecting inequality and may amplify existing racial bias and discrimination.

“Each of us has a part to play in building the world we want and envisioning ways forward,” said Ruha Benjamin, a Princeton University professor and the keynote speaker for Stanford’s first Technology and Racial Equity Conference, held May 20-21, 2021, and sponsored by the Stanford Center for Comparative Studies in Race & Ethnicity (CCSRE), the Digital Civil Society Lab (DCSL) at the Stanford Center on Philanthropy and Civil Society, the Stanford Institute for Human-Centered Artificial Intelligence (HAI), and the Stanford Program in African & African American Studies.

“We can’t leave technology development and monitoring merely to those who have the technical know-how,” Benjamin continued. “The experiences and insights of the marginalized matter.”

Scholars, policymakers, technologists, and civil rights activists, including leaders from USAID, the Stop LAPD Spying Coalition, and the Harvard Kennedy School, spoke on topics ranging from blockchain and policing technology to digital agrifood, all through the lens of racial equity. Here are their most compelling takeaways.

For more, watch the full two-day conference here.

Barcelona: A People-Centric Smart City

Technology is a critical part of creating people-focused, socially just cities, and Spain’s Barcelona is pursuing an ambitious vision.

“It’s about building smart cities that think about people first,” said Francesca Bria, president of the Italian National Innovation Fund and former CTO of Barcelona. That includes asking questions like, “Smart for whom?”

Barcelona, Bria noted, focused first on people rather than technology and data and sought to increase citizen participation in city-related decision making, including through Decidim Barcelona, a free, open-source digital platform that promotes citizen initiatives, a more transparent, participatory municipal budgeting process, and other efforts.

But in Barcelona, 10,000 households live in “digital poverty,” as noted by Laia Bonet, deputy mayor of Barcelona; too many citizens have no access to the internet and other digital resources.

To address the digital divide, city leaders are assessing whether providing access and devices is sufficient, or if some families need further support such as digital training. “Digital inclusion is our first pillar,” Bonet said.

City leaders are also focusing on who develops technology and makes related decisions—a challenge when women represent only 26 percent of technology-sector jobs, as Bonet noted. The city is actively training women to work as full-stack developers in partnership with businesses. Leaders are also working to apply AI technology to activities like the city council process, with the goal of democratic monitoring and full transparency for the technology.

Blockchain and Racial Equity

There is increasing discussion around blockchain technologies and racial equity. Speakers called out potential challenges of using blockchain to verify identity, for example, and the inherent inequality reflected in development and use of cryptocurrencies like bitcoin.

Blockchain, despite its decentralized approach to transactions, is still based on a history of accounting technologies that were essential in modes of capitalism that depended on the fixing of people and place for transactional value, said Bill Maurer, dean and professor of anthropology at the University of California, Irvine.

“Built into the code, built into the system are these ideas of neoclassical economy theory and a foreclosure of other kinds of economic possibility, like cooperation or common pool resources, or some other kind of system of resource governance that would allow different modes of participation,” he said. We should see these things as part of processes of racial capitalism, he noted. “It doesn’t mean the legacy sets a path dependency, but there may be a constraint on the technological innovation by relying on systems that derive from very particular economy theories.”

Elizabeth Renieris, Human Rights Fellow at the Harvard Kennedy School and a CCSRE/DCSL practitioner fellow, posed a central question: For whom is blockchain extracting value? “The core developers of these technologies are all white men. We need a more inclusive view of society and the future, rather than further concentration of power.”

The argument extends to blockchain-enabled cryptocurrencies like bitcoin. “It’s still mostly white men moving markets,” said Kortney Ziegler, founder of Green Kandle Academy and other organizations and a CCSRE/DCSL practitioner fellow. “Cryptocurrencies become valuable through the power of networks, access, and wealth, which can hurt smaller retail traders. Buying bitcoin doesn’t erase inequality.”

Still, Ziegler sees a growing role for decentralized global finance in promoting equity, as when young Nigerians used bitcoin to fund social justice protests in 2020.

Digital Emancipation in Latin America

How can we shift the balance of ownership of technology from giant corporations to smaller communities, including indigenous populations?

Peter Bloom, founder of Rhizomatica, is pursuing answers to that question, collaborating with Latin American indigenous communities to help them develop and harness technology better. “We want to make technology culture strengthening,” he said.

For example, he’s working with subsistence-farming communities in Oaxaca, Mexico, to develop their own local access to digital airwaves, bypassing expensive and spotty service from a giant telecom. “We have to question a regulatory framework that allows only large companies to provide connections to people,” Bloom said. His team carries the work out through community assemblies and design groups that consider and select solutions, with lots of iteration.

“I want to help communities define their technological futures instead of just adapting to technology that arrives there,” he continued. “The future is communitarian.”

Racial Justice in Digital Agrifood

The pandemic has highlighted the criticality of food workers, including growers and farmworkers. Panelists spoke about promoting racial justice in the digital agricultural domain.

“The current conventional food system is racist, sexist, classist, and environmentally toxic—by design,” said Samir Doshi, former deputy division chief for USAID's Global Development Lab and a CCSRE/DCSL/HAI practitioner fellow. “It’s a systemic issue.”

He pointed to the negative environmental impacts of big agriculture, including deforestation and water pollution, and to how little land Black farmers own in an increasingly privatized, corporatized food system. Agtech may mean digital innovation, but too much of it promotes inequality. For example, there is evidence corporations have used digital, GPS-based technologies to illegally appropriate land, dispossessing underserved communities of land used for living and farming.

Sarah Rotz, York University assistant professor of environmental and urban change, said, “Small-holder farmers need many things much more than the tech being promoted to them right now, which is often capital intensive and deepens inequality.” She studies systemic bias in agriculture, such as the fact that there is a disproportionate collection of data on corn—a crop purchased by many large corporations—over data that could be used to improve other, less commercial crops.

“Digital technology is enabling a ‘farming as a service’ model,” Rotz said. “Farmers do the labor but don’t own the land, and knowledge is transferred to technology enabling surveillance of laborers—impeding their well-being.”

In this context, Ghana-based Growing Gold Farms seeks to promote “food justice through sustainability,” as described by David Selassie Opoku, Growing Gold’s co-founder and a CCSRE/DCSL practitioner fellow. He argued that the current commercial agricultural system produces high profits for large-scale input-providers and food-processors, leaving the farmers who actually produce the food with little.

“Subsistence farming is vilified,” Selassie Opoku said, “but it is actually more communal than commercial farming and reinvests profits locally.” Ghana-based Cocoa360.org, for example, seeks to promote community ownership of cocoa farms and to use technology to improve yields and other outcomes for a more socially just model.

Police Technology and Global Surveillance

Shakeer Rahman, a lawyer with the Stop LAPD Spying Coalition and a CCSRE/DCSL practitioner fellow, framed discussion of the impacts of police technology on activists and community work using three lenses: complicity, accountability, and authority. “It’s about decentering this technology and thinking about the power and politics surrounding it,” he said.

For example, Jamie Garcia, a registered nurse and an organizer for Stop LAPD Spying, has facilitated campaigns against “data-driven policing”—such as predictions of where crime will happen—by marshaling community-based resources to build a culture of resistance and understand the roles institutions play in furthering or mitigating inequity.

Similarly, J. Khadijah Abdurahman, director of We Be Imagining, a public-interest technology project at Columbia University, described “infrastructures of surveillance” built on digital technology as present in social arenas well beyond policing, such as when caseworkers enter data into tablets during a client home visit. “We need to map out this power and understand the connections among institutions,” Abdurahman said.

Sucheta Ghoshal, a University of Washington assistant professor of human-centered design and engineering, brought a global frame to the challenge of surveillance by discussing India’s large-scale identification system, Aadhaar: “It was presented as supporting a welfare pipeline but ended up being a massive surveillance and security risk used for religious/caste segregation.” She also highlighted that some of the most authentic critiques of technology come from nonexperts, who have accurately predicted oppressive uses of technology in the past.

In closing, Abdurahman drew an important distinction: “Let’s distinguish between technology and computation. I’m not anti-tech, but I’m concerned we just use data to extract insights now, without applying the scientific method.”

Reimagining the Default Settings of Tech and Society

Benjamin, a Princeton University professor of African American studies and the founding director of the Ida B. Wells Just Data Lab, delivered the conference’s keynote speech—the 16th Annual Anne & Loren Kieve Distinguished Lecture.

She shared how an online storytelling session she led in March 2020 became one of the first Zoombombing targets, with the perpetrators sharing racially offensive images. “Zoom and other technologies weighted convenience over security back then,” Benjamin said.

Technology design, Benjamin said, should “take the perspective of those harassed by technology systems—and use communities of color as architects.” In this context, the minimalist white space of design aesthetics—such as retail-app pages—reflects the “white space of design,” influenced by dominant white culture.

Benjamin said such issues raise the broader question of “Who gets to future?” as asked by Jasper Tran O’Leary—how can marginalized groups have more of a say in evolving technologies? “We have to move beyond tokenistic inclusion,” she said. “Past the ‘add-color-and-stir’ approach.”

There has been progress in technology-focused inclusion. Consider the high-profile ad featuring actor Michael B. Jordan as the personification of Amazon’s Alexa. “But while that ad ran, Amazon was trying to crush unionization in a Bessemer warehouse,” Benjamin said. And even robots face racism: research shows people shoot more quickly at robots colored black.

Health care and hiring algorithms, too, continue to be biased against Black individuals. “Imagine,” she said, “a white-collar crime early-warning app that showed where financial crimes were more likely to occur and includes criminal profiles based on thousands of executives on LinkedIn.”

She advocated collective and individual efforts toward technology justice, such as an EU group’s recently developed 10 Principles for Workers’ Data Rights and tech workers speaking out against problematic technologies such as controversial defense innovations.

“If inequity is woven into the very fabric of society, each twist is a chance for us to weave new patterns, and its undoing will be to accept that we are the patternmakers,” Benjamin said.
