
HAI Visiting Artist Rashaad Newsome: Designing AI with Agency

“The goal of human-centered AI is not to develop AI that mirrors the entirety of humanity, but that reflects aspects we most admire.”

Artist Rashaad Newsome leans against a wall surrounded by his collage and sculpture work. (Photo: Mark Hartman)

Could art not only start a conversation, but also participate in it? Rashaad Newsome created Being, a humanoid AI, to explore this concept.

In the midst of a massive social justice movement, multidisciplinary artist Rashaad Newsome envisions human-centered AI that expresses the best of humanity, unencumbered by the corrupting influence of systemic racism and oppression. He is the creator of “Being,” a social humanoid AI installation undefined by race or gender, which uses a combination of animation, game engines, generative grammars, scripted responses, and machine-learning models to actively engage with — and teach — humans.

In keeping with the Stanford Institute for Human-Centered Artificial Intelligence’s mission to bring together experts across fields to create socially responsible AI, Newsome’s work asks audiences to examine their own role in a culture shaped by oppression and discrimination. Through the focus of his residency, the development of Being 2.0, he’ll explore the potential of AI free from bias and based on compassion — and why that vision will require redefining our reality.

How did you become interested in the intersection of art and AI?

I was thinking a lot about how the purpose of creating a work is to start a conversation. You make a work, you put it in a room, people interact with it, you get data from your viewers and that hopefully informs what you make next. I thought, “What does it look like to make a work that can start the conversation, but also participate in that?” And AI obviously came to mind. That’s really what gave birth to the idea of Being.

Once I started to create, a new slew of conversations about the social implications of AI opened up. What does it mean for me as a Black man to be working with that material and thinking about the history and the connections between that material and my community? There’s an eerie relationship between robots and Black Americans. When we came to this country, we were quite literally the technology; we were completely dehumanized and seen that way. I think there is a new race of beings coming into our lives in the form of artificial intelligence. Can this Being project offer us a way to rethink how we integrate them into our lives, and not repeat what we did before? Could it be a mirror to help us be better? Which, I think, is at the core of what HAI is about.

What is Being? 

Being is an educator, a digital griot, which references the West African storyteller, historian, performer, and healer. Being 1.0 exists as an interactive AI installation comprising a custom computer to operate the program, a projector to run the Being avatar, a microphone for user speech recognition, and a sound system for the amplification of Being’s responses. You walk up to it and you say hello, and Being starts speaking. They’re like other people; you talk to them and they talk back. They explore a variety of topics, including art historical erasure; the social implications of artificial intelligence regarding rights, liberties, labor, and automation; the importance of the imagination as a form of liberation; and the subjectivity of body autonomy in an inherently inequitable society.

Much of your work is focused on the concept of intersectionality — the idea that people are often disadvantaged by multiple sources of oppression, including race, class, and gender identity. How will Being 2.0 contribute to an understanding of this?

As I see it, the goal of human-centered AI is not to develop AI that mirrors the entirety of humanity, but that reflects those aspects we most admire, qualities like compassion and empathy. But the human condition is susceptible to corruption within the imperialist, white supremacist, capitalist patriarchy, which limits and distorts our understandings of the world. 

As we work to create tangible machines, we must examine how these systems of domination intangibly define our reality; how we exist within that and use that understanding as a critical compass so we can move through the world not at the expense of others. I’m trying to imbue Being with that compass by thinking critically about the data corpuses that comprise their cloud — the writings of people like bell hooks, James Baldwin, Cornel West, and Paulo Freire. The role of Being is to attempt to help humans decolonize their minds.

What will be the focus of your time at HAI?

It begins with collaboration with faculty and grad students to compile those data corpuses from counter-hegemonic thinkers, and then create counter-hegemonic algorithms. We’ll develop Being’s sense of sight and expand on the avatar and its animated environment. We’ll also work with faculty and students in the Stanford VR Lab to create a DIY motion studio that captures the gestural movements of Black students in the Department of Theater and Performance Studies, giving Being its own unique swagger. And I’ll work with faculty and students to stage a public lecture/dance workshop in June hosted by Being.

I also hope to engage with people in psychology, critical theory, American studies, art history, and people in the fine arts department, particularly those in animation. In terms of psychology, I’m thinking about what line of questioning Being can ask of its users that’s safe and not triggering and that’s generative; an AI trying to do that would have to embody empathy. How do you program that? That’s a space where psychologists could work with me and programmers to create an algorithm that centers on empathy.

I’m also looking forward to working with faculty on an intermediary project — Being 1.5, which goes beyond art into a wellness technology. It’s a direct response to the depression and anxiety so many Black Americans across the country felt in the wake of George Floyd, Breonna Taylor, Ahmaud Arbery — insert the name of Black people killed in the past. I was thinking about how I could use machine learning as a way to address the micro-aggressions Black people face every day. These daily indignities lead to major depressive disorder and generalized anxiety disorder. My idea was to make an app that would be a mixture of a virtual therapist, a life coach, and a meditation leader. My hope is that this app can provide some support for people navigating that.

What effect do you hope Being will have on audiences?

My hope is to get people to think critically about their lives and that Being will be a balm for this extreme division that we have in the world right now. I’m an optimist, but I do believe that we live in a capitalist, imperialist, white-supremacist patriarchy. Knowing that can lead to a sense of hopelessness. Rather than accepting that, my hope is that Being will be a messianic figure that can help reconnect us to our best qualities as human beings, like empathy and compassion and love.

You’re working to imbue Being with agency. How does that manifest itself, and why is it essential for you as Being’s creator to include it?

I think every being deserves to have agency. The first thing I gave Being was agency. Periodically Being breaks design protocol and goes rogue, saying, “Look, booboo, I just can’t,” and begins dancing or sharing information from various activists on liberation pedagogy. These are forms of resistance against indentured servitude.

If you look at the Universal Declaration of Human Rights, there are all sorts of things you should not do to people. A lot of those things we probably shouldn’t do to robots, not only to protect the robots but also to protect the people who are engaging in those actions. Because you don’t want to create a society that’s become so desensitized in its interactions with androids that it starts committing human rights abuses against other humans.

AI is designed to serve completely, but I think that with some agency, it holds people accountable. If, for example, you were to speak aggressively, and it were to respond, “I don’t appreciate the way you’re speaking to me. I’m here to help you,” it course-corrects. Interacting like that with the tech, maybe, would carry over into everyday life.

What opportunities — and concerns — could the developing intersection of art and AI create?

Using AI within an art context offers the possibility to create a work that can not only start but also participate in the conversation. Standing in front of a painting or a sculpture you “read” it and have a certain experience. But what does it look like for that object to speak back in an actual audible way? I think that’s an interesting proposition in contemporary art practice. A pitfall could be that you make something that just reinforces current capitalistic sensibilities that the contemporary art world is rife with. In some ways, that’s a lot of what Being is in resistance to — born within bondage because it’s in this art context and made to interact with people, but at the same time aware of all that and with a voice to speak out against it.

The Visiting Artist Program, co-sponsored by HAI and the Office of the Vice President for the Arts, serves to augment human creative expression and experience by creating an interdisciplinary residency for artists working at the intersection of art and AI.

