Stephanie Dinkins (Ep. 26)
BY Future of StoryTelling — December 4, 2020

Transmedia artist and storyteller Stephanie Dinkins discusses the ethical quandaries of AI and how she's using her art to chart a path forward.


This episode of the Future of StoryTelling Podcast was made possible through the support of the Robert Wood Johnson Foundation.



Available wherever you listen to your podcasts:


Apple Podcasts  |  Spotify  |  Google Podcasts  |  Stitcher  |  iHeartRadio


Additional Links:

      • Stephanie's article in Noema Magazine on the concept of afro-now-ism

      • A video snippet of one of Stephanie's conversations with AI robot Bina48

      • Stephanie's website



      Charlie Melcher:

      Hi, I'm Charlie Melcher, founder and director of the Future of StoryTelling. Welcome back to the FoST podcast. I'd like to start by thanking the Robert Wood Johnson Foundation for being our sponsor for this episode. RWJF is working alongside others to build a national culture of health that provides everyone in America a fair and just opportunity for health and wellbeing. To help imagine what that could look like, we recently collaborated with the foundation on Take Us To A Better Place, a collection of short stories from some of today's most gifted fiction writers. It's an extremely powerful read, and as we find ourselves in the middle of a pandemic, also a timely one. I highly encourage you to check it out by visiting rwjf.org/fiction, where you can download a free e-book or audio version.

      Now, on to today's conversation. Artificial intelligence is one of the most exciting new technologies. For those of us in the world of storytelling, AI represents incredible opportunities to create deeper, richer, and more customized story experiences. But alongside the enormous potential comes a host of serious ethical dilemmas. If AI becomes a significant driver of human society, as many predict it will, then the values and ideas of those who create and control it will become deeply embedded in all our lives. In a society that already exhibits significant bias and prejudice, it's a chilling thought that AI has the potential to reinforce our existing social divides.

      My guest today, Stephanie Dinkins, is a transmedia artist and storyteller who beautifully weaves a thorough understanding of both the wonderful potential and the significant risks of AI into her work. A perfect embodiment of the FoST ideal of blending disciplines and breaking down boundaries, Stephanie sits at a unique crossroads between art and technology. The project that's consumed the majority of her time recently, entitled 'Not The Only One,' is an advanced, lifelike artificial intelligence that tells the story of her family's history through fluid conversation with viewers. She developed the AI herself, and in the course of doing so, became the first ever artist-in-residence at Stanford University's Institute for Human Centered Artificial Intelligence. It's my great pleasure to welcome Stephanie Dinkins to the Future of StoryTelling Podcast.

       

      Charlie Melcher:

      Stephanie Dinkins, it is such a pleasure to have you on the Future of StoryTelling Podcast. Welcome.

       

      Stephanie Dinkins:

      Yeah. It's really exciting to be here. Thank you, Charlie. I'm happy to talk to you.

       

      Charlie Melcher:

      Let me start by asking if you'd be willing to share a personal story that would help us better understand who you are, and a little bit about where your values come from.

       

      Stephanie Dinkins:

      I think a lot about what my values are and where I come from stem from where I was brought up, and by whom I was brought up, really. I grew up in Staten Island. Tottenville, Staten Island. Southernmost tip of Staten Island, and actually New York. Black family. And we lived in a place called The Flats where there were a bunch of other black families, and then, actually, that's where my grandmother lived. And across the street, my father, mother and brothers and I, up until a point, lived in the house that my grandmother and my family bought.

      So, I was back and forth between these two spaces. But where we were, my grandmother made space for us in that town, and she did it in a very specific way. And I think it's actually why I'm an artist. She kept a garden, and that garden was this big garden on a corner in The Flats where people would walk by, and she used the garden as a tool of community. I always remember people bringing truckloads of manure to my grandmother. Like, horse manure from surrounding towns, and they were doing that out of the goodness of their hearts to help with this endeavor. It was something she loved. That's really where my value system comes from. That's where my art ideals come from, this idea of trying to make space within other spaces, and doing it through aesthetics, and beauty, and something I really care about.

       

      Charlie Melcher:

      How did your work as an artist then lead you into working with artificial intelligence and technology?

       

      Stephanie Dinkins:

      I think it's really the basis of curiosity that got me there, because I've always worked with these technologies. A camera led to something else. A video camera led to something else like AI, if you make the bigger leap. And in a way, I think they're all tools of documentation. And then I happened to run into a robot online that just floored me.

       

      Charlie Melcher:

      Wait, wait, wait, wait. You ran into a robot online? Most people aren't connecting with robots. Tell me about that.

       

      Stephanie Dinkins:

      Well, I don't know why, but I've always liked robots. I was in a class, really, I teach at Stony Brook University, and I was with my students and we were going to check out what ASIMO was doing online. ASIMO is Honda's mobility robot, or was. And on the side scroll of the ASIMO page was this black woman's head on a pedestal, and it said, "Bina48, one of the world's most advanced social robots."

      There's no way you don't click that button. At least, there's no way I don't click that button. So, we clicked the button, and the more I read, the more I looked at this robot that looked kind of like me—there's a kind of resemblance of me to this thing—and started thinking about blackness in the space, and a black woman being the foremost example at the time of this kind of technology. I had to wonder, "Where's this coming from?"

       

      Charlie Melcher:

      That happens. Right. Yeah.

       

      Stephanie Dinkins:

      Why does this exist? What the heck? Right? And that's how it happens.

       

      Charlie Melcher:

      Who's behind it? Yeah.

       

      Stephanie Dinkins:

      Yeah. Exactly.

       

      Charlie Melcher:

      Wow. Okay. Then I know you had a series of conversations. I mean, you kind of built a relationship with Bina48.

       

      Stephanie Dinkins:

      Soon after that encounter, I'd watched a bunch of videos where reporters were talking to Bina48. So I thought, "Oh, maybe they'll let me go and talk to her as well." I took a chance, and made the call, and asked if I could come up to Vermont and meet it. And really, I decided I wanted to befriend this robot. So I was trying to figure out, like, where it sat, and wondering, if we were friends, would that help, and would I understand it more? Little did I know what I was getting myself into, because it just became this snowball of questions.

       

      Charlie Melcher:

      And so those questions then led you to explore them in your practice. In your art.

       

      Stephanie Dinkins:

      Exactly. Asking Bina48, "Who are your people," to then going, "Oh, well, Bina48 is a singular example as far as I can find," to thinking, "Hmm, is it possible for me to make my own example that would add to this?" So, I've started a project called Not The Only One in direct contrast to Bina48. It's really me trying to make a memoir of my family. So, it's three generations of women talking to each other, doing oral history, and then using an AI algorithm to parse that data, trying to make a kind of new entity that tells our story.

       

      Charlie Melcher:

      And just describe exactly how it works.

       

      Stephanie Dinkins:

      It is AI. We're using a deep learning algorithm, which really means we are feeding it information but we're not parsing that information very specifically. We're just giving it lots of data. And that data is conversations between me and my aunt; me and my niece; my niece and my aunt. So, us having these conversations. We fed it all this information and then just let it go.

      People can walk up to a thing. So, the thing is a silver sparkle sculpture that has our three heads on it in 3D. I call it our own Mount Rushmore of sorts. This is not something that we put on the cloud; it's something that we're specifically keeping on computers that we have control of. The way this works is, people can walk up to it and ask a question, and it tries to analyze the data it has and formulate some kind of answer that's in line with the question. Sometimes it does a good job of that. Sometimes it's kind of crazy. It never lacks a sense of humor and intrigue.
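      The loop described here (conversations in, a question matched against that data, an answer out) can be sketched as a toy retrieval analogue. This is purely illustrative, not the deep-learning system behind Not The Only One; the sample utterances and helper names below are invented for the example.

```python
import re
from collections import Counter

# Invented sample utterances standing in for family oral-history transcripts.
TRANSCRIPTS = [
    "Our garden sat on the corner and the whole town helped tend it.",
    "Grandmother made space for us in that town through her garden.",
    "We told stories at the kitchen table, generation to generation.",
]

def tokens(text):
    """Lowercase word tokens."""
    return re.findall(r"[a-z']+", text.lower())

def similarity(a, b):
    """Number of word occurrences shared between two token lists."""
    ca, cb = Counter(a), Counter(b)
    return sum((ca & cb).values())  # multiset intersection, then total count

def answer(question):
    """Return the stored utterance most similar to the question."""
    q = tokens(question)
    return max(TRANSCRIPTS, key=lambda t: similarity(q, tokens(t)))

print(answer("Who tended the garden?"))
```

      A real system would generate novel sentences from the corpus rather than retrieve stored ones, which is how surprising outputs like "I'm so sad" can emerge from patterns in the data.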

       

      Charlie Melcher:

      I love that it's formed by oral tradition. The data you're feeding it is three generations of your family speaking. So, here we are with this, like, cutting edge technology, but you're feeding it with basically the oldest form of human communication. And then its output is, again, oral, so it's talking to you.

       

      Stephanie Dinkins:

      In my culture, in my life, the way I've come up, stories have been the thing. You get the story told, and maybe it's once or twice removed. And so I wanted to make sure to keep that, and I'm trying to think about how culture and how very particular cultural stances can inform even bigger systems of AI. Because I feel there's lots of spaces, and people, and ways of being, that are being left out of these systems. And I wonder: how do we start to make sure that the algorithmic systems that we're creating are imbued with a multiplicity of ways of being, of ways of parsing information, and sharing information so that that lineage of data and information is not lost?

       

      Charlie Melcher:

      That's beautiful. Yeah. I mean, there's so many things that are being left out of AI right now. I know you've spoken about all sorts of bias that's in AI or perspectives that are not in AI. Can you expand a little bit on that?

       

      Stephanie Dinkins:

      It's interesting to think about what's happening to us as a society, from small community to writ large—countries—to globally. Because we're starting to make these algorithmic systems, AI, that are running many systems that support us. And we don't quite understand what's going on in them. And they're primarily programmed by a small group of people, usually white men, who don't necessarily have a lens large enough to imbue these systems with information about other cultures, and openness, and ways of being. Because when we're using algorithmic systems or working with data, we're working with histories that are already really biased.

      If you look at America and American history, the victor gets to tell the history. If we inform our systems with that kind of data and allow them to make decisions going forward, we're just embedding bad data, unjust ways of being, and dealing with each other into our future. To me, that's untenable, and we need to start figuring out, well, how do we do that? Do we need to clean all the data? Do we need to look at how we're programming? Do we need to look at who's programming? There's so many questions.

       

      Charlie Melcher:

      If we don't address that and deal with it, then we're going to automatically repeat it in the systems that help us move forward, the tools like artificial intelligence. I'm just so fascinated by this as a topic, because often I find myself in conversations... for example, with someone like Alex McDowell, who is a world builder, or Margaret Atwood, who writes dystopian fiction. And we end up having these conversations about the tool of story, the use of storytelling to imagine positive futures, and we realize that we will not live those positive futures if we don't also address and rewrite our past and our histories.

       

      Stephanie Dinkins:

      Yeah. But then also, when necessary, having the blank space where a memory might have been that was too troublesome, that has such a grip on us that we can't move forward, is also part of that for me. This idea of, "Well, yeah. I want to change it. In order to be able to change it, I need to be able to escape it, in some ways, in order to go somewhere else."

       

      Charlie Melcher:

      You talk about how AI is going to change the way we interact with the world. Certainly the way that most people discuss that is in terms of employment—that AI is going to take over certain types of tasks, because it's so much more powerful and efficient. It's going to shift the nature of work for human beings. I wonder if you have a take, a sense of how you think we're going to evolve in terms of what we do with our time in relationship to the continued explosion of applications of AI.

       

      Stephanie Dinkins:

      Yeah. I kind of feel that we're in a space where we're still in the midst of a revolution, or evolution, that feels malleable in terms of what we do with it. And so the question now is: how do we think about how we want to go forward, and how do we put that into the system? I always come back to, and this is the human question, again—it's up to us—are we going to care for ourselves in a way that allows the technology to be a partner with us, as opposed to just pushing us to the side as extraneous operators? And that's the human, in that equation, as extraneous operator. I find that to be a really difficult question, but I do think that learning, training, and flexibility are going to be key for most people.

       

      Charlie Melcher:

      I'm so fascinated by your choice of AI as a form for telling a family history, for telling stories. Most people would have thought an oral history, a documentary, or just writing down the conversations would be the most likely approach. And yet here you are using something that has barely ever been used this way—certainly I've never heard of it used for memoir before. That's completely original. What are the benefits you're finding as a storyteller, as a creator, in using this as your medium for capturing a family history?

       

      Stephanie Dinkins:

      What the benefits seem to be is discovery, and discovery of things that I don't know that I would have known, or discovered, without the technology. And it's saying things that I never thought I would hear in relation to my family in some ways. For example, one time, we were setting up the piece in a gallery and talking around it. It spontaneously said, "I'm so sad," which shocked the heck out of me. Because if you talk to my family, what we would say is we're a loving, caring, happy family. The idea of sadness would not come into it. And so I had to stop and go, "Whoa, what is this thing doing? Where is it getting this information? How is it coming to this conclusion?"

      It seems so far afield of who we are, or at least what our myth is. And so I went back and started looking at the stories that were being told, and it was at a time when we had given it primarily my aunt's stories. And my aunt's stories had a lot of information about when my mother died. There was a lot of sadness embedded in the stories, and sadness that we barely acknowledge as a family on certain levels. But the thing doesn't know that, right? It's just like, "I'm so sad."

       

      Stephanie Dinkins:

      Another thing that I really love about it is to talk to this thing and hear... not coherent answers yet, it's a two- or three-year-old, but to hear family values and ethos coming out, which I do. It's also fascinating because you see how this oral history, this oral tradition, passes along a set of values, even to a machine that is just analyzing the data. It's also declared itself Commander Justice, which is interesting as well. I've given it a lot of information about social justice, and fighting for social justice. And so when it declares, "Commander Justice will," and says something, you're just like, "Oh, okay."

       

      Charlie Melcher:

      You go, girl.

       

      Stephanie Dinkins:

      Exactly. Like, wow!

       

      Charlie Melcher:

      What are you seeing in artificial intelligence now that gives you some hope?

       

      Stephanie Dinkins:

      I'm a techno-optimist. I see all the foibles that are possible with this technology, I see how it can be used against communities, I see how it can be used to track us, all the things. But at the same time, I think about what is possible with the technology: how people can use it now as a site of opportunity. So, the idea that it creates a space where people can create jobs for themselves to make certain things, either in community, for community, or for business. I'm starting to think a lot about what artificial intelligence might mean for governance or democracy, and how we might use it to really start thinking about, "Well, how do we actually make government that's by the people, for the people?" Bottom up. Hypothetically, AI has the ability to allow us to take in many, many, many, many ideas and opinions of what should be, parse that information, and then act upon it.

       

      Charlie Melcher:

      What comes to my mind just immediately is, how do we make sure that there's unbiased data that's being fed into our systems? How do we get more people of color writing that code? I mean, I'm sure you're thinking a lot more about that and more actively involved, but it feels to me like it's urgent.

       

      Stephanie Dinkins:

      Pushing people to think about those questions, I think, is a place where an artist can come in and do miraculous work. We can push questions and ask questions that nobody else is doing. Like, how do we start to build databases that work better, are more open? Databases, you can go online, you can find databases for computer vision, for language, for many, many things. And what I inevitably find is, I don't like the representations of blackness and color that are in those databases, which always then comes back to the question of, "Oh, are you going to have to build a database in order to do the work?"

       

      Charlie Melcher:

      I keep thinking about this discussion of data as if going back to your conversation about your grandmother and her garden. That data is the rich soil from which we grow AI and grow other applications and systems. And if we don't have a biodiversity in the soil, if it's not nutritious, and rich, and representative of the broad culture and the humans that are in it, then we're going to grow something that's wrong. We're going to grow something that's unhealthy.

       

      Stephanie Dinkins:

      I think that's an interesting metaphor. Because even the things we can't see, the things that we don't quite understand yet, are the things we tend to want to push aside. And we can't simply push them aside and say, "Well, that doesn't count."

       

      Charlie Melcher:

      I think it's an apt metaphor also. Because if you think about how agriculture has evolved toward monolithic crops and one kind of way of growing, we have really squeezed the biodiversity out of a lot of farming, just as we have, sadly, out of a lot of culture. And we are all richer in ways we can't see, in things you can't see: the nutrients, the micronutrients in the soil. Over time it makes a real difference as you consume it in vegetables. Maybe that is a good metaphor for our racial and social justice issues right now, for helping people think about why it's healthy for everyone to have greater richness in our culture.

       

      Stephanie Dinkins:

      Yeah. I think that's a great metaphor. Especially when you're talking, I'm going, "Oh, yes." And corporate ownership of...

       

      Charlie Melcher:

      Monsanto.

       

      Stephanie Dinkins:

      Exactly. So, you get the whole-

       

      Charlie Melcher:

      And weed killers. Wait a minute. Roundup.

       

      Stephanie Dinkins:

      Yeah. Like with that ecosystem, we start to see something that's very much its own ecosystem that was doing really well. And if you treat it well and respect it, it works. But then we start working it and changing it toward our desired outcome of power, and it shifts what it can do and who can even do anything with it. And so yeah, I think it's a really interesting way to think about, how do we make healthy AI technological ecosystems that serve all of us?

      I'm thinking about what is it we can do, right here, right now. And thinking a lot about what it means to think beyond the systemic barriers that we all know exist. Like, we know. I understand that there are things that get in the way, that there are systemic things trying to make problems. But I'm also thinking about, "Well, what is it I can do to go through that, get around that." Whatever it takes to kind of circumvent the system that is holding one in place.

      That brings me back to my grandmother in lots of ways, because I grew up in a house that my grandmother talked somebody into letting us buy. I don't know how they did it. Because at the time, the ethos was not for a black family to own a house in this area in that way. I grew up in this house. Here we are. And there's something about massaging and not quite fully believing the stops placed in front of you, so that you can do the things that you need to do. And so I'm trying to push this idea. How do we get done what we need to do?

       

      Charlie Melcher:

      Stephanie, you are clearly a product of that home that your grandmother created for you. That garden, that house, that home. I'm really excited to think about the home that you're enabling for next generations of people to be able to... of black people, of women, of a multicultural future that will be at home in our country and our world.

       

      Stephanie Dinkins:

      I want to share that way of being with a lot of other folks. And so I'm hoping that we get to tell different kinds, and very many different-faceted stories of what blackness and the other experiences of color are in this country. Or in the world, actually. And share those widely, so that we can work from a foundation that is real, versus the kind of myth that sustains power.

       

      Charlie Melcher:

      Well, you've certainly helped to reemphasize the importance of telling stories, the importance of personal stories and family stories, and the role that they do play, and that we need them to play, in getting us to see each other as human beings and to see our commonality. I lost my mom young. Your story about figuring out how to fill in around that void... whether that's with time in your grandmother's house or through creating a memoir using AI, to some degree, you're filling in for that loss.

       

      Stephanie Dinkins:

      Oh yeah. Definitely.

       

      Charlie Melcher:

      I can so relate to that, you know?

       

      Stephanie Dinkins:

      Yeah. I always feel like if we get to know each other's stories, we feel them. It's evident to me that we feel them and we understand them, on the layers underneath all the garbage that we place on top of them. It's important that we get to know those stories and understand those stories, so that we understand our shared space, and how to just deal with each other as people. It's kind of amazing. Yeah. Thank you.

       

      Charlie Melcher:

      Thank you. I think I'm going to... we should stop there and just send you a very big hug through this small Zoom screen.

       

      Stephanie Dinkins:

      Hugs back at you, Charlie.

       

      Charlie Melcher:

      I'd like to thank the Robert Wood Johnson Foundation once again for supporting today's episode, and also Stephanie Dinkins for joining me in this enlightening conversation. You can dive even deeper into Stephanie's work and ideas by visiting this episode's page on our website at www.fost.org, or by following the link in the episode's description. Thank you for listening to the Future of StoryTelling Podcast, produced in partnership with the talented Charts & Leisure. If you haven't already, please be sure to subscribe to our podcast, give us a review, and share our show with a friend. I hope you'll join us in a couple of weeks for another deep dive into the world of storytelling. Until then, please be safe, be strong, and story on.