Is It Possible To Create A Benevolent Deepfake?

Artist Stephanie Lepp hosts Reckonings, a narrative podcast that explores how people shift their political worldviews, transcend extremism, and make other kinds of transformative change.

Recently, she has been experimenting with a maligned technology, deepfakes, to create Deep Reckonings, a series of synthetic videos that imagine controversial public figures having a reckoning; in the deepfake footage, Alex Jones, Brett Kavanaugh, and Mark Zuckerberg reflect on the damage they have inflicted on society.

I spoke to Stephanie about the impact of her project, and the positive potential of deepfake technology.


What inspired you to create Deep Reckonings?

We think of social change as requiring large numbers of people pushing for change. But there’s also the question of, what are the fewest number of people it would take to create broad-based social change?

There are certain individuals who, if they had a crisis of conscience, whose personal transformation would result in broad-based social change. If Charles Koch had a reckoning, that would literally change the climate trajectory of the planet.

So I had this idea for a synthetic film of Charles Koch’s transformation, and how it ended up changing the world. Then in 2017 I discovered deepfakes. But I’d been thinking about it before I knew the technology existed.

To me, the clips seemed almost like an alternate reality, where controversial public figures actually take moral responsibility for their actions. What’s your take on that? 

Well, in these videos, you’re just seeing the fruit. Theoretically, if this were to happen, it would be a months or years-long process of coming to terms. Maybe Alex Jones would have gone through restorative justice dialogues with Sandy Hook families. There would be a whole process. But with Reckonings, you get to hear the arc, what happened that got them to the reckoning.

But I’ve seen all kinds of people make all kinds of transformative change. I believe in the possibility. Whether these men in particular – I don’t know. It almost doesn’t matter. It’s not just about these people, it’s about creating a new way of responding to credible accusations of wrongdoing. We’re all very used to, and tired of, I think, the deny and deflect playbook. That’s what we see in the Trump era.

Part of this is also, irrespective of these three men, asking what it might look like to respond to credible accusations of wrongdoing in a totally different, and maybe more compelling, way.

How’s the response been, so far? 

I expected the heat in all directions. Everything from “he didn’t do anything wrong,” to “why does he deserve to have his apology written for him?” All the way to “deepfakes are inherently unethical and should not be used under any circumstances.” And there’s been excitement over the subversive use of this medium, and also the scripts themselves.

I’m not really intending to be provocative – I’m actually very thin-skinned – but the fact that this project is provocative says more about where we are in our culture, than it says about me trying to be a provocative artist.

Do you foresee more positive uses for this technology? 

Absolutely, and that’s part of what I’m trying to do here. Most “deepfakes for good,” of which there are not many, use deepfakes to warn about the dangers of deepfakes. And for me, there’s got to be more potential here.

I see a connection to virtual reality therapy – just because a virtual experience is not real, doesn’t mean it can’t help heal me from PTSD. We can use these fictional experiences to serve us in all kinds of ways, and I see potential in deepfakes for healing.

One intention of the project is to make critical self-reflection look beautiful and make more room to grow and change in public; the other is to expand the possibility space of the deepfake medium – to carve out how we can use it in ethical and benevolent ways.

As deepfake technology becomes more accessible and widespread, do you believe it will worsen the current misinformation problem? 

I do, and we absolutely need to adapt. And by adapt I would even say evolve our relationship with truth. This is an epistemological crisis – even if we were to eliminate all deepfakes, the post-truth moment is not solved. We need a new enlightenment as far as I’m concerned. The first enlightenment was from superstition and irrationality, to rationality, science and reason. And for the next one, I think we need to make room for purposeful fiction. Embrace uncertainty.

We don’t have total certainty around all kinds of things. And the more we pretend we do, the more people are going to run to conspiracy theories.

I actually think there’s a wild and ironic way in which deepfakes can help us navigate through this post-truth moment, because they allow us to see that truth is not just an end in itself – it’s also a means to an end. And sometimes not the best means to the end. Sometimes deliberate fiction, like virtual reality therapy, is what we need in order to achieve the goal.

Do you think the best way to use deepfake footage is to present it explicitly as fiction?

That’s where I draw the ethical line myself. You can draw all kinds of ethical boundaries, like, do you have the consent of the protagonist? What is the intention? But I think the easiest, clearest ethical line to draw is, is it explicit about the fact it is fake?

But on the other hand, what if his [Alex Jones’] pretending that the video was real actually helped him stop broadcasting lies? Would we sacrifice Alex Jones having this transformation for the sake of making sure that everyone knows the video is fake? That’s when truth comes into conflict with other values we have.

I do draw the ethical line around making it explicit that it is fake, but what if doing that was more socially valuable? I don’t know. I’m willing to at least entertain that.

This conversation has been edited for length and clarity.


Original post: https://www.forbes.com/sites/danidiplacido/2020/11/03/is-it-possible-to-create-a-benevolent-deepfake/