Can AI help us with issues that divide society?
Monday 24 July, 2023

by Alice Wroe

A few weeks ago I couldn’t get through a conversation without someone delegating their thinking to ChatGPT. We did what humans have done for millennia: gathered around the ingenuity of our new toy. Where once flames illuminated our faces, now it is the pale glow of the iPhone. But as the ChatGPT glee subsides and serious questions are asked about the future of artificial intelligence, we need to reframe how we relate to this technology. I worked for a number of years directing a digital human driven by artificial intelligence, and learnt that AI is most effective when it amplifies creativity and encourages critical thought.

As ChatGPT has torn through the zeitgeist I’ve been struck by how focused we have been on outsourcing our thinking rather than seeing artificial intelligence as an apparatus to think around. We can all write bios - what we have trouble with is working out the ideological issues that divide our societies. In my experience AI is better at clarifying thought than undertaking it, and I wonder whether it can help us work through some of our complex issues.

I am currently working with the Rhodes Trust as it marks the 120th anniversary of the Rhodes Scholarship. I am looking through our collection of portraits - some are deeply inspiring, but others are trickier to digest and, in the context of the Rhodes Must Fall campaign, harder to look at. My mind has gone back to work I did in 2017 with the band U2 on their Joshua Tree tour, curating digital content that celebrated women from the past - centring Herstory rather than History. Every show featured a different selection of historic women local to the area. These women existed on huge screens behind the band, giants in the stadiums.

For me the most interesting part of the job was the relationality between the figures, considering who would go where and when. For example, during the guitar solo, Edge was dwarfed by Sister Rosetta Tharpe, a genius guitar player and the often overlooked Godmother of rock and roll. I liked to imagine she was whispering in his ear - but what would she whisper? What would they all say? In the development of the show, I became obsessed with this idea of conversation between the women. As the visuals changed and the figures morphed into each other, I imagined the connections between them, and this became the curatorial backbone of the piece. Taking their histories into account, these figures wouldn’t all have agreed with each other; there would have been raised voices, some tears, and that was ok, that was exciting.

This work felt pertinent to the Rhodes Trust because in both cases I was drawn to the spaces of overlap between figures; somehow, seeing the Rhodes portraits together and enabling space for dialogue makes the more complicated ones easier to look at. I started thinking about a digital collection of such works, with big blank gaps left between them, hoping the viewer could fill them in and make sense of this contemporary moment.

My colleague heard my interest and, like so many others in the last few months, saw a vacancy that ChatGPT could fill. So the idea became that alongside the portraits we would use artificial intelligence to reimagine conversations between the sitters: the algorithm would survey the internet for information on each portrait’s sitter, and use the data to write a conversation between the protagonists. It is fictional, yet based in some version of reality. We asked ChatGPT to deliver a conversation between Nelson Mandela and Cecil John Rhodes. Our human direction was vital: we thought about it, and added “be creative” at the end. This was what came back (a rough sketch of how such a prompt might be issued in code follows the exchange):

Mandela: Greetings Mr. Rhodes. I wonder what brought you to this land of Africa.

Rhodes: Hello Mr. Mandela. I came here to pursue my dreams of wealth and empire. And you, what is your story?

Mandela: I was born here, in this beautiful continent. I fought for freedom and justice for my people. I spent many years in prison for my cause.

Rhodes: I see. We have very different views and values, Mr. Mandela. I wonder if we can ever find common ground.

Mandela: Perhaps we can Mr. Rhodes. Perhaps we can learn from one another and respect our differences. After all, we are both human beings.

Rhodes: Maybe you are right Mr. Mandela, maybe there is more to life than power and glory. Maybe there is something to be said for peace and harmony.

Mandela: I am glad to hear that Mr. Rhodes. I hope you will join me in working for a better future for all the people of this land.

Rhodes: I hope so too Mr. Mandela, I hope so too.
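We used the ChatGPT interface directly, but the same exchange could be generated programmatically. What follows is a minimal sketch, assuming the OpenAI Python client and an API key in the environment; the model name, temperature and exact prompt wording are illustrative assumptions, not a record of what we ran.

```python
# Sketch: asking a large language model for an imagined conversation
# between two portrait sitters. Assumes the OpenAI Python client
# (pip install openai) and OPENAI_API_KEY set in the environment.
from openai import OpenAI

client = OpenAI()

def imagined_conversation(sitter_a: str, sitter_b: str) -> str:
    # The only human direction, as in our experiment, is the closing "Be creative."
    prompt = (
        f"Write a short conversation between {sitter_a} and {sitter_b}, "
        "drawing on what is publicly known about their lives and views. "
        "Be creative."
    )
    response = client.chat.completions.create(
        model="gpt-4o",   # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # leave headroom for the "be creative" direction
    )
    return response.choices[0].message.content

print(imagined_conversation("Nelson Mandela", "Cecil John Rhodes"))
```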

We were giddy for a moment as we read it, as most people are when they first use ChatGPT. I was struck by the sensitivity with which the information was interpreted. The softly reflective sign-off “I hope so too, Mr Mandela, I hope so too” deftly recognises that Rhodes would be affected by the power of Nelson Mandela, but that there was no evidence to suggest his actions would change. It speaks simultaneously to the wealth of data we have on the orator that Mandela was and on the actions that Rhodes undertook.

The idea was that this spectacle would encourage dialogue about and between our portraits, establishing them not as fixed entities of reverence but as springboards for contemporary conversation around the past. Artificial intelligence would find connections we might overlook and could be objective in a way that we can never be. It would suggest moments of conversation and conflict, and ideally offer an authentic way to sit with an uncomfortable past, without the human spin of shame.

But then, after thinking for a moment (for I am not as quick as ChatGPT), I realised the human spin of shame is maybe exactly what is needed in situations such as these. After the initial excitement died down, we wondered whether men like Rhodes should be reanimated, and whether letting ChatGPT take the voice of those who have been side-lined historically was just another way to silence them, even posthumously. We would, like so many before us, be using technology to bulldoze; we were “moving fast and breaking things” - the idiom so many in big tech hold dear in pursuit of progress, and something I had promised, on entering the industry, I would never do. ChatGPT comes back almost instantly with something so whole and coherent that we can’t quite believe it; it is partly the speed that impresses us. Unlike ChatGPT, I wanted to slow this project down: to think about it over coffee, change my mind several times and then back again, talk to my friends about it…

What I am saying is that the value lies in the moments of pause. Our slower pace, our ability to reconsider, is a human superpower. The gaps enable us to think deeply, to be changed by each other. It is all about the days spent looking out of a window and wondering what if. That doesn’t mean there isn’t space in our lives for AI; quite the opposite. I think we should reframe it, asking not what AI can do for us but what we can do with it. We should be moved by it, think around it, see it as a generator, an instigator, something to test our own human feelings by - and if used in that way, if the GPT Mandela were to say to me “I hope you will join me in using artificial intelligence to work for a better future”, I would (for the first time) echo Rhodes’ sentiment and reply: “I hope so too, Mr Mandela, I hope so too”.

Alice Wroe is XR Lead for the Atlantic Institute, where she serves a global fellowship of leaders committed to accelerating the eradication of global inequities. She explores how emerging technology can further social equity, developing and commissioning experiences that interrogate what it means to be human when digital.