This is the question moderator Rachael Merritt (Michigan & Exeter 2022) asked, turning to the assembled panel of activists, human rights researchers, and social entrepreneurs beside her on the Public Interest Technology panel at the Rhodes Forum on Technology & Society. Stephen Damianos (Pennsylvania & Balliol 2020) nodded, then paused.
“I don’t think in terms of optimism or pessimism,” said Damianos, Director of Technologies and Human Rights at Perseus Strategies. “I think the reality is that the technology is here and is evolving at an exponential pace. The key question is how do we maximise the potential good and minimise the potential harms. Because there will be good and there will be bad.”
The panel on “Public Interest Technology” was a much-needed dose of thoughtful consideration during a Tech & Society forum that was, for the most part, looking forward to the exciting potential of advances in generative AI, robotics, and other emerging innovations. During the panel, speakers turned a (constructively) critical lens on the way new technologies are built, and sometimes used against the public.
For instance, consider automated tools like chatbots and robo-calls that now handle customer service at the vast majority of companies. While these may bring cost efficiencies and speed up response times for the average customer, Damianos argued that eliminating humans often forecloses opportunities for empathy, justice, or special assistance - especially for customers (or citizens) who need accommodations or find themselves in edge cases.
“When entire offices, agencies, or products are digitised, you erase humans. If there’s not a physical person to talk to, a lot is lost. And one of those things is accountability,” he said. “I don’t know if anyone here has tried to get a visa appointment or a passport renewal recently - you probably didn’t talk to a person. If there’s no human to talk to, or if a government has subcontracted to a company that is based between five different countries… It’s very hard to even trace it, or find justice.”
The conversation highlighted the tensions created by innovation, efficiency, and automation — which can sometimes benefit (at least parts of) the public, but usually also centralise power and control.
“The panel is about ‘public interest technology,’ which almost implies the existence of ‘private interest technology,’” said Nasser Eledroos, an Atlantic Fellow and Policy Counsel at Color of Change. “We don’t often say it that way, but it’s the default of technology as we know it today.”
Adam Parr, an entrepreneur and investor, pointed out that the structural environment of capitalism makes truly public-interest technology difficult. “It’s very hard to build a business,” he observed. “You have to make rules. You can’t just expect businesses to do the right thing, because they’ll do what’s needed to survive and to grow.”
Anna Brailsford, CEO of Code First Girls, agreed, noting that entrepreneurs, especially, have to be relentlessly focused on their own growth. “Startups bend over backwards for clients,” she said. “When you’re starting out, you’ll do things that you won’t do down the line.”
How, then, can we ensure that technology is not used solely to advance private interests? To start, the panel agreed it was essential to spread technical talent not only through profit-seeking firms, but also through government, civil society, NGOs, and other organisations oriented towards public service. Parr noted that plenty of technologies throughout history have caused social, political, or environmental harms - and argued that collective institutions, not technologies themselves, were usually the best checks and balances.
Brailsford argued that educating more technologists and increasing diversity within the ranks of designers and developers are further prerequisites for ensuring innovations are public-minded. She pointed to a partnership between Code First Girls and GCHQ — which raised eyebrows among the other panellists — to argue that it is more important to ensure that even the most controversial technologies are built by teams representative of the public.
Eledroos argued that it wasn’t enough to have public watchdogs or mere representation; leaders committed to the public interest must also be installed at the top of governments and organisations. “The pipeline from nonprofits to government or nonprofits to corporations is exactly what we need to encourage,” he said. “It’s one thing to have that adversarial, knowledge-based approach to understanding technology - but that isn’t power itself. You need to give them positions to produce meaningful change in the public interest.”