Technology and equality: Ruha Benjamin, 91Թ experts to tackle thorny issues at first Schwartz Reisman event
Technology continues to evolve at breakneck speed, promising solutions for some of the world’s biggest problems.
But rapid innovation can also cause issues of its own, from aggravating existing socio-economic divides to creating products that discriminate against certain groups – like facial recognition technologies that don’t recognize users with darker skin tones.
Experts from across the 91Թ – from bioethicists to artists and artificial intelligence researchers – are coming together on June 11 to explore these and other problems at Fairness and Equity in a Digital Age, the newly created Schwartz Reisman Institute for Technology and Society’s inaugural symposium.
The event’s keynote address will be delivered by Ruha Benjamin, an associate professor of African American studies at Princeton University whose research explores the social dimensions of science, technology and medicine.
“91Թ experts are exploring the intersections between society and technology from a wide range of disciplines across the university,” said Vivek Goel, 91Թ's vice-president of research and innovation.
“This symposium brings together those important conversations into a public forum – setting the stage for the incredible range of interactions and scholarship that we expect to take place at the Schwartz Reisman Institute for Technology and Society.”
The institute is supported by a historic, $100-million gift from Gerald Schwartz and Heather Reisman. Starting with next week’s inaugural event, the institute will create a forum for dynamic public engagement through a steady calendar of activities that includes high-profile speaker events, workshops, seminars and a major international conference – all while facilitating cross-disciplinary research and collaboration in fields ranging from the sciences to the humanities.
Benjamin applauded the new 91Թ institute for tackling big issues head-on at its first symposium.
“I love the fact that they're prioritizing the society-technology relationship right from the get-go rather than as an afterthought,” she said.
91Թ News recently spoke with Benjamin about the upcoming event and her views on equity and innovation.
What will you be speaking about at the event?
I'm going to be looking at the relationship between innovation and inequity. One of the main points I want to get across is that, if we do business as usual and don't take existing social inequalities as seriously as the technical advancements, we will, by default, reproduce those inequalities through the very act of innovation.
One of the examples that I tend to draw on is looking at Silicon Valley – the geography, the demographics. One of the things that's happened there in the last decade or so is that the housing crisis has really escalated in conjunction with the development of the tech industry. One in three school children is experiencing housing insecurity – a kind of euphemism for actually being homeless – and it's in relation to the growth of an industry that promises it's investing in a better future. But that future is stratified.
One of the things I want those who are involved in all the innovation sectors to take on as part of their responsibility is to think seriously – to collaborate with people who have expertise in the social sciences and the humanities, to be able to integrate that body of knowledge into their work.
How are inequities built into the design of technologies?
One of the things I'm trying to draw out through my work, in conversation with many other scholars, is looking at the social inputs to technology – so starting the story earlier and thinking about the social order, the demographic of people who are designing the technologies, the economic structure that incentivizes something and really shapes what we then think up as an inevitable development.
One of the things I'm trying to de-naturalize is this idea that technology grows on trees. We have to look at what we are embedding in these technologies.
What are some examples of how biases are built into technological design?
One example many people will have come across on YouTube is soap dispensers that don't recognize darker skin. It's a simple automation that relies on infrared sensing: the light doesn't reflect off darker skin, so the soap doesn't come out. On one hand, it seems like a superficial, inconsequential form of discrimination. If you go on YouTube and look up "racist soap dispensers," even the two individuals demonstrating it – one light-skinned, one dark-skinned – are giggling through the video, because it seems inconsequential.
But when we zoom out a bit and think about how those same design decisions get amplified, we come to automated systems that decide who should receive parole in the judicial system. Here you have a much more complex form of automation that labels a person low-risk or high-risk, and that label informs whether someone is released or given a longer sentence.
In that case, race is never explicitly part of the design, but all of the questions that elicit the riskiness of a person – whether their neighbourhood has a high crime rate, whether people in their family are unemployed, whether they've ever been arrested before – are shaped by racist practices and processes long before the person ever gets to the point where the algorithm is deciding if they're risky.
What we found through this research is that Black individuals are much more likely to be coded as higher risk than white individuals. Over time, though, the white individuals were more likely to re-commit a crime than the Black individuals who had been coded "higher risk."
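The mechanism Benjamin describes – a score that never consults race yet still diverges along racial lines – can be sketched in a few lines of code. This is purely illustrative: the feature names, weights and numbers below are invented for the example, not drawn from any real risk-assessment system.

```python
# Toy "race-blind" risk score: race is never an input, but the proxy
# features it weighs (neighbourhood crime rate, family unemployment,
# prior arrests) are themselves shaped by biased practices upstream.

def risk_score(person):
    """Weighted sum of proxy features; 'race' is never consulted."""
    return (2 * person["neighbourhood_crime_rate"]
            + 1 * person["family_unemployment"]
            + 3 * person["prior_arrests"])

# Two otherwise similar people: if over-policing has inflated arrest
# counts and recorded crime rates in one person's neighbourhood, the
# "neutral" score still splits them sharply.
person_a = {"neighbourhood_crime_rate": 1, "family_unemployment": 0, "prior_arrests": 3}
person_b = {"neighbourhood_crime_rate": 0, "family_unemployment": 0, "prior_arrests": 1}

print(risk_score(person_a))  # 11 -> likely labelled "high risk"
print(risk_score(person_b))  # 3  -> likely labelled "low risk"
```

The point of the sketch is that removing the sensitive attribute from the inputs does not remove the bias: it simply relocates it into the features that stand in for it.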
One of the real takeaways I'm hoping to discuss with folks is the way that public policy decisions are happening in the private sector. The people most affected have no say, and are not voting for individuals who are making these design decisions that are having these real-life impacts. I think we have to become more transparent and have a more democratic process in terms of which technologies we sanction.
Why do you think it's important to have public events and discussions to talk about issues of equity and innovation?
These conversations are important because technology affects everyone, and yet so few people have a genuine say in the technologies and their advancement, which are promised as a universal public good.
The greater the variety of individuals who are part of the process – and the more we move beyond conversation – the better we can think about the day-to-day processes in terms of who has the power to actually say, "Let's go ahead and design that" or "Let's not do that."
Those mechanisms and that participation have to be honestly evaluated and transformed if we are going to do more than just talk.
What is education’s role in the conversation around equity and innovation?
However we want things to be in the future, we plant the seed in education. Yet when we inherit educational practices from the past and keep them going, education too often becomes the main mechanism by which we reproduce inequality.
Because of the way we design the educational system itself, it too often reproduces inequality. But it also means if we were to be more thoughtful, diligent and visionary about how we educate the next generation, that it can also be the mechanism for greater equity and justice.
Specifically with respect to technology, one of the key things we have to do in terms of the way we structure education is not to limit conversations about equity and inclusion to particular disciplines. It should not just be the purview of the social sciences or ethics, and cornered away in these different arenas. The people who are being trained to actually design technologies also need a tool kit in which they can be thinking about these questions of equity and inclusion.
One of the things we are finding in the last couple of years is that many tech workers are raising their voices to question the kinds of things the companies and organizations they work in are producing. They're walking out, they're petitioning, they are naming themselves conscientious objectors. There's a whole movement within the tech industry where there's a growing consciousness about these issues – and, if I had to guess, I would think these individuals were exposed to these issues and these questions as college students through the course of their training. I don't think it's an epiphany that happens on the job. These things have been percolating.