
Teaching ethics of data science through immersive video

This spring, students in a data science class faced a challenging dilemma: Should they launch a facial recognition application that let unemployed people access their benefits – even though it often locked out women and people of color? And what would it feel like to be one of those users, facing eviction the next day because she couldn’t get her unemployment benefits?

As society ponders the dangers and unknowns of generative AI, a Cornell Provost Fellow is giving her statistics students a first-hand look at the implications of their decisions.

She has created an immersive video, “Nobody’s Fault,” designed to increase viewers’ empathy and awareness of unintended consequences. Karns introduced it to her Integrated Ethics in Data Science class at the end of the spring semester.

“Statistics students, who build large-scale predictive models, don’t have a good sense of what happens and who’s affected when you have a bad model,” said Karns, senior lecturer at the ILR School and at Cornell Bowers Computing and Information Science. “I’ve always had a problem with that disconnect.”

Students in the Integrated Ethics in Data Science class watching the interactive video “Nobody’s Fault.”

Credit: Serge Petchenyi, Center for Teaching Innovation, Cornell University

In “Nobody’s Fault,” students experience what it’s like to be a data scientist dealing with a moral conflict. The video stops from time to time, asking viewers how they would handle the tricky situations being depicted. As they make decisions, the plot shifts, and they see the consequences unfold – and how they affect an unemployed woman who can’t get the facial recognition application to work.

After a series of unhappy outcomes, the scene rewinds, better choices are offered, and students see how things could have been different for the woman seeking her benefits.

“The video gave us real-world experience with ethical dilemmas,” said Britt Snider, M.I.L.R. ’24. “It enhanced our learning of the subject by showing us in real time the consequences of our decisions – and how something as seemingly innocuous as a few percentage points could cause such a large consequence to society overall.”

Added Karns: “Data science has no regulatory agency, no qualifying exam, no degree requirements, no single professional code. That’s why it’s all the more important that people creating the technology have a moral compass.”

Karns developed the script, drawn from three cases. Then she adapted it, working with Martin Percy, a London-based director of interactive video who has won Webby and Clio awards. Percy participated via Zoom in each phase of the production, and co-directed the film with Karns.

The Center for Teaching Innovation (CTI) filmed the video, with instructional designers Ksenia Ionova and Amy Cheatle leading the production; the project also received support from CTI.

Karns hopes to produce two or three more videos; she also plans to create a facilitated experience to accompany them, so that they can be used by classes and instructors across Cornell.

Karns teaches “what personal decision-making looks like and why one’s virtues are ultimately what matter. Humans – data scientists – decide how these systems roll out,” she said. “At no point can we say, ‘It’s just the system, it’s just technology doing its thing.’ It’s always a human choice. And I want students to feel a bit more responsible for that.”

For Joyce Gorospe, M.I.L.R. ’24, one of the biggest takeaways from the course was that the responsibility for ethics comes from every individual, not just from the top.

“It’s one thing to read about ethical issues; it’s another to see them enacted by real humans and see the emotions behind their reactions,” Gorospe said. “It was easy for us to put ourselves in their shoes and imagine ourselves in situations where there may or may not be a clear answer, depending on how aware you are of these issues.”

Sandi Mulconry is a freelance writer and editor.
