When data isn't neutral

MSU students collaborate with Zekelman Holocaust Center to confront the hidden ethical history of STEM

Exterior photo of the Zekelman Holocaust Center
Courtesy photo: Zekelman Holocaust Center

Victor Piercey hadn’t thought much about the morality of actuarial science. A tour of the Zekelman Holocaust Center changed everything.

The Michigan State University alum, now a professor at Ferris State University, was stunned when his tour guide mentioned that Nazi concentration camps were insured. For a teacher in Ferris State’s actuarial science program, that fact was a gut punch.

“That immediately sent my imagination running,” Piercey said. “I thought of all the other ways STEM professionals played a role in this atrocity that was carried out on an industrial scale.”

Last semester, Piercey teamed up with MSU’s Computational Mathematics, Science and Engineering, or CMSE, department and the Computing Education Research Lab, or CERL, to help their students connect ethics to science, technology, engineering and mathematics fields.

Together, they researched STEM professionals and the role they played in Nazi Germany’s mass genocide. Then, partnering with the Zekelman Holocaust Center, they applied the modern ethical guidelines of a relevant STEM profession to examine their subjects’ actions. For example, an actuary at that time was tasked with assessing the financial risk of a concentration camp.

The project is part of a growing effort to integrate ethics into data science and other CMSE courses. Danny Caballero, Lappan-Phillips Endowed Professor of Math and Science Education and a CERL principal investigator, wants students to understand that data sets, artificial intelligence (AI) and computer science all have the capacity to harm people, and that the way they use science and data could have major consequences.

“I think the key thing is informing students about what’s going on with these tools and how they can be used,” Caballero said.

Rachel Frisbie, a fixed-term assistant professor in CMSE and the department’s graduate program director, and Rachel Roca, a CERL Ph.D. candidate, were collaborating on another project when they met Piercey. Together, the three hatched a plan to bring their students together for a deep dive into ethics.

MSU’s Honors College and Ferris’s Innovation Accelerator Grant helped cover costs to ensure students paid nothing out of pocket for the project.

Along with Emily Bolger, also of CERL, they arranged for students to watch a presentation from the Shoah Foundation, which maintains a visual archive of video testimonies from genocide survivors, to gain perspective on the tragedy. MSU and Ferris students toured the Zekelman Holocaust Center, where they also heard from a Holocaust survivor’s descendant.

Exhibit inside the Zekelman Holocaust Center
Courtesy photo: Zekelman Holocaust Center

Then, each student chose a STEM professional and researched that person’s role during the Holocaust. The project culminated when the students presented their research during a poster session at the semester’s end.

Students across disciplines were surprised to find how the atrocities of the Holocaust were connected to their own fields. They struggled to reconcile how positive research outcomes, such as advances in vaccines and genetics, were generated from morally reprehensible decisions.

“Students grappled hard with the moral implications,” Bolger said. “How can something be two things at once?”

Roots in murky waters

The world of data sets has a dark past. Some of its processes and equations were shaped nearly a century ago by eugenicists, adherents of a pseudoscientific movement rooted in racist and ableist beliefs and aimed at “improving” the human species. Even significance testing was first developed to identify racial differences. While data is often viewed as neutral, the scientists who develop the data sets can impart their own prejudice.

“A lot of this work comes from eugenicists, and we don’t talk about that,” Roca said. “We teach the process and the equation and what it does, but we don’t teach the motivation behind it, and who developed it.”

One of the most famous examples is the widely used iris dataset. Published by the statistician R.A. Fisher in 1936, the dataset records petal and sepal measurements from three species of iris flowers, originally used to tell the species apart. The dataset is still used in teaching and machine learning today, but it was first published in the journal Annals of Eugenics.
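The dataset’s staying power is easy to see: it ships with standard machine learning libraries. A minimal Python sketch, using scikit-learn’s bundled copy of the data:

```python
# Minimal sketch: Fisher's iris dataset still ships with scikit-learn.
from sklearn.datasets import load_iris

iris = load_iris()
print(iris.data.shape)      # (150, 4): sepal and petal measurements
print(iris.target_names)    # ['setosa' 'versicolor' 'virginica']
# The bundled description credits Fisher's 1936 paper.
print(iris.DESCR[:300])
```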

As Roca toured the Zekelman Center, she spotted a picture from a science textbook teaching about racial superiority by using flowers, just like the iris dataset.

“Flowers are not always flowers,” Roca said.

Data may be neutral, but the person who collects it may not be. Every piece of information comes from a specific context and environment, with social and political implications, and the data can then be used with biased intent. For example, charts and graphs that include only male and female categories leave out people outside the gender binary.
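To see how a binary schema drops people in practice, here is a minimal Python sketch on hypothetical survey data; the column and category names are invented for illustration:

```python
import pandas as pd

# Hypothetical survey responses; names and categories are illustrative.
responses = pd.DataFrame({
    "respondent": ["A", "B", "C", "D"],
    "gender": ["female", "male", "nonbinary", "female"],
})

# A chart that allows only two categories silently excludes everyone else.
binary_only = responses[responses["gender"].isin(["female", "male"])]
excluded = len(responses) - len(binary_only)
print(f"{excluded} respondent(s) dropped by the binary schema")
print(binary_only["gender"].value_counts())
```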

Roca encourages students to look at data and ask themselves: Who’s being elevated? Who’s being left out?

Inside the Zekelman Holocaust Center
Courtesy photo: Zekelman Holocaust Center

The class also drives students to consider real-world implications of models and computations, Bolger said. You might not know the implications of a decision made within a silo until it’s too late. The Holocaust project was a touchpoint for students to connect history to real decisions made by scientists and STEM professionals. The course puts the students in the position of scientists almost a century earlier and asks them to handle ethical questions during a challenging political period.

“We want our students to grapple with what it means to become a practitioner of data out in the world,” Frisbie said. “We want them to engage in these conversations so that they can develop this sense for themselves. We can empower them to develop an internal compass.”

Preparing for moral dilemmas

Ethical dilemmas may come fast and furious. As data science students enter the workforce, they will face the rise of AI, and the question before them is how to move forward. For example, some facial recognition technology has been shown to exhibit racial bias, and another technology claims to screen ideal job candidates based on facial characteristics.
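One concrete habit students can carry into such settings is to disaggregate a model’s errors by group instead of reporting a single overall accuracy. A minimal Python sketch on hypothetical predictions; the groups, labels and model outputs here are all invented for illustration:

```python
import pandas as pd

# Hypothetical model outputs; every value is invented for illustration.
results = pd.DataFrame({
    "group":     ["A", "A", "A", "A", "B", "B", "B", "B"],
    "label":     [0, 0, 0, 1, 0, 0, 0, 1],   # ground truth
    "predicted": [0, 1, 1, 1, 0, 0, 0, 1],   # model decision
})

# False-positive rate per group: how often negatives are wrongly flagged.
for group, rows in results.groupby("group"):
    negatives = rows[rows["label"] == 0]
    fpr = (negatives["predicted"] == 1).mean()
    print(f"group {group}: false-positive rate = {fpr:.2f}")
```

In this toy example, the model wrongly flags group A far more often than group B, a disparity that a single aggregate score would hide.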

Students entering the corporate world will face pressure to adopt generative AI in their day-to-day work, often without time to question its accuracy, bias or environmental impact.

Even casual social media users are subjected to algorithms and data collection that shape what they see when they scroll YouTube Shorts or TikTok.

“We’re not taking space to talk about the nuance and help students understand these issues,” Frisbie said. “You might have to deal with this tool, but you should also know what’s going on behind the scenes so that you can make educated decisions.”

Roca hopes the lessons students learned in their class will help them stop and think. They might not be able to solve the world’s problems on their own, but they can ask questions. They can start discussions and encourage their colleagues to look below the surface.

The world of STEM will always come with ethical dilemmas. MSU faculty hope that when students encounter them, they remember science has never existed in a vacuum. Looking away is also a decision, but confronting these difficult situations will benefit both current practice and future generations.