The new study buddy: AI is becoming a tutor for some College of Natural Science students

From ChatGPT to a pilot AI tutoring program, more students and faculty are embracing what they say is the future of student learning

By Bethany Mauger

Students in front of computers in a classroom with screens that say "Prompt AI"
In coding classes, faculty often use ChatGPT and generative AI and teach students how to prompt AI to generate the correct code, rather than asking them to write the code themselves. Photo credit: Finn Gomez

When Michele Weston is stuck on a statistics problem, she opens a new ChatGPT conversation.

The Michigan State University plant biology graduate student crafts a prompt asking the chatbot to explain the problem to her. If she still doesn’t understand, she asks a follow-up question. Her goal isn’t to get ChatGPT to spit out the right answer as quickly as possible. The point is to understand the course material. 

Before generative artificial intelligence, or AI, entered the scene, Weston’s only option for extra help would have been visiting university help rooms or making an appointment during her professor’s office hours. Now, Weston and other students have instant access to help with their homework and studies.

“I call ChatGPT a tutor because it’s an ongoing conversation,” Weston said. “My intention is to truly understand what I’m doing, and those are the types of questions I ask.”

The rise of generative AI over the last few years has forced students and faculty alike to grapple with how this technology will, for better or worse, change education. With the release of ChatGPT in 2022, they were handed a new tool that could forever transform our classrooms, our study habits and our approach to learning.

Now, some students are using generative AI for help with homework and studying for tests. They’re reaching out to ChatGPT when a concept stumps them in the middle of the night or over the weekend. And an MSU pilot program is testing a tutoring AI program called Khanmigo designed to coach students in math and science without giving them the answers.

Stephen Thomas, digital curriculum coordinator for the MSU College of Natural Science, compared the introduction of generative AI to the early days of the internet. At that time, faculty worried that students could easily search for information on Google or Wikipedia instead of memorizing it for themselves. This shift led to a greater focus on critical thinking and synthesizing information.

“Now the question is, what does expertise look like when you have something synthesizing and producing information for you?” Thomas said. “It might look more like evaluation, being able to determine authority and performing those types of cognitive tasks.”

An MSU professor stands in a classroom next to a projector
Assistant Professor Nathan Haut teaches his Computational Modeling and Data Analysis class. Photo credit: Finn Gomez

Solving the Two Sigma Problem

Some fear AI is just a new way for students to cheat. Instead of writing their own research papers, students could write a prompt and ask ChatGPT to write it for them. Solving a math problem could be reduced to plugging the problem into a chatbot and generating the answer.

Thomas said AI isn’t without its unknowns or dark corners. Still, he said AI has the potential to solve what’s known as the Two Sigma Problem. In 1984, educational psychologist Benjamin Bloom argued in a paper that students who receive one-on-one tutoring achieve better outcomes than students taught in a conventional classroom. Delivering that kind of tutoring at scale was impossible for most institutions, at least until AI entered the scene.

Last spring, the Evidence Driven Learning Innovation, or EDLI, team invited a pilot group of 80 students to try an AI tutor for a semester. They gave each student an account with Khan Academy’s AI tutor, Khanmigo, for math and science.

The program doesn’t automatically give answers but takes a Socratic approach, asking students questions and walking them through the problem-solving process. All along, it checks in to make sure students understand what they’re doing.

The results were impressive. In a survey, students said Khanmigo helped them understand unfamiliar concepts and gave them step-by-step guidance to solve problems without giving them the answers. They said they performed better on assessments than they would have without the program.

As a result, MSU is offering Khanmigo to a larger pilot group of about 800 students. Faculty and staff are closely monitoring how it’s used and what results it produces as they determine whether to widen access to more students.

“As a land grant institution, our mission is to have a large impact on everyday people,” Thomas said. “That’s the promise of what it looks like for students as we dip our toe into having a one-on-one AI tutor.”

Jane Zimmerman, a specialist in the Department of Mathematics, teaches Math 103 A and B, a two-part course with small class sizes that teaches algebra at a slower pace. Students in her class often think they’re not good at math and are embarrassed to see a tutor or visit office hours, she said.

Zimmerman decided to offer her students Khanmigo last fall as a low-risk way for them to ask questions and clarify difficult math problems.

“A lot of the students placed in this course are experiencing some level of imposter syndrome,” Zimmerman said. “Their math self-images are in the toilet. When you’re in that situation, reaching out for help is very difficult.”

While Zimmerman doesn’t ban students from using ChatGPT, she’s clear that using AI as a homework shortcut won’t help them. She grades students based on their mastery, with no points for homework or attendance. Her assessments are all in person and require students to show their work and explain their reasoning. If they don’t know the material, their test scores will reflect it.

A professor teaches in a classroom.
Assistant Professor Nathan Haut teaches his Computational Modeling and Data Analysis class. Photo credit: Finn Gomez

Studying with ChatGPT

Students who aren’t in the Khanmigo pilot are also looking to AI as a stand-in tutor. Abby McGinnis, a senior supply chain major, said she’s found herself using ChatGPT multiple times a day just in the last several months. She uses it to check for coding errors in her computational math classes or to spruce up her writing in management classes.

Using AI saves her the time she otherwise would have spent going to office hours or scouring the internet for help.

Mark Endicott, a fourth-year data science major, keeps ChatGPT pinned in his browser. Throughout his day, he asks the chatbot to confirm facts, generate spreadsheets and check his writing in anything from emails to reports.

Sometimes, the students wonder if they’re becoming too reliant on AI. Jisha Goyal, a third-year computational data science major, has also noticed a decline in the collaborative schoolwork she used to do. Before she used ChatGPT, she and her friends worked together, asking each other questions when they didn’t understand an assignment. Now, it’s easier to just ask AI, she said.

Still, McGinnis said students can’t afford to ignore AI.

“I think it’s such a good tool,” she said. “If it’s something we’re going to have in our jobs and in our future, we might as well learn how to use it.”

ChatGPT can also be a godsend to students studying late into the night. Assistant Professor Nathan Haut said he used to get desperate emails at 2 a.m. from students panicking over an assignment in his Computational Modeling and Data Analysis class. Now, they can get help any time, even when their professors are sleeping.

Ethical questions

For all its benefits, ChatGPT doesn’t come without its concerns. Thomas said administrators are grappling with what it means to not have control over the responses AI tutors give students. A chatbot might “hallucinate,” referencing a case study that doesn’t exist or completely making up a paper to support a student’s research. There’s also the possibility that it could say something off the wall that has nothing to do with the prompt.

“I’m uncomfortable that it might give someone an incorrect fact at any point,” Thomas said. “At the same time, how many times have I heard something incorrect without AI?”

Weston grapples with ethics as a student. She’s always conscious of the boundary between plagiarism and original thoughts when she uses AI to help edit her writing assignments, especially considering AI detection software used by some faculty.  She keeps a log of her chat histories in case there’s ever a question.

“If I’m using it to edit my writing, will that cause it to be flagged as if I used it to write the piece itself?” Weston said. “And can I defend myself if that were the case?”

Students must also grapple with bias in the information churned out by ChatGPT, said Dirk Colbry, senior specialist in the Computational Mathematics, Science and Engineering Department. Generative AI models like ChatGPT are trained on data that contains biases. When students ask these tools questions, the answers will carry those same presuppositions and assumptions.

Colbry is also concerned about the energy used to power the computers behind ChatGPT. While training a generative AI model uses far more energy than answering questions, the cost still adds up when millions of people are using it.

These problems don’t necessarily mean no one should use generative AI, but they shouldn’t be ignored either.

“The conversation has to be had,” Colbry said. “We need that mindfulness so that every student knows the cost. Then when they get into the workforce, they can be part of the solution instead of sweeping it under the rug.”