By Katrina Dix
Generative artificial intelligence like ChatGPT has changed learning, with most students thinking of it as more knowledgeable than they are. But two sixth-grade math classes in Florida got a chance to flip that script as part of a study Old Dominion University professor Jinhee Kim, Ph.D., co-authored last year.
“Usually, we think about AI as a tutor shepherding students’ learning process,” said Dr. Kim, who works within the Instructional Design and Technology program in the Darden College of Education and Professional Studies. “But I wanted to try something else, so I collaborated with the University of Florida to design AI that the students could teach, instead — a less-knowledgeable peer.”
This experiment was just one aspect of a broad research agenda that has carried Dr. Kim from her initial work with robot puppies in grad school into the growing role of AI in teaching and learning.
Now, she and graduate assistant Kay “Rita” Detrick have worked to create and test an emotional support AI for stressed-out students; measure students’ physiological responses while interacting with AI; and evaluate the ways students with different levels of academic accomplishment and AI literacy worked with AI, among other research throughout the U.S. and Asia.
For the math class study, Dr. Kim and her colleagues designed a “teachable agent” that asked students for help with a set of math problems. Observers in the classroom reported excitement and deep engagement with the tool, with some students asking if they could use it at home.
In surveys, students said basic concepts became clearer as they explained them to the AI. They also reported feeling more responsibility and ownership of the learning process when they were in charge of teaching someone who asked, in a friendly and unintimidating way, for more details on how to find answers.
Dr. Kim describes her research area as “human-AI interaction in education,” designed to analyze the various ways students work with AI as well as create technology based on the best practices that are starting to emerge.
“More and more, it takes me back to the fundamental questions,” Dr. Kim said. “The more I conduct research in the field of AI in education, the more curious I become about what kind of education this country and the international community aspire to achieve.”
She and Detrick, who primarily studies educational psychology, both emphasize the importance of teaching AI literacy and critical thinking alongside technology.
“Right now, it’s still very reactionary, how AI is being applied,” Detrick said. “A lot of educators don’t have and aren’t getting training on how to use it or how to integrate it, and so it’s being stuck in wherever it seems to make sense.”
Their research tries to assess how and why AI is being used in the classroom and then build pedagogical scaffolding into the process.
In one recent project, published this summer, the pair evaluated graduate students for whom English is a second language as they used a custom generative AI for academic writing. The students expected a lot from their chat buddies: analyzing a variety of information sources, evaluating the relevance of generated content, and creating unique, personalized content. But they also recognized their own need for logical reasoning, as well as the ability to question GenAI's assumptions and conclusions.
“Obviously, we have to really highlight the importance of designing an ethical AI, but at the same time, we also have to highlight the importance of AI literacy among the students,” Dr. Kim said. “We, humans, are going to be the leading users of AI, not have AI leading us. The user should be critical enough to know when and how to utilize AI.”
Detrick added, “It comes down to whether or not a student just copies and pastes whatever AI spits out, or whether they thoughtfully consider it, critique it, revise it, and figure out the good bits they can use out of it and how to build on it.”
When it comes to how to create that literacy, the researchers think AI can help there, too. Dr. Kim has started work on a virtual classroom for future teachers, where AI creates a simulation and observes pre-service teachers’ classroom management and instruction skills.
“Developers who want to create these types of AI tools for the education space are going to have to co-create these things with educators and students and researchers,” Detrick said. “It points to this big collaborative space, an ecosystem that has to be developed.”
The researchers hope to contribute to that ecosystem and ensure it will be theoretically grounded, with human-centered AI systems that enhance learning.
“Long story short, to me, educational goals should be well-established first, so that we can design educationally relevant AI technology,” Dr. Kim said. “When it comes to AI in education, I think we really need to have a good balance and interplay between technology and the noble goals of education.”