Panel discussion at AI and Human Innovation Showcase explores ethical and practical problems of AI at universities

Who’s afraid of AI?
Not Emily Cherry Oliver, professor of theatre, co-director of UND’s AI and Human Innovation Initiative and one of the organizers of UND’s first AI and Human Innovation Showcase.
“Even if you’re afraid of AI, we wanted this event to showcase how important the human side of AI is,” said Cherry Oliver before the event. “There are a lot of interesting things happening with AI at UND, and we wanted to give people a platform to show us what they’re doing. I think there’s a lot we can learn from each other.”
On Friday, Feb. 28, in the Memorial Union, UND students, faculty and staff had the chance to explore artificial intelligence’s ethical limitations, as well as a platform to show their work and connect with potential collaborators around campus.
The daylong event featured a keynote speech, panel discussion featuring UND faculty, students and staff, and presentations of work involving AI happening at UND.
The advantages (and pitfalls) of artificial intelligence
After a virtual keynote speech from Sarah Newman, director of Art and Education at Harvard’s metaLAB, a panel discussion featuring UND staff, faculty and students was held.
In her talk, Newman emphasized the ethical and responsible use of AI. The panelists, in turn, discussed how they’re applying AI to their work and education, as well as AI’s newfound popularity among students and the pitfalls of over-relying on technology.
The panel was moderated by Anna Kinney, coordinator of the U Writing Program and co-director of the AI and Human Innovation Initiative. It featured Emily Wirkus, assistant professor of theatre; Ariann Rousu, 3D tech and Native heritage artist at the Computational Research Center; Shelbie Witte, Dean and professor at the College of Education & Human Development; and Jonathan Wirkkala, a master’s student in mathematics.
“The human interaction part of working with AI is the most valuable part,” said Rousu, who is the artist and designer behind the Native Dancer Project. “Without direction from human prompts, AI won’t do much.
“The problem that I’ve seen is what the AI produces is repetitive, so it’s important that I’m able to recognize that repetition when I communicate with it, or to reorganize my thoughts or the data set I’m trying to use.”
Rousu added that because her work on the Native Dancer Project — a VR game that takes Powwows to the metaverse — requires fidelity to a particular tradition and cultural environment, she must be vigilant and thorough while using AI.
Cultural and technological advisory boards have assisted Rousu throughout the project, and she says that the introduction of AI has led to many interesting conversations.
These often involve “how culture and technology overlap, and it’s been really interesting to see how to navigate these values,” she said. “AI can be a really cool tool, but we also have to step back and realize that sometimes, this isn’t the best step for us because the AI might not be able to recognize culturally important things that need to happen.”
Wirkkala, a graduate student in mathematics, said he also approaches AI with caution.
“AI tends to be the loudest voice in the room when it’s brought in,” he said. “There’s almost an appeal to authority, because AI is parsed through the entire internet, while I’ve read two Wikipedia articles.”
That, he said, is why introducing arts and humanities into conversations about AI is important.

AI making space for creativity
“It allows us to question things. You can question what it is, why it’s good or bad,” he said. “When you ask whether AI is doing a good or bad job at creating something, it helps you remain skeptical of it — which, I think, will prevent it from being the loudest voice in the room when it gets invoked.”
Emily Wirkus uses AI in her theatre instruction for this very reason: to help students understand the importance of their humanity and individuality.
“As an actor, no matter what role you’re playing, you are never not yourself,” she said. “That’s your biggest asset.”
AI, she said, struggles with context and subtext — concepts that typically come naturally to humans when engaging with works of art.
Then again, using AI can be a powerful tool when learning something like a monologue. “For some Shakespeare monologues, it was able to give constructive feedback really quickly,” Wirkus added. “By taking a lot of the nitty gritty roadblocks from the beginner process, it really inspired students to move in a direction that was productive.”
Wirkkala agreed, noting that using AI for simpler math problems can afford mathematicians more time and energy to find creative solutions. This, he says, isn’t a cheat code and still requires an understanding of the concepts involved.
“If you don’t know enough about something and you plug it into AI, you’re going to get the rote answer,” he said. “But, if you understand the subject well, it opens up avenues for creativity as opposed to being rigid. It offloads a lot of the technical work.”
Still, the panelists agreed that the adoption of AI in the classroom should be approached with caution. The well-documented biases of AI models and their potential to hinder students’ critical thinking and creativity were among a number of concerns expressed.
Witte, dean of the College of Education & Human Development, said there are real concerns about the impact AI will have on students. Moreover, she said, AI literacy — just like media literacy — should be a priority when preparing students.
“I think that there’s a legitimate concern about the lack of critical thinking and creativity, especially in K-12 schools with testing and assessment,” Witte said. “I would argue that AI is just exacerbating those concerns. We have a real ethical duty as educators to allow space for young people to figure out what AI means and how to use it critically.”
Appropriately for someone whose work involves training teachers, though, she believes that there are avenues for AI to bring people together if their minds remain open.
“If we can all think of ourselves as learners in a learning space, whether it be a classroom, a lab, a college course, I think collaboration just very easily will emerge from that.”
