What Happens When AI Enters the Classroom

As generative AI (GenAI) becomes a staple in student learning, institutions are grappling with how to adapt their pedagogies without compromising academic integrity or human creativity. This shift presents a unique opportunity to reimagine the role faculty, curricula and even the humanities play in shaping thoughtful, critically engaged learners. In this interview, Victor Taylor discusses the impact of students using generative AI tools and the humanities’ evolving role in an AI-driven landscape.
The EvoLLLution (Evo): What are the big issues in GenAI technologies and how do they connect to student academic success?
Victor Taylor (VT): One issue that receives a great deal of attention is how fast the technology is advancing and how accessible it is to students. AI learning apps proliferate, and students can customize vendor products to accomplish all kinds of learning goals. It’s not surprising that a very high percentage of college students report using GenAI in their courses—up to 80% according to a recent Chegg survey. They use it for note taking, creating podcasts and research (primarily as a search engine), and they also use it to complete their assignments. I think there’s one overarching concern regarding the latter: the extent to which students are using artificial intelligence technologies to support, not replace, their own thinking and effort in academic work.
Evo: What informs your interest in GenAI and student success?
VT: To provide some context, my background is in interdisciplinary humanities, and for many years I taught courses in literature, philosophy, religion and writing. When we consider which academic areas GenAI applications for student learning have most significantly impacted, it is safe to say they are largely these. We’re now seeing a heightened interest in ethics and technology that goes beyond the topic of plagiarism.
In this sense, the humanities have an enhanced role to play in understanding the future of artificial intelligence. For example, to assist with reading assignments in text-centered fields, students widely use GenAI tools that can quickly and accurately summarize and outline complex, lengthy texts like poems and philosophical works. Teaching critical or close reading skills, a foundational part of college-level learning, needs to account for this use, and we need to develop new pedagogies that reaffirm the importance of student literacy and analytical proficiency. Some of my colleagues may disagree with me, but I think the rise of artificial intelligence could be a great opportunity to show the value of the arts and humanities and the overall importance of human creativity and thinking.
Evo: Platforms like ChatGPT and Storm have become very good at writing in specific disciplinary contexts. How is higher ed addressing that?
VT: What is true for reading is also true for student writing, which is everyone’s concern across disciplines. This technology can generate assignments instantly and, depending on the quality of the prompt, the result can be very good. ChatGPT has written essays that have passed high-level professional examinations. Given this capability, we can ask the following: Are students writing their own papers and lab reports? Or, more accurately, how much and in what ways are students using GenAI when writing papers, lab reports and so on? The first question assumes academic dishonesty, which I don’t think is helpful. The second captures the reality of GenAI use in academic writing and students’ well-intentioned adoption of an effective technology. In both reading and writing situations, we need to better understand why and how students use GenAI tools in their academic work.
Now that I’m in an administrative position and leading a working group on GenAI at the university, I’m seeing these issues across campus. From this perspective, my approach is to advocate for developing student GenAI literacy and competency across curricula as part of our shared learning outcomes. The proverbial toothpaste is out of the tube and, instead of watching an endless battle between GenAI detection software and detection-avoidance features, let’s teach students to use these powerful tools appropriately and ethically to augment their learning experience.
I’ve oversimplified it in my answer, of course, but moving in the GenAI literacy and competency direction more effectively addresses concerns about students overusing, over-relying on or even misusing artificial intelligence technologies—technologies that will only become more powerful, easier to use and more advanced.
Evo: We don’t want to replace student thinking with GenAI, so how do we integrate GenAI as a tool to enhance teaching and learning without replacing human educators?
VT: That is the other major concern in higher education: the extent to which GenAI or, in the future, GAI—general artificial intelligence—will replace faculty or human workers in general. If GAI will someday do half of what technology experts and computer scientists predict, it will be a serious concern, but we aren’t there or close to being there yet. However, the question points us in the right direction. GenAI or a future GAI is a tool, and humans have evolved with and through tools. I predict we’ll continue to do that. Of course, enhanced artificial intelligence will be a tremendous challenge because it’s a tool that is or will be a simulation of us—a co-intelligence. Nevertheless, artificial intelligence will serve a human interest—unless we cross over into Philip K. Dick territory.
This is where ethics will be paramount. As we create these tools, we will need guardrails and constraints. The subdiscipline of ethics can inform and shape what the future of artificial intelligence looks like. For instance, we can entertain two questions: “How can we advance artificial intelligence to replace humans in the workforce?” or “How can we advance artificial intelligence to support and assist humans in the workforce?” The second question is the more interesting one to think through and, I hope, the more likely scenario. It is often said that GenAI will not replace you in your job, but a person who knows how to use GenAI effectively and in an advanced way will.
Evo: How can co-intelligence with GenAI support personalized learning experiences?
VT: I’ll resist a futurist impulse here, but imagine a slightly more advanced version of our current GenAI options that could significantly function as a friction reducer for faculty and students. For example, a professor could use GenAI today to customize individual student learning. This application could be significantly augmented in the future to reinforce spiraled or interconnecting curricula and student learning outcomes. I regularly taught a 200-level critical theory course that covered many major approaches to literature, like psychoanalysis, feminist theory, and postcolonial and decolonial theory. In 300- or 400-level courses, the same students would encounter authors utilizing or referencing these theories. So, imagine implementing agentic artificial intelligence—an AI bot—that would function as a personal learning assistant.
The AI bot could email or call the student saying something to the effect of, “Good morning, Kelly, tomorrow’s reading assignment for LIT 400 references the work of Slavoj Žižek, a major Lacanian thinker. Would you like me to send you a link to the paper you wrote about him in LIT 210, along with your class notes and professor’s slides from the psychoanalysis unit? You also have an assignment for your rhetoric class due tomorrow. It is an outline for an argumentative essay. Would you like me to send you the rubric for argumentative essays from your expository writing class last spring? Finally, don’t forget to call your grandmother. It’s her birthday today.”
Artificial intelligence in this example would greatly reinforce and connect learning across curricula, facilitating knowledge and skills portability. The AI bot would help the student arrive much better prepared for class, which would in turn give the professor and student more time to engage deeply with the course material. It would keep Grandma happy, too.
Evo: You mentioned that you are in an administrative role; what should university leadership do to help faculty and students adjust to the new world of artificial intelligence?
VT: When artificial intelligence was becoming more of a concern across campuses, many universities formed committees and charged them with writing white papers. What is interesting is that many of these white papers resisted spelling out a GenAI-specific policy. In fact, it’s common to see in these studies a conclusion that points to the rapid advancement of the technology and the need to support faculty, students and staff in thinking through GenAI’s capabilities and applications. In some instances, universities decided not to write new plagiarism policies, since the standing policy already covered academic dishonesty. There are examples of the contrary approach, too—strict policies and prohibitions.
From my perspective, the approach that makes the most sense involves conversation: bringing a wide range of constituencies together to work through present and potential future GenAI challenges. Realistically, the question is not whether students are using GenAI in their academic work but how and why they are using it. It is very context-specific and tied to student learning outcomes and expected competencies. University leadership should encourage conversation and debate and facilitate faculty, staff and student engagement. As I mentioned earlier, we need to create a culture in which artificial intelligence is viewed as a co-intelligence, a support for human creativity and thinking, not a replacement. I’m optimistic about GenAI’s future in higher ed and its overall positive impact.