Generative AI is here and is creating challenges in higher education (Balch, 2023). As instructors, we are struggling with the need to teach our students necessary and marketable skills for their evolving world (e.g., Balch & Blanck, 2025; Butulis, 2023; Parks & Oslick, 2024; Schoeder, 2024), while maintaining academic rigor and integrity. Our students are aware of and are using generative AI in many ways (Kichizo Terry, 2023), and generative AI may change the way that we and our students perceive the role of instructors in higher education (Saucier, 2025). I share my recent experiences with and reflections on my students using generative AI in my classes below in the hopes that the lessons I have learned will help other instructors navigate similar experiences with students in their classes.
At the end of the semester, I received an email from a student in my general psychology course asking what he should do about a paper he wanted to turn in. The paper was an article summary and reflection paper that students can complete in my course in lieu of participating in research. The research participation requirement, and thus these papers, do not count for points – the research requirement simply has to be completed (I will note here that this is a department policy for all general psychology courses – not a policy I personally created for my course). He had run his paper through an AI checker “just for fun,” and it reported that his paper was 43% written by AI. He wondered how to proceed because, he said, he had not used AI to write the paper and did not want to get into trouble. Knowing that AI checkers are not that reliable, I told him that if he had not used AI to write his paper, then he would be fine. He submitted the paper (and a couple of others), and I quickly forgot about his email in the chaos of the end of the semester.
In processing the papers submitted by the 206 students enrolled in my class, I later discovered an issue. The paper assignment asks students to retrieve, read, and write a summary and reflection on research articles published in the current calendar year in one of two specified journals. The first part of the assignment asks them to provide the reference for the article they used for their paper. Due to my questionable decision not to spend much time on APA format in my course, many of the students provided references in various states of incompletion and disrepair. Consequently, I copied and pasted the article information they did provide into Google Scholar to check whether the articles used for each paper were in fact from one of the specified journals for the current calendar year. During this tedious task, I found that some articles were not from the specified journals, and some were not from the current calendar year. This was not surprising. What was surprising was that some of the articles (including the three used by the student above) did not exist.
Through many email exchanges and Zoom meetings over the next several days with the two students who had used non-existent articles for their papers, I discovered that each had used generative AI to complete their papers. Three times, the student above put the directions from my posted assignment into generative AI, and each time generative AI obligingly produced something the student thought was an article from one of the specified journals in the current calendar year. The articles were authored by “Smith,” “Johnson,” and, my personal favorite, “Doe.” The student then clarified that he had requested a summary of each article, not the actual articles, to help him write the papers in his “own words.” What was interesting to me was that, in reviewing the papers submitted, I do believe the student wrote the papers as I directed, but used the article summaries provided by AI rather than the full articles. I fully acknowledge that I cannot perfectly determine by reading them whether papers were written by students or by AI, but the ideas, language, and grammar errors in the papers lead me to believe my student on that point. My student did acknowledge that he knew this was wrong, but he was overwhelmed by his workload at the end of the semester and made a bad decision for papers worth no points. The other student told me that he had uploaded articles into generative AI and asked for summaries that fit the assignment directions, and he said that the AI botched the references, which was why the papers did not exist. Given that the articles the student later provided me, which he said he had uploaded to AI, had different authors and topics from what he reported on his originally submitted papers, I suspect the student was not completely honest with me.
Both students seemed genuinely shocked that AI would retrieve articles for them that did not exist. Both students acknowledged that they knew what they were doing was not allowed and that they had violated our university’s honor code. Both students noted that end-of-semester stress and the low stakes of the assignment were factors in their decisions to employ AI. Especially given that the papers for which they had used AI were worth no points in my course, I treated these as teachable moments. I discussed with each student the issues with their use of AI, and they seemed receptive to this conversation. Each promised not to use AI for coursework ever again. I am not optimistic enough to think that this is a binding agreement, but I do think both students were affected by this experience. I allowed each student the chance to redo their papers without the use of AI.
Lessons Learned
These experiences were unsettling for me and caused me to think deeply about AI and how I could adjust my courses to avoid future situations like these. These are a few of the lessons I will take with me from these situations:
First, I will have a useful AI policy in my courses that helps students better understand what is and is not allowable.
Every course should have a clearly stated AI policy – and my course did. My AI policy stated that students are prohibited from using AI for any coursework unless I explicitly tell them they may do so. I thought this provided enough guidance for my students (and flexibility for me if I ever wanted my students to use it for something), but the first of the students above saw no ethical difference between using AI to find research articles and using a database like Google Scholar or PsycINFO. Rather than generally telling students to “use library resources,” providing explicit directions to use specific library resources would have been valuable. It would also be valuable to discuss how AI may or may not be used for “idea generation” (and what that means), as opposed to using it to complete coursework.
Second, I will discuss why I’m not allowing AI for at least some of the coursework in my class.
While AI and the skills to use it effectively and appropriately are important, they are not important for everything. I will discuss with my students the necessity of their independently learning content knowledge and skills without the use of AI. Relatedly, I will discuss with my students that AI often is not able to do the specified tasks reliably (e.g., finding relevant articles that actually exist). I will explain to my students that my policies and course decisions are designed to help them learn, not to provide barriers to their success. This will highlight that my purpose is to engage and support my students in their learning (Saucier, 2019; Saucier, 2022).
Third, I will have clear and actionable plans for consequences if students violate my stated AI policy.
One of the issues I encountered was what to do with the students who violated my AI policy. When reporting cases to our university Honor and Integrity system, we are asked to recommend a sanction. I was not sure what an appropriate sanction would be in these situations in which students used AI inappropriately to complete papers worth no points. I could have recommended that the students fail the course due to an academic integrity violation, but this seemed too harsh to me (and I acknowledge some readers may disagree) without my having indicated in my syllabus that this penalty would be sought in cases like these. My future AI policies will be more explicit in how I pursue consequences when the policies are violated.
Fourth, I will try to support my students by reducing end-of-semester stress in my courses.
As part of empathetic course design (Saucier, Jones, Schiffer, & Renken, 2022), I recommend spacing out assessments. I recommend using more lower stakes and fewer higher stakes assessments. I recommend not having a high proportion of course points due in the last weeks of class. I recommend using the minimum effective dose of assessment, such as by setting maximums (e.g., no more than two pages) rather than minimums (e.g., at least three pages) on assignments – which is a decision that also streamlines and positively affects my experience in processing my students’ work.
Conclusion
Generative AI is here and our students are using it in our courses. We should anticipate having to deal with our students using AI in ways that may surprise us. Accordingly, our policies and teaching practices should be designed to help them (and us!) navigate the situations in which our students use AI in unauthorized and inappropriate ways. This will make their experiences better as learners and our experiences better as teachers.
N.B. I uploaded my conclusion to ChatGPT and asked if it agreed. It did, concluding that, “The goal should be to help students become savvy, ethical users of AI while maintaining the integrity of the learning process.” Well said.
Donald A. Saucier, Ph.D. (2001, University of Vermont) is a University Distinguished Teaching Scholar and Professor of Psychological Sciences at Kansas State University. Saucier has published more than 100 peer-reviewed journal articles and is a Fellow of the Society for Personality and Social Psychology, the Society for the Psychological Study of Social Issues, the Society for Experimental Social Psychology, and the Midwestern Psychological Association. His awards and honors include the University Distinguished Faculty Award for Mentoring of Undergraduate Students in Research, the Presidential Award for Excellence in Undergraduate Teaching, and the Society for the Psychological Study of Social Issues Teaching Resource Prize. Saucier is also the Faculty Associate Director of the Teaching and Learning Center at Kansas State University and offers a YouTube channel called “Engage the Sage” that describes his teaching philosophy, practices, and experiences.
References
Balch, D. E. (2023, August 25th). Artificial intelligence: The rise of ChatGPT and its implications. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/artificial-intelligence-the-rise-of-chatgpt-and-its-implications/
Balch, D. E., & Blanck, R. (2025, March 31st). AI-powered teaching: Practical tools for community college faculty. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/ai-powered-teaching-practical-tools-for-community-college-faculty/
Butulis, M. (2023, December 6th). Embracing artificial intelligence in the classroom. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/embracing-artificial-intelligence-in-the-classroom/
Kichizo Terry, O. (2023, May 12th). I’m a student. You have no idea how much we’re using ChatGPT. The Chronicle of Higher Education. https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt
Parks, M., & Oslick, M. E. (2024, May 22nd). Empowering student learning: Navigating artificial intelligence in the college classroom. Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/empowering-student-learning-navigating-artificial-intelligence-in-the-college-classroom/
Saucier, D. A. (2019, September 9th). Bringing PEACE to the classroom. Faculty Focus. https://www.facultyfocus.com/articles/philosophy-of-teaching/bringing-peace-to-the-classroom/
Saucier, D. A. (2022, February 23rd). Bringing PEACE to support all students. Inside Higher Ed. https://www.insidehighered.com/views/2022/02/23/professors-should-learn-about-respond-students-unique-experiences-opinion
Saucier, D. A. (2025, May 12th). What can college instructors offer their students in the age of AI? Faculty Focus. https://www.facultyfocus.com/articles/teaching-with-technology-articles/what-can-college-instructors-offer-their-students-in-the-age-of-ai/
Saucier, D. A., Jones, T. L., Schiffer, A. A., & Renken, N. D. (2022). The empathetic course design perspective. Applied Economics Teaching Resources, 4(4), 101-111.