Artificial intelligence (AI) platforms, especially chatbots like ChatGPT and Gemini, are influencing many areas of higher education. Students and instructors can use these tools for real-time, personalized help with nearly any academic task, from helping students study to revamping course content for instructors. The educational potential is significant, but so are the concerns about academic integrity and the consequences of students relying too heavily on these tools. While it’s important to weigh the pros and cons of chatbots, the right strategies and technologies can mitigate those risks while still allowing faculty and students to benefit from this burgeoning technology.
Use Cases for Chatbots in Higher Education Teaching
First, let’s consider uses among faculty, who can use chatbots to improve many areas of teaching and learning while saving time.
Chatbots can assist with different phases of course design, including writing learning objectives and developing and repurposing content. They can also be used to evaluate student performance data if your institution offers a secure chatbot (just be sure to anonymize the data first). Instructors can also use chatbots to review the rules and instructions for their assignments and tests, identifying vague or subjective text and suggesting clearer, more objective wording.
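Anonymizing data before sharing it with a chatbot can be largely automated. The sketch below, a minimal illustration rather than a complete de-identification pipeline, replaces student IDs with a one-way hash and drops name and email columns; the field names are assumptions, so adjust them to match your LMS export.

```python
import hashlib

def pseudonymize(records, id_field="student_id", drop_fields=("name", "email")):
    """Replace direct identifiers with a one-way hash and drop other PII fields.

    `id_field` and `drop_fields` are hypothetical column names; change them
    to match the actual export from your institution's LMS.
    """
    cleaned = []
    for rec in records:
        rec = dict(rec)  # copy so the original export is left untouched
        # A one-way hash keeps rows linkable across exports without exposing the real ID.
        rec[id_field] = hashlib.sha256(rec[id_field].encode()).hexdigest()[:12]
        for field in drop_fields:
            rec.pop(field, None)  # remove direct identifiers entirely
        cleaned.append(rec)
    return cleaned
```

Note that hashing a small, guessable ID space can be reversed by brute force, so for stronger protection you would mix in a secret salt kept outside the data file and follow your institution’s data-handling policy.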
But one of the most impactful uses is adapting course content into formats that support students with different learning needs. Chatbots can adapt, simplify and organize text for students with learning or cognitive disabilities, generate alternative text for multimedia content, and even adjust their responses based on student emotions (for example, offering words of encouragement when a student is frustrated or discouraged).
Instructors may be tempted to use chatbots to grade written coursework as well, but for now, the technology still struggles with the nuances of grading and feedback. However, chatbots can still help by pulling out an essay’s main ideas and flagging whether it meets basic criteria, giving instructors a clearer starting point before reading.
You’ll see plenty of jargon and articles that make prompt writing seem complicated, but you don’t need to be an expert to write effective prompts. It’s a skill you can pick up after a few tries, especially once you find what works for you and start creating your own templates to reuse across your courses.
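As an illustration, a reusable template for the assignment-review task described above might look like this (the bracketed fields are placeholders you fill in for each course; this is one possible phrasing, not a prescribed format):

```text
Act as an instructional designer for an undergraduate [SUBJECT] course.
Review the assignment instructions below. Rewrite them so they are clear,
objective and written at a [READING LEVEL] reading level. Keep all
requirements and point values unchanged, and list any wording a student
could reasonably interpret in more than one way.

[PASTE ASSIGNMENT INSTRUCTIONS HERE]
```

Once a template like this works well, you can save it and swap in new material each term rather than writing prompts from scratch.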
Students and Chatbots: Finding a Balance
There are many ways that students can use chatbots to support learning while steering clear of misusing them for tests and assignments. Ensuring ethical and beneficial adoption of AI requires clear strategies and guidelines along with the right technology.
Improving Academic Performance
Recent research indicates that chatbots can improve learning and academic outcomes, but balance is key. When students rely on them too heavily, the benefits, such as increased engagement, motivation and reflection, are reduced or eliminated. Used appropriately, AI chatbots can tutor students to help them better understand course material, which also increases engagement. Chatbots give students instant, personalized help with tasks such as summarizing content and checking grammar, and they create a nonjudgmental space where students feel more comfortable asking questions.
Cognitive Load Reduction and Stress Management
Chatbots are ideal for offloading tasks such as summarizing a lecture transcript or highlighting key points in an assignment. For students under pressure to complete assignments, leaning on AI can also reduce stress and anxiety by freeing up time to focus on the most important elements of the work.
Similarly, chatbots can help students visualize information by generating charts, graphs or images to support their ideas. They can also provide adaptive support for students with disabilities or language barriers by reading, translating, simplifying or reformatting content to meet their needs. This reduces cognitive load, allows students to focus on the assignment itself and builds confidence in their ability to complete the work.
Strategies for Reducing Chatbot Misuse
Because chatbots are so fast and easy to use, students may begin treating them as a shortcut to get work done rather than as a supplemental learning tool. That’s a big issue: it bypasses critical thinking, contextual understanding and collaboration, which not only cheats the student out of learning but also creates academic integrity issues that higher education institutions must address and stay ahead of.
Cheating is common in higher education, and with recent surveys indicating that 86% of students already use AI in their studies, there’s a good chance the two overlap. Students have mixed opinions about using AI for homework: around 40% say that using AI for research and writing assistance should be acceptable, provided there are limitations and ethical guidelines.
Communicating with students about academic integrity codes of conduct, AI-use policies, and the ramifications of cheating is the first step in setting expectations. Delivering assignments in ways that reduce opportunities to use chatbots, or that lay out specific guardrails for chatbot use, can also help.
Reducing Chatbot Cheating
Cheating in higher education will never completely go away, and efforts to monitor the tools and channels used for cheating must be ongoing. As more students turn to chatbots for both approved and unapproved uses, institutions can help students understand the appropriate way to use the technology within the learning environment.
While educators struggle to detect AI use reliably, AI detection tools can identify unedited chatbot-generated content with reasonable accuracy. Their effectiveness drops, however, when students edit the responses or run them through AI paraphrasing tools. Treat detectors as a gut check, not as definitive proof that text was AI-generated.
Remote proctoring during assessments and written assignments adds another barrier to cheating. Ideally, the proctoring solution should leverage AI test monitoring together with live human proctors, which gives faculty the control to prevent AI use and the flexibility to allow approved resources. For example, instructors can proctor exams or essays by restricting access to all unauthorized websites and software, including chatbots, while still providing access to specific materials like case studies or tools like Word, Excel and Google Docs.
Another option is to use scaffolded, realistic assignments and assessments that focus on real-world application and require students to connect course content to their personal experiences or context, making it harder for chatbots to generate accurate or meaningful responses. For example, assessments could ask students to develop a scenario based on a local community issue, work on collaborative projects that include peer reviews, or create video responses that reference specific class lectures or materials. This approach helps verify that students truly understand the material and are developing practical skills.
Set the Stage Now for an AI-Inclusive Future
AI chatbots are powerful tools that can make education more personalized, accessible and engaging. However, their misuse can undermine academic integrity and dilute learning. The solution, in most cases, isn’t to ban them outright, but to integrate them responsibly throughout teaching and learning processes. Striking this balance isn’t always easy, but it’s necessary to preserve the value of learning while preparing students for a future where AI will almost certainly be a part of their work.
Tyler Stike is the Director of Content at Honorlock and a doctoral student in educational technology at the University of Florida. In his role at Honorlock, he develops a wide range of content on online education, assessment, and accessibility. He is interested in how affective states influence learning and performance, and plans to research how AI can support adaptive learning experiences that help students manage those states.
References
W. Dai et al., “Can Large Language Models Provide Feedback to Students? A Case Study on ChatGPT,” 2023 IEEE International Conference on Advanced Learning Technologies (ICALT), Orem, UT, USA, 2023, pp. 323-325, doi: 10.1109/ICALT58122.2023.00100.
Stike, Tyler. “AI Prompting Examples, Templates, and Tips For Educators.” Resources for Education & Assessment, March 27, 2025. https://honorlock.com/blog/education-ai-prompt-writing/
Sánchez-Vera, Fulgencio. “Subject-Specialized Chatbot in Higher Education as a Tutor for Autonomous Exam Preparation: Analysis of the Impact on Academic Performance and Students’ Perception of Its Usefulness.” Education Sciences 15, no. 1 (2025): 26. https://doi.org/10.3390/educsci15010026
Kelly, Rhea. “Survey: 86% of Students Already Use AI in Their Studies.” Campus Technology, August 28, 2024. https://campustechnology.com/articles/2024/08/28/survey-86-of-students-already-use-ai-in-their-studies.aspx
McKearin, Cheryl. “Report on Student Attitudes Towards AI in Academia.” Learning Technology Solutions, April 5, 2024. https://learning.uic.edu/news-stories/report-on-student-attitudes-towards-ai-in-academia/
Kofinas, A. K., Tsay, C.-H., & Pike, D. (2025). The impact of generative AI on academic integrity of authentic assessments within a higher education context. British Journal of Educational Technology, 00, 1–28. https://doi.org/10.1111/bjet