Are you one of the reported 61% of higher education faculty now using AI in your teaching (Weaver, 2025)? A recent survey by the Digital Education Council (2025) found that 88% of AI-using faculty report minimal to moderate use. Further, 83% of faculty question students’ ability to evaluate AI-generated content, and 82% worry about student overreliance on AI tools.
So, while a majority of faculty are incorporating AI, many of us remain cautious about how to use it effectively in our higher education classrooms. This tension is echoed in the 2025 EDUCAUSE AI Landscape Study, which reports that 57% of institutions, up from 49% the year before, now identify AI as a “strategic priority” as they adapt to AI’s expanding impact on teaching and learning (Robert & McCormack, 2025).
Our institutions want us to use AI in our classrooms, but how can we do this well? Research by Zhou and Peng (2025) found that AI-supported instruction can enhance both student engagement and creativity, especially when it creates personalized and collaborative learning experiences. Similarly, Walter (2024) found that training educators and students in prompt engineering and critical evaluation is key to maximizing AI’s potential while reducing the risks of misuse and overreliance. To enhance our instruction, we need to use AI purposefully, training both ourselves and our students to engage with AI tools critically, creatively, and ethically.
This article examines how faculty can incorporate AI tools effectively into their disciplines while guiding students to use AI in ways that foster critical thinking and creative application. Drawing on my own research, it offers strategies to support thoughtful integration of AI into higher education classrooms, with a focus on ethical awareness and responsive instructional design.
What I Learned From Using AI in My Teaching
Over the past school year, I used AI as a tool in my undergraduate courses and found that students were not as adept at using it as I had assumed. In fact, when I introduced AI as a required component of the course framework at the start of the semester, many students were uncertain how to proceed. Some shyly offered that they had used AI in previous courses, but many were hesitant, having been repeatedly warned that using AI could jeopardize their academic careers. Without explicit, scaffolded instruction, both students and faculty risk using AI superficially, missing its potential to meaningfully transform teaching and learning.
When AI Becomes the Assignment
In Spring 2025, I led a research project in my classes exploring how university students used AI tools, such as ChatGPT, to support iterative writing and the refinement of complex tasks like lesson planning. The study emphasized ethical AI use and focused on prompt engineering techniques, including speech-to-text input, targeted revision, and staged feedback loops to improve idea generation, structure, and differentiation. I wanted students to critically evaluate AI outputs, developing greater precision and agency in applying AI suggestions across drafting stages.
What I found was that students did not initially know how to talk to AI; rather, they talked at it. At first, students did not get useful results because they were not tailoring their prompts enough. One student offered, “I had to ask the same question 50 billion different ways to get the right answer.” What I discovered over those first few weeks was that students needed to learn to dialogue with AI in the right ways. They had to be intentional about what they were asking and tailor their prompts accordingly.
Try this instead:
Begin broad, then refine. Encourage students to start with a general idea, then narrow their prompts based on assignment goals and relevance of the AI’s output.
Promote iterative prompting. Teach students to revise their prompts through an ongoing dialogue with AI aimed at narrowing down their ideas. Lee (2025) offers the following framework: prompt, generate output, analyze, refine the prompt, and repeat.
Why Prompting Is Worth Teaching
Students are using AI, but often without the skills to do so effectively, and that is where we come in. Poor prompting reinforces the very overreliance that faculty fear, training students to accept whatever results AI delivers rather than critically questioning them. When prompts are vague or generic, the results are too.
Students need specific instruction on how to prompt AI effectively. In my classes I used a structured, multi-step process that students followed each week. However, after reviewing student feedback and surveys, I realized that the process involved too many steps. If I wanted my students to use AI meaningfully beyond my course, I would need to refine and simplify the approach.
Try this instead:
Incorporate guided practice. Use a consistent AI tool at the start of the semester (I used ChatGPT) and model effective prompting and revision to help students build foundational skills.
Gradually increase student choice. After the initial learning phase, allow students to mix and match AI tools to personalize the process and deepen their engagement.
Embed critical reflection. Encourage students to treat AI as a thinking partner, not an all-knowing source. Design assignments that require ongoing interaction with AI (Gonsalves, 2024), such as using AI to generate counterarguments to their own essays or applying math concepts to real-world problems to identify gaps or misunderstandings in their thinking.
A Simple Framework for Better Prompts
A simple, three-phase framework is more user-friendly:
Explore: Encourage students to begin by collecting and thinking through wide-ranging ideas. Start with speech-to-text to brainstorm. Then narrow the focus, identify gaps, and use AI to help fill them.
Refine: Have students evaluate the AI outputs and add specific details to further improve clarity, accuracy, and relevance.
Revise: Use AI to check whether ideas have been clearly communicated. This type of editing involves more than fixing grammar; it is about making sure that their message is clear, focused, and appropriate for the audience.
What Changed for Students
When I incorporated these changes, I saw my students become more strategic thinkers who were less likely to merely copy from AI. In fact, over 73% of my study participants noted that they stopped accepting AI’s first response and began asking better follow-up questions, indicating that they were dialoguing with AI rather than just copying from it. Repeated practice helped them elicit more accurate AI-generated support and reinforced their own role in the process. They came to view AI as a support tool, not a substitute for their own ideas. At the end of the study, one student noted, “You have to be very specific… I have learned how to tweak my prompt to get the result I want.” Another stated, “I started editing ChatGPT instead of letting it write for me.” These responses indicated a key shift: better prompting had reframed AI as a collaborator, not a crutch.
Final Thoughts
Teaching students how to create effective prompts is not about using technology; it is about teaching them to craft better questions. This practice reinforces the critical thinking skills so many of us aim to develop in our disciplines. When students learn how to guide AI, they are also learning how to refine their own thinking. Encouraging reflection throughout the process fosters metacognition; by regularly analyzing their decisions and ideas in this way, students become more thoughtful, independent learners. By intentionally incorporating AI tools into our coursework, we reduce the temptation for misuse and overreliance, creating space for more ethical and transparent use in our higher education classrooms.
AI Disclosure: This article reflects collaboration between the human author and OpenAI’s ChatGPT-4 for light editing. All ideas, examples, and interpretations are the author’s own.
Lisa Delgado Brown, PhD, is an Assistant Professor of Education at The University of Tampa and the former Middle/Secondary Program Administrator at Saint Leo University, where she also served on the Academic Standards Committee. Dr. Delgado Brown teaches literacy courses with a focus on differentiation in the general education classroom.
References
Gonsalves, C. (2024). Generative AI’s impact on critical thinking: Revisiting Bloom’s taxonomy. Journal of Marketing Education, 0(0). https://doi.org/10.1177/02734753241305980
Lee, W. (2025, April 29). Prompt engineering #5: Optimizing AI interactions through iterative prompt adjustments. Medium. https://medium.com/@whee.2013/optimizing-ai-interactions-through-iterative-prompt-adjustments-9ec0974ee821
Robert, J., & McCormack, M. (2025, February 17). 2025 EDUCAUSE AI landscape study: Into the digital AI divide. EDUCAUSE. https://www.educause.edu/content/2025/2025-educause-ai-landscape-study/introduction-and-key-findings
Walter, Y. (2024). Embracing the future of artificial intelligence in the classroom: The relevance of AI literacy, prompt engineering, and critical thinking in modern education. International Journal of Educational Technology in Higher Education, 21, 15. https://doi.org/10.1186/s41239-024-00448-3
Weaver, J. (2025, February 5). Digital Education Council: Global AI meets academia faculty survey 2025. IBL Education. https://ibl.ai/blog?story=digital-education-council-global-ai-meets-academia-faculty-survey-2025
Zhou, M., & Peng, S. (2025). The usage of AI in teaching and students’ creativity: The mediating role of learning engagement and the moderating role of AI literacy. Behavioral Sciences, 15(5), 587. https://doi.org/10.3390/bs15050587