I did a consultation recently where an instructor asked me some great questions about how to implement generative AI in the classroom, so I thought I would share some key points from that discussion here. There’s a lot of hype flying around the interwebs about what AI can and cannot do: various websites alternate between declaring that AI is overhyped and useless one week and promoting Sam Altman propaganda about AI solving all human problems the next. As usual, the truth is somewhere in between.
So how do we as instructors teach students how to navigate a future with AI in it?
I would argue that we have to teach students to be able to—wait for it—think critically about AI. This does not mean rejecting AI wholesale, refusing to use it in our classrooms, or unthinkingly equating it with cheating. Instead, students benefit from being able to explore AI in a way that teaches them that it is a tool. Like Promethean fire, tools can be used to cook your food or—misused—send you to the burn unit. We are already seeing disparities in students’ skill with AI: I have met students who are teaching themselves Python using ChatGPT and students who have never touched Copilot for fear of being accused of cheating. If, as the Amazon study cited below suggests (Lucariello), 73% of employers will be using AI in the near future, how do we address these disparities and better prepare our students?
One approach is to critique AI in our classrooms. This can be as brief or as involved as you choose to make it. In some fields, GenAI will regularly hallucinate when given domain-specific questions. If that is true in yours, take some time to show students that generative AI is not an all-seeing, all-knowing, unerring “genius” that has all the right answers (as purported in this popular YouTube video). Our students need to know that generative AI is sometimes bad at math, cannot write lab reports, and can’t explain cavitation to a non-expert. At the same time, AI regularly scores better than college students on papers and breezes through the bar exam.
So, an important step in helping students develop AI literacy is teaching them to apply critical thinking to its outputs. There are many ways to do this, but an easy first assignment is to supply prompts and responses for students to evaluate. In my Intro to Writing course last semester, I started by using Copilot to write a summary of an essay the students had read. I then assigned a worksheet asking them to critique the AI-generated summary, guided by questions such as, “What does this summary do well?” and “What important information is missing?” Helping students spot the gaps in what AI does or doesn’t produce highlights a role that they can play in a future with AI.
To learn more, join our upcoming workshop on Transparent Teaching with AI, read our previous blog posts on using generative AI in your classroom, or watch recordings of our Faculty Showcases on AI assignments and Large Enrollment Courses. You can also contact the CITL with questions or for a personal teaching consultation, whether you want help getting started or feedback on a lesson plan that incorporates generative AI.
References:
Bowen, José Antonio, and C. Edward Watson. Teaching with AI: A Practical Guide to a New Era of Human Learning. Johns Hopkins UP, 2024.
Lucariello, Kate. “New Amazon Study Reveals How AI Will Transform the Workplace in Five Years.” Campus Technology, 2 Jan. 2024.
Wasick, Steve. “Why ChatGPT Can’t Write Your Reports.” Infosentience, 22 Mar. 2023.
Wiggers, Kyle. “Why Is ChatGPT So Bad at Math?” TechCrunch, 2 Oct. 2024.