Teaching With and Around ChatGPT and Other Tools

21 December 2022
By Victoria Morse, Director of the LTC and Professor of History

A Highlight from the December 2022 LTC Conference.

At the conference, we wanted to start answering the question "how can we teach critical thinking and digital literacy using AI tools?" Read on for the rationale, for some initial experiments, and for some resources!

If you have been following the news (including a recent email from David Liben-Nowell) about the fast-paced improvement of artificial intelligence tools designed to produce written text, you will have noticed the explosion of interest and concern in the last few months. Joining tools that solve math problems, write computer code, and generate art and music, tools like ChatGPT from OpenAI are increasingly able to turn a prompt into a passable essay, write a blog post, outline a paper, and generally "do" many of the writing tasks that we assign to our students.

Reactions in the Chronicle of Higher Education, Inside Higher Ed, on podcasts and blogs, and in the mainstream media range from the enthusiastic to the horrified. Much of the concern centers on whether we will be able to distinguish our students' writing from AI-generated writing and what the consequences will be for our expectations about academic honesty. Less attention has been paid to how these tools may influence students' intellectual development and their ability to use writing to deepen their understanding of a topic. But there are also a number of articles offering interesting suggestions for how we can incorporate these tools into our teaching, much as we have with spell check.

At the LTC December conference this year, a group of faculty and staff spent some time thinking about how we can write assignments that ask students to play with AI tools in ways that sharpen their critical thinking and digital literacy skills. Colleagues proposed that students:

  • Use their knowledge of art history and their observational skills to prompt DALL·E 2 to recreate an assigned artwork
  • Study a translation of a poem created with Google Translate, explain the choices the tool made, and make their own translation
  • Generate three different essay outlines using GPT-3 (an earlier OpenAI model related to ChatGPT), compare the approaches, and reflect on how the AI went about the task
  • Check the solution offered by Wolfram Alpha to a problem about mortgages and interest rates, compare it with the answer they would get by Googling, and think about how the answer a general audience might want differs from what a college-level math professor is looking for

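For the mortgage assignment above, students could also verify a tool's answer by hand. As a minimal sketch (the loan amount, rate, and term here are hypothetical examples, not from any specific assignment), the standard amortization formula gives the fixed monthly payment, which can be checked against what Wolfram Alpha or a Google search returns:

```python
def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Fixed monthly payment on a fully amortizing loan.

    Uses the standard amortization formula:
        M = P * r * (1 + r)**n / ((1 + r)**n - 1)
    where r is the monthly rate and n the number of monthly payments.
    """
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # total number of payments
    if r == 0:                    # zero-interest edge case
        return principal / n
    growth = (1 + r) ** n
    return principal * r * growth / (growth - 1)

# Example: $300,000 at 6% annual interest over 30 years
payment = monthly_payment(300_000, 0.06, 30)
print(f"Monthly payment: ${payment:,.2f}")  # roughly $1,798.65
```

Comparing this hand-checkable number with a tool's output is exactly the kind of verification the assignment asks for.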
These are first thoughts for new assignments, drafted in a constrained amount of time, but they give an idea of some of the ways in which we can help students see both the strengths and the limitations of the tools while still doing the careful thinking about audience, evidence, and truth that we want them to do.

Whether you believe we should act to develop students’ independent skills (for example by having them do more of their writing in class) or whether you believe that the tools can become useful partners in the writing process, we need to pay attention and think through what learning outcomes we want for our students in our courses and how our policies and assignments can bolster that learning.

Image created by DALL·E from the prompt "create a pop art painting of professors experimenting with AI tools"
When I asked DALL·E to create a pop art painting of professors experimenting with AI tools, two of the three images showed only white, (likely) male-identifying "professors" (apparently signaled by jackets and ties).
Image created by DALL·E from the prompt "create a pop art painting of professors experimenting with AI tools"
One image included Black professors and less formal clothing. None showed women, to say nothing of the complexities of quickly signaling gender, racial, and other identities. Clearly, I would need to be very specific with my prompt to get an image that matches my own sense of what "professor" can mean.