AI and Literacy in the Classroom:
What Teachers Need to Know Now about ChatGPT, Bard, and the Pipeline of AI Text Generating Tools
By Tamara Tate
March 2023
Since its free public release in November 2022, ChatGPT has made national news and been all over social media. When my 80-year-old mother started mentioning ChatGPT to me, I figured it had truly gone mainstream. In this article I will provide some general background about ChatGPT and similar AI tools that generate text and walk you through some thoughts about the best pedagogy and practices so far for teachers and students when writing with these tools. If you want a deeper dive, a preprint of our research article on the topic is available [here] and a free webinar provided by the WRITE Center can be accessed [here].
So first, teachers and students need to understand the basics of large language models and AI writing tools. You don’t need to know how to program one, but you do need a working understanding of what they are and how they are built. ChatGPT and other large language models are a type of artificial intelligence trained to generate text that resembles human writing. They are typically trained on a corpus of text data ranging from millions to billions of words, but that data may or may not be cleaned or refined (think about what a model learns by reading all of the internet; there are a lot of weird things on the unsupervised internet). The network of artificial neurons then generates text by predicting the next word in a sequence based on the words that came before it, which allows the model to produce text that flows naturally and resembles human-generated text. These tools do not search the internet, do not evaluate the quality of their information, and are not necessarily up to date in their “knowledge.” ChatGPT does not think, reason, or care about truth and morality.
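For readers who want to see the core idea concretely: the snippet below is a toy sketch of next-word prediction using simple word-pair counts over a made-up ten-second “corpus.” Real large language models use neural networks with billions of parameters rather than counting tables, so this is an illustration of the principle, not of how ChatGPT is actually built.

```python
from collections import Counter, defaultdict

# A tiny made-up corpus (real models train on billions of words).
corpus = (
    "the cat sat on the mat the cat ate the fish "
    "the dog sat on the rug"
).split()

# Count how often each word follows each other word.
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    counts = next_word_counts.get(word)
    return counts.most_common(1)[0][0] if counts else None

# Generate a short sequence by repeatedly predicting the next word --
# exactly the loop an LLM runs, just with a far cruder predictor.
text = ["the"]
for _ in range(4):
    text.append(predict_next(text[-1]))
print(" ".join(text))  # prints "the cat sat on the"
```

Notice that the output is fluent-sounding but says nothing the model “knows” to be true; it simply echoes the statistics of whatever text it was fed, which is why data quality matters so much.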
Teachers and students need to be able to access and navigate these tools across specific tasks, such as writing emails, creating outlines, generating quiz questions, and summarizing information. While new to many of us, there is little doubt that these tools will be used routinely in the business world. In fact, the online version of Microsoft Word already incorporates AI; try out the Editor feature and the AI-generated summary feature.
Students should be prepared to use these tools effectively and ethically. Students, for example, might find that they use a search engine like Google to find information but an AI tool like ChatGPT to summarize the articles. In addition to selecting the right tool for the job, one of the key skills we will all need to refine is our ability to generate good prompts: the quality of the question we ask the AI directly impacts the quality and usefulness of the output we receive. Practice will help us understand this better over time. After all, we all know people who are Google power searchers, using tricks and refinements to find exactly what they are looking for without wading through a pile of irrelevant results. And a note for teachers: refining searches requires critical thinking about what you know about the content area and precisely what you are looking for (don’t tell the kids). One interesting classroom use is to have students compete to generate a prompt that will gather information about the current class topic, then discuss as a class what kinds of prompts get what kinds of results.
AI text generation tools like ChatGPT will make it imperative to stress two parts of traditional academic writing: when researching, you must corroborate your sources, and you must revise your writing multiple times. ChatGPT and similar tools lie. They aren’t doing it to mess with you; their algorithms simply can’t discern junk from quality information. Classroom uses that build this understanding include having students fact-check or revise AI-generated text (the teacher can even prompt the AI to write poorly in a specific way that targets what they want to emphasize, e.g., “pretend you are a 3rd grade student and write 200 words on the importance of the 1st Amendment with run-on sentences”).
Finally, we need to decide what it means to incorporate AI-generated or assisted text into our work: how does it get acknowledged, when is it appropriate, and when is it disciplinary misconduct? This will take some time as a community to determine and should be explicitly discussed (and probably more than once, as everyone’s experience and the tools themselves evolve). These discussions need to take place in classrooms, but also in teachers’ lounges across disciplines. AI provides a rich context to discuss things like ethics, biases, intellectual property rights, authorship, and the nature of human thought. Don’t miss the opportunity to participate in these rich discussions!
Join leading scholars and practitioners as they discuss generative AI in education and educational research at the Pens and Pixels: Generative AI in Education Online Conference on July 13; click here to learn more and register.
https://uci.zoom.us/webinar/register/WN_F17LB5OvQy2XjR27hHt5Qg
Tamara Tate is a Project Scientist at the University of California, Irvine, and Assistant Director of the Digital Learning Lab (digitallearninglab.org). She leads the Lab’s work on iterative development and evaluation of digital and online tools to support the teaching and learning of writing. She studies secondary student writing as a member of the IES-funded national WRITE Center and is part of the Elementary Computing for All team. She received her B.A. in English and her Ph.D. in Education at U.C. Irvine and her J.D. at U.C. Berkeley. Dr. Tate can be reached at tatet@uci.edu, and her website is at tptate.com.