Education Trends, Teaching Tips & Tricks, Truth for Teachers Collective   |   Sep 24, 2023

Why teachers don’t need to ban ChatGPT or AI tools in the classroom (and what to do instead)

By Candace Brown

Secondary ELA

The first time I encountered ChatGPT, I panicked.

It was December of 2022, and I walked into a staff room to talk to a colleague, Kate, about an assignment. What I found instead were my colleagues Kate and James playing with magic: ChatGPT, an AI playground in its first stages.

Kate had shown James this otherworldly site, and James was testing how the AI answered questions with intriguing bias depending on the type of question (“What’s the best way to stop terrorists?” vs. “How to bring peace to the Middle East?”). Kate started explaining to me, with startling calm, how you could write an entire essay with sources on ChatGPT. She showed me. I curled into a ball on the staff room couch in an existential crisis.

She was a day ahead of me in thinking this through, and she attempted to assuage my fears. All I could see in my head was the play-by-play of every dystopian novel and movie I knew. You might still be there.

I hope that, by the end of this article, you will be less afraid of the apocalyptic implications (which I will still attempt to address). Instead, I hope you will feel more prepared to address AI in your classroom. I also hope you will have a healthy awareness of its presence in your students’ lives already, rather than turning a blind eye.


Reasons not to ban ChatGPT in the classroom

#1 AI is everywhere already

Arguably, as of early 2023 (which might already be too broad, considering how quickly things are changing), the most prominent version of artificial intelligence our students have heard about is Tesla’s self-driving car tech (which has been in the news for both ethical concerns and car wrecks). The most prominent AI program students actually use might be ChatGPT, but it might also be Snapchat’s “My AI” chatbot (embedded in the app), Chat AI (advertised as an AI chatbot and essay-writer bot), or Wonder (an AI art generator competing with the popular WOMBO Dream app).

These are just the top apps in the Apple App Store. They don’t include standalone Google Chrome extensions like QuillBot, described in its overview as a “Grammar Checker, Paraphrasing Tool & Summarizer.” The more popular Grammarly announced the launch of GrammarlyGO, an “on-demand, contextually aware assistant powered by generative AI,” in March of 2023.

#2 The positive potential is staggering

Initially, I reacted to ChatGPT with trepidation. My students don’t need to know how to write now. Writing and research standards need to get rewritten overnight. The entire white-collar workforce is in jeopardy. English teachers may as well quit. *insert dramatics here* 

However, after talking with Kate, reading about AI, and considering the implications with my staff and our English Task Force for 2023, I’m leaning on the side of hope, landing somewhere in the zone of curiosity.

The potential of AI is staggering. Several staff members at my school have likened it to the introduction of the internet as a research tool for high school students in the 90s. Initially, teachers entrenched in traditional methods were outraged at the prospect of students skipping library day in favor of a few searches on a computer, and journalists predicted that computers would never replace aspects of the traditional classroom (as one Newsweek article from 1995 did). It turns out computers are not only alive and well; now they are also, seemingly, alive.

However, much like the internet, AI has tremendous potential to reflect the darkness in humanity. Microsoft’s Tay chatbot was taken down in March of 2016 after only a day on Twitter, once other Twitter users taught it to be racist and xenophobic.

AI can do a bit more than simply find your sources, though; it can search for them, filter them by any number of considerations (without paywalls), and write a full essay using all the sources with MLA in-text citations. That may sound terrifying, but it’s only as terrifying as our lack of guidance for our students. If they see AI not only as a tool to be used transparently but also as a source, it changes how AI fits into school. I’ll explain more on this later, but for now…

#3 You can “protect” your classroom (kind of…)

If you’re still in the camp of “No AI in my class” (whether by choice or by your school’s policy for the next school year), you can somewhat do that.

First, you will have to define AI and explain your AI policy in your class syllabi and during the first weeks of school. Document your policy, but also announce that it may flex as AI develops.

Our English task force drafted a statement that supported:

  • teachers’ right to decide how AI is used in their classrooms,
  • seeing AI as a tool,
  • using certain Common Core standards to support the expectation that students submit only their own writing when we are assessing it traditionally, and
  • using the Common Core process-of-writing standard to support students learning how to use AI (i.e., ChatGPT) well.

We also noted that, at the time we wrote it, ChatGPT was only legally available to ages 18 and up (the minimum age was lowered to 13 on March 14, 2023), so at the time, we couldn’t use it as a classroom tool. Now we can (with parent permission), but this is a fast-evolving situation that requires vigilance.

Second, you can try to tighten up on the assessment end if you’re still concerned that AI-generated content is showing up in your students’ papers (which might already be apparent from the usual teacher check-ins, drafting stages, etc.). The extensions that have helped you spot plagiarism or cheating in Google Docs are still helpful pieces of the puzzle (like the Docs extension Draftback, or checking a document’s edit history). Certain tools have also been created specifically to help identify AI-generated content. The plagiarism checker Turnitin has developed a filter in its similarity report that notes the percentage of AI-generated phrases, which Turnitin claims is 99% accurate. Another colleague, P, recommended Copyleaks, which, unlike Turnitin, doesn’t require a login.

However, AI detectors have limitations. Google-translated material, for example, can be flagged as AI-generated content. A student at my school once submitted a paper they had originally written in another language for another class; the teacher’s AI checker flagged it as 100% AI-generated because of the translation.

This should be a larger theme of AI use, and of tech in general: treat it as a tool to use in conjunction with your judgment, not as gospel. Human bias is interwoven into AI through our coding, word processing, and even search results.

Another option is to switch your writing assignments back to paper and change your assessment parameters, but I caution you to consider the following:

  • Does switching back to paper remove a layer of potential learning or skill display that you were previously assessing? Consider your learning goals, standards, etc.
  • What are you teaching your students about how to navigate the world by removing the option for “safe” interaction with a new tool under a teacher’s supervision? (This might sound pointed, but it isn’t meant to be.)

The International Baccalaureate (IB) program released a statement on AI that was later published in The NY Times. Dr. Matthew Glanville, Head of Assessment Principles and Practice at the IB, explains that the IB would not be banning AI because “that is the wrong way to deal with innovation.” He also explains that “Like spell-checkers, translation software, and calculators, we must accept that it is going to become part of our everyday lives, and so we must adapt and transform education so students can use these new AI tools ethically and effectively.”

#4 You can teach your students to use it and to use it well

If you are open to using AI in your classroom but don’t know where to start, Glanville has some pointers. He turns his attention to the kinds of learning we should prioritize, suggesting that we teach our students these three things in regard to AI:

Teach how to ask the right questions and refine requests

We should reinforce what we already teach about internet searches (using Boolean operators, for example), but with a focus on the mode AI uses (requests written in full sentences that can build on one another) and on learning from results that don’t match the request.

Teach how to identify and respond to bias in writing

As previously mentioned, AI is programmed using human input, from human coding to the internet itself (note that some AI programs don’t draw on the internet past a certain point in time; ChatGPT, for example, notes on its home page that its knowledge stops around 2021). Our students should know their material and texts well enough to catch when authorial bias (or even outright false data) gets included.

For example, a colleague and I tested ChatGPT’s ability to find nuance in its explanations by asking it about our school’s history. Our school has multiple campuses, and ChatGPT combined the facts of each campus’s founding into one contradictory paragraph. It was easy to see that we would have to refine the request in a few more steps to get it to note the difference.

Teach how to “think around” problems with creativity and critical thinking

Figuring out how to get ChatGPT to note the differences between each campus’s founding (from the example above) might be tricky, but it’s very similar to what teachers already do with Boolean phrases in internet searches and with sources in general. Get your students to consider the following:

  • What is wrong with this answer?
  • What is this answer not including?
  • What perspective is missing in this text?
  • How can I widen or narrow my inquiry to include what I am missing?

One key piece of school policy from Dr. Glanville’s statement is this: the IB program is treating AI as a source as well as a tool. If a math teacher has a calculator section and a “no calculator” section on a test, they are assessing both how students perform on their own and how they perform when a tool enables deeper or faster work. We can do the same with AI: we can have assessments where students may draft with AI (an “Open-to-AI” assignment rather than a “No-AI” one). My colleague Kate suggests having students create outlines for research papers using AI while requiring other parts of the critical thinking work to be done without it. Teachers are already aware that assessments don’t always invite learning and can instead skip over parts of the critical thinking process.

Brett Vogelsinger, a member of the National Council of Teachers of English, posed a few questions teachers should consider when teaching their students how to use AI in his article “Inviting Artificial Intelligence with Curiosity”:

  • What is valuable about human writing–both for the reader and the writer–that AI cannot replicate? How will I express and demonstrate this to students?
  • How might AI be used as an insightful, knowledgeable, and blazingly efficient conference partner or tutor?
  • When is it important for a first draft to be exclusively human-created, and when is it valuable to jumpstart a draft with AI assistance?
  • How can this technology help students acquire more practice in the interesting, difficult, and meaningful work of revision by streamlining first drafts?

You should also consider adopting your own method of citing AI (there is no standardized version yet). Some teachers have suggested highlighting AI-generated phrases or citing AI-generated material as a source in MLA style; however, the Modern Language Association says not to treat AI as an author and to simply note the use of AI, and how you used it, in a template structure. If you choose that route, keep up to date on whether the MLA changes its guidelines.

#5 It can save you time as a teacher

Remember Teachers Pay Teachers? It’s a wonderful resource for detailed unit plans, cool explanatory posters, and thorough lessons with hyperlinks and sources to boot. But if you’re just looking for a few discussion starters for your US History Vietnam War unit (guilty as charged), AI is your best friend.

Schools don’t seem to be banning AI-generated lesson plans, because that’s not where bad teaching comes from. Education is a notoriously open-to-sharing-what-works community, and TPT rightfully offers teachers who create detailed work a platform to sell that work if they so choose. We can ask AI to generate parent emails, write report card comments from our comment banks, and even create abridged reading schedules for that one classic novel we know has slow parts (I’m looking at you, Frankenstein…).

Material created for educational purposes does come with some freedoms that commercial material doesn’t have, and we should see that freedom as a gift ready to use.

Here are a few more examples of how you can use AI in your classroom:

  • Draft fake examples of student work to have students practice identifying concepts and research flaws
  • Write fake mentor texts for students to proofread (which, Dr. Glanville notes, removes some of the ethical issues with using actual student work examples).
  • Create draft test questions (with caution…)
  • Create activities for reviewing concepts
  • Practice asking AI programs questions together to show thought processes in refining requests
  • Have students use AI to reteach themselves concepts in class before you have time to reteach (which could be a game-changer for students who prefer asking questions to learn material)
  • Use AI-powered search engines such as Semantic Scholar in research. Semantic Scholar was created by the Allen Institute, which is named after Microsoft co-founder Paul Allen.
  • In 2016, people drafted horror stories using Shelley, “the world’s first collaborative AI Horror Writer” (now defunct). In the same way, consider using AI to help your students draft stories in other genres and have them practice identifying the parts of each story’s arc.


#6 ChatGPT provides an opportunity to learn alongside your students and model how to approach new technology

One more thing: when you use AI, strongly consider telling your students that you’re using it. It will help reinforce transparency around AI rather than secrecy. Modeling works!

I wrote this article while coming back from a vacation with my family, and we discussed AI as I researched. My brother is studying to become a graphic designer, and the prospect of AI invading his future field feels all too similar to my mom’s story. My mom’s illustration-based graphic design degree was impractical the day she graduated because her university wasn’t teaching students computer-based design, which was already a prominent trend. While I’m not arguing that students who aren’t taught AI skills will have obsolete high school diplomas, I do want to prepare my students for an uncertain future to the best of my ability. I do not want to shy away from innovation.

Brett Vogelsinger puts it beautifully: “Encountering a new technology alongside our students puts us in a beautiful position to learn beside them. We can leverage our maturity and insight to guide them on using AI ethically, even as we strive to find our own way through the woods.”

There’s one more side to this that I haven’t addressed: the radical, near-utopian help AI can bring to students with learning needs. For students who already have learning plans, students who know they need support but can’t access testing for learning needs, and students who don’t even know they have learning needs, imagine the possibilities. Imagine teaching a student who would have needed a push-in aide to guide them through creating an essay outline for your very specific thesis question. Imagine that they can do that work on their own and get to the more difficult pieces with their push-in aide in half the time. That does not sound dystopian to me. That’s possible right now.

As for the apocalyptic fears of AI taking over the world, I think every generation has had some piece of technology that radically changed their lives. The Gutenberg press provided knowledge to masses previously locked out of learning. Electricity provided light in the darkness. Yes, Mein Kampf was published and electricity powered the production of atomic bombs, but all technology is simply a tool. AI is the first tool that can almost look back at us, but for now, let’s teach our students how to open a book, bring light to darkness, and walk into an unknown future with curiosity and hope.

Additional resources

The AI Index: It “tracks, collates, distills, and visualizes data relating to artificial intelligence” and is sponsored by Stanford’s Institute for Human-Centered Artificial Intelligence.

The American Library Association’s page on AI: You can find it under Center for the Future of Libraries -> Trends. It lists links and sources for almost every sentence. Its focus is on how AI might affect library goals, but it is a valuable resource for anyone interested.

The Urban Libraries Council’s press release on AI: This notes potential inequities in privacy and data protection for at-risk communities.

Candace Brown

Secondary ELA

Candace Brown is a Secondary English and Yearbook teacher at an international Christian school in Taiwan. She has been published in literary magazines with the University of Arkansas, The Sagebrush Review, and Sonder Midwest. She has helped students publish their...
