
Why AI’s flaws won’t slow its adoption


Artificial intelligence is here, and it’s coming to classrooms everywhere sooner than you might think


As the ‘23-’24 school year comes into focus, there is now little doubt we are on the brink of a major technological revolution that will affect our schools, our jobs, and our lives in ways we can’t yet fathom. By now, you’ve likely seen countless variations of “The Robots are Coming” headlines from clever copywriters hunting for clicks. But when is this tidal wave of innovation going to hit, and how can we be ready when it does?

There are legitimate concerns about early-stage AI, including:

  • Security and privacy
    Of all the issues, this one feels like the easiest to address. In short, if you’re working with a large language model (LLM) like ChatGPT and want to stay compliant with student data privacy laws, simply do not input any identifiable information about your students. Ask questions to confirm that any AI-integrated apps keep prompts and identifying data separate, and assume anything that goes into your prompts can and will be seen by others. (A rough sketch of what that scrubbing step can look like follows this list.)
  • Bias and discrimination
    There’s no getting around the fact that AI is biased. It’s biased because people are biased, the data it’s been trained on is biased, and its lack of nuance or self-awareness means it has a hard time even identifying the possibility of bias.

    The deeper problem is that nobody will ever agree on how much (if any) moderation should be present. Heck, we can’t even agree on how to define the concept of “bias” in the first place. This is a problem without an easy answer, but the long and short of it is that AI is no more biased than any other medium we or our kids are exposed to in the modern world. If we sit around waiting for “unbiased AI,” we’ll be waiting a long, long time. The onus is on developers to account for and mitigate this risk whenever possible. It’s also on the adults in the room to help our children better understand the concept of bias and how it can influence the media we consume, not just within the realm of AI.
  • Disinformation and hallucinations
    LLMs have proven to be really good at some things: coding, writing essays, acing the bar exam, out-diagnosing licensed doctors, and so much more. One thing they’re not always great at is telling the truth. Whether it’s confusing the big and little hands of a clock or citing nonexistent case law, generative AI has a way of simply making things up. For all the possibilities AI unlocks from a personalized learning standpoint, it’s probably not a good idea to rely on these tools to teach facts just yet this school year.
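
Since the privacy guidance above boils down to “keep identifying details out of your prompts,” here is a minimal illustrative sketch (in Python) of what that scrubbing step could look like before a prompt is handed to any LLM tool. The redact function and the example name are hypothetical and not drawn from any particular product; a well-built app should be doing an equivalent step on your behalf.

    import re

    def redact(prompt, student_names):
        """Replace each known student name with a neutral placeholder
        so no identifiable information leaves your device."""
        for name in student_names:
            prompt = re.sub(re.escape(name), "[STUDENT]", prompt, flags=re.IGNORECASE)
        return prompt

    # Scrub the prompt locally before passing it to any LLM client.
    safe_prompt = redact(
        "Write encouraging feedback on Jamie Smith's persuasive essay about school lunches.",
        student_names=["Jamie Smith"],
    )
    print(safe_prompt)
    # -> "Write encouraging feedback on [STUDENT]'s persuasive essay about school lunches."

That is the spirit of the questions worth asking vendors: where does the identifying data live, and does any of it ever travel with the prompt?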

It’s unfortunate that the first interaction most educators had with AI was in the context of cheating. When ChatGPT was released to the public in November 2022, kids wasted no time asking it to complete their writing assignments for them, and teachers quickly realized the output was indistinguishable from something a human could produce. Writing has always been assessed on the finished product, but if you can’t trust the source, what are you supposed to do?

For most, the initial knee-jerk reaction was to fight back against the technology, banning ChatGPT at the school and district levels, investing in “AI detectors” on top of plagiarism checkers, and questioning whether writing instruction was even going to be viable long-term. With time, however, most of those bans were walked back, the focus shifted to “how do we live alongside this technology,” and educators went back to the drawing board to brainstorm new ways to teach, practice, and assess writing in this new reality.

We’ve been down this road before. Calculators, PCs, the internet, Wikipedia, and interactive whiteboards all come immediately to mind as advancements that started off as polarizing, but are now so commonplace that we can’t fathom life without them. I’m old enough to remember the scandalous feeling of using CliffsNotes to cut down on reading homework. Now, kids can find 10 times as much information about any work in existence by tapping a simple search query into their phones.

There are, of course, many more reasons to be wary of AI, whether technical, cultural, or ethical. But progress is not going to stop just because we don’t have all the details ironed out right away. By the end of this school year, every major curriculum provider and edtech resource will incorporate some form of AI into their products. This is not just idle speculation: AI feels like the only topic anybody is talking about at technology summits, in leadership conversations, and at edtech conferences. The general consensus is that anybody who isn’t moving to adopt these technologies right now is already on the path to obsolescence.

If nothing else, the past few months have taught us that educators need more information on AI-powered tools, and they need it yesterday. Someday, we’ll likely have frameworks in place to vet, evaluate, and hold all app providers accountable just as we’ve done with student data privacy and evidence-based efficacy requirements, but we’re still years away from anything like that. In the meantime, the onus will fall on administrators to not only stay up to date on what’s happening in the world of AI, but also to help teachers become more comfortable with it.

Despite the many caution flags that come with any new technology, AI has the potential to make learning more accessible, more engaging, and more effective for every student. In these early stages, we’ll all need to be mindful of the pitfalls, make sure there’s a meaningful instructional purpose behind any implementation, and constantly monitor its use and outcomes. This year will be a learning experience for all of us. Let’s lay the groundwork for a brighter future together.

Related:
4 ways to use ChatGPT for learning and creativity
How ChatGPT made my lessons more engaging
