The Hechinger Report is a national nonprofit newsroom that reports on one topic: education.

Editor’s note: This story led off this week’s Future of Learning newsletter, which is delivered free to subscribers’ inboxes every other Wednesday with trends and top stories about education innovation.


While developers of artificial intelligence and industry leaders debate the risks and precise consequences of the technology, there’s no question that AI will greatly influence teaching and learning in the coming years.

Richard Culatta, CEO of the nonprofit International Society for Technology in Education, or ISTE, warns that if the education community sits on the sidelines while the technology advances and its ethical questions are worked out, it will be “the century’s biggest wasted opportunity.”

“In five years, we will have something that has been built without any input from teachers and without any shaping around the needs of education,” Culatta said.

In 2018, ISTE and General Motors launched a professional development course to train educators on how to use AI for teaching and learning. Culatta said he’s found educators are very excited about the opportunities and possibilities of using generative AI — a type of artificial intelligence technology with the ability to produce various types of content, including text, images, audio and synthetic data — in their classrooms. They just need context and training.

In the next two newsletters, I’ll be highlighting how educators and students are already engaging with new AI tools in and out of the classroom. This week I’m focusing on higher ed, and next time I’ll feature lessons from K-12.


At the beginning of this past semester, Richard Ross, an assistant professor of statistics at the University of Virginia, attempted to write a thoughtful email introducing his students to the course. But as he read it over, he realized it came across as more rigid than he wanted. So Ross used a generative AI tool — his first experience with one — and prompted it to compose the email “in a kinder tone.”

“And it did that, and it did it so quickly that if I had thought to make some of these changes, I wouldn’t have done it nearly as fast,” Ross said. He didn’t end up using every word or sentence of the AI-written email, but it provided a template.

“The realization for me was this can be a valuable part of what we do,” said Ross. “There are some students who will greatly benefit from the information that this doesn’t replace all your steps, but it might simplify some things.”

This past semester, Ross incorporated generative AI into two of his classes in very different ways. For his class on mathematical statistics, Ross asked his students to research theorems and their originators and to explain how the theorems were proved — without the help of AI. Then students exchanged topics, and this time Ross asked them to supplement their research using generative AI (he recommended Bing AI). Students then had to decide whether the AI explanations were clearer and more in-depth than the student-provided ones.

In his other class, an undergraduate course on data visualization, students worked together to create a basic web application using the platform R Shiny, a tool for building interactive web apps from code. Once students had manually created the app, they had to figure out how to prompt an AI tool to duplicate it. Students then worked backwards, writing code to make the AI-developed app more complex.

“They’re learning about, ‘How do I get AI to replicate my work?’ And then ‘How do I take something the AI has produced, and personalize it to the work I’m trying to accomplish?’” Ross said. He added it’s valuable for students to learn how to transfer original work to AI and adapt work created by AI code.

“It supports the notion that it’s a tool. It’s not a replacement for skill and coding or the ability to read and understand things,” Ross said.

According to Culatta, the method Ross is using to incorporate AI into his coursework is the most common way AI is being adopted in higher education. In the higher ed space right now, Culatta said, generative AI tools are mainly being used for research by both students and educators.


Students will need to know more about AI and how to use it as they graduate into the workforce, and as generative AI advances and becomes more commonplace, he said.

Eric Wang, vice president of AI at Turnitin, a plagiarism detection software company used by many higher education institutions, said AI is already subtly steering what we do every day, whether it’s our Netflix viewing habits or our auto-completed sentences in Gmail. He said that as tech and AI companies release more new tools and models, AI literacy is going to be a vital skill.

Wang said students will need to know how to talk to AI, command it to do certain things and put guardrails in place for its use.

“That’s a skill set. And I think there will come a day where that skill set is going to be as expected as understanding how to use a word processor,” Wang said.

While there are educators like Ross who are eager to introduce students to AI, many others remain skeptical of the tools, Culatta said. His advice: Teachers need more support from school leaders and others to understand how they can use the tools.

As for Ross, he plans to continue incorporating generative AI tools in his classroom. He reassures his peers — who worry about being replaced by technology — that there’s a lot AI can’t do, like interact with students in a nuanced and dynamic way.

“Learning how to use this tool isn’t going to replace instructors. It may demand that some instructors adapt,” Ross said. “But students don’t want a robot to teach them; they might use a robot to help them, but they don’t want AI to teach them.”

This story about teaching with AI was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for the Hechinger newsletter.

