What Should K-12 Students Know About Artificial Intelligence? New Guidelines Are in the Works


Artificial intelligence was a big theme at this year’s ASU+GSV Summit, where many ed-tech companies touted AI’s potential to transform schools and society.

At least some of them boasted that they’re already using it—or that they’re on the cusp of doing so.

While the ed-tech sector’s interest in AI is no surprise, it’s important to recognize that in the K-12 landscape, discussions of the fast-emerging technology are generally playing out on two very different tracks.

The first—and probably the one of greatest interest to businesses—is how AI might transform the work of schools in areas such as grading, administrative tasks, assessment, and even instruction.

The second track is focused on AI’s role in K-12 classrooms: How should teachers talk about the technology with students? What should students know about AI—essentially, the development of machines that can perform tasks in ways that resemble human intelligence—in order to be ready to deal with it in the workforce and in their daily lives?

It’s this second area that is the focus of Christina Gardner-McCune. She’s the co-chair of AI4K12, an initiative that is developing guidelines for how artificial intelligence should be taught in schools.

Gardner-McCune, who is an assistant professor in the computer and information science and engineering department at the University of Florida, spoke on a panel at ASU+GSV about what an AI-focused curriculum in schools should look like. ASU+GSV, which was held last week in San Diego, has become a major gathering of education company officials, entrepreneurs, investors, philanthropists, and others.

AI4K12 is sponsored jointly by the Association for the Advancement of Artificial Intelligence, a nonprofit that seeks to build understanding of AI, and the Computer Science Teachers Association. AI4K12 has received financial support through a grant from the National Science Foundation.

For many students today, “AI is this black box, it’s this mystery or superpower,” said Gardner-McCune, in an interview at the conference.

“Our goal is to explain ‘What is artificial intelligence?’ How does it work, and what can I do with it?”

AI4K12 is crafting guidelines designed to work across different grade spans throughout K-12. The initiative says that the standards will be built around five big ideas that students should learn about in AI. Gardner-McCune described them this way:

  • Perception. How do computers see and make sense of the world outside? What happens when a device—such as a cell phone—uses facial recognition? Or when another tool uses object-recognition?
  • Representation and reasoning. How does AI “think”? How does it make sense of problems? Students should come away with an understanding of how AI creates representations of the world.
  • Learning. How do computers learn? They learn from massive amounts of data, so what they learn depends on what’s in that data and on what decisions a designer or developer is actually asking the computer to make. This topic can lead to conversations around bias and ethics, says Gardner-McCune: How is data being collected, and should it be collected at all? Who does it benefit or disadvantage? (The sketch after this list shows how a model’s answers depend on its training data.)
  • Natural interactions. How do humans design AI systems so that people can interact with them? How can concepts that humans understand—about social distance and motion, for instance—be incorporated into AI tools and platforms?
  • Social impacts and ethics. This focuses on helping students understand the ethical and societal implications of AI, and the implications of how those systems are designed. It also looks at questions about how data is used and collected, the transparency of AI systems, and “how do we work with AI as an assistive technology?” said Gardner-McCune.

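To make the “Learning” idea concrete, here is a minimal sketch in Python. It is not part of AI4K12’s materials, and the loan-decision data is invented for illustration: a bare-bones nearest-neighbor “model” gives different answers about the same applicant depending entirely on which examples it was trained on.

```python
# Toy illustration of the "Learning" big idea: what a model learns depends
# on the data it is given. The training examples below are made up.

def nearest_neighbor(train, query):
    """Return the label of the training example closest to the query."""
    def squared_distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda example: squared_distance(example[0], query))
    return label

# Two hypothetical training sets for an "approve this loan?" decision.
# Each example is ((income_in_thousands, years_of_credit_history), decision).
data_a = [((80, 10), "approve"), ((20, 1), "deny"), ((60, 6), "approve")]
data_b = [((80, 10), "approve"), ((20, 1), "deny"), ((60, 6), "deny")]

applicant = (55, 5)
print("Trained on data_a:", nearest_neighbor(data_a, applicant))  # approve
print("Trained on data_b:", nearest_neighbor(data_b, applicant))  # deny
```

The same applicant is approved under one training set and denied under the other, which is why the guidelines tie “Learning” to questions about how data is collected and who it benefits or disadvantages.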
The process for developing the AI standards (which the authors say are more accurately described as guidelines) is expected to last two years, she said. In October of last year, AI4K12 released a draft of its big ideas. The organization hopes to release a portion of its work supporting the big ideas for public review at the ISTE and CSTA conferences this summer. More work and public review will follow, with the goal of unveiling a full set of vetted guidelines by June of 2020.

Over roughly the past year, AI4K12 has sought input on the development of the standards from a variety of sources in the K-12 community, industry, government, and other areas, Gardner-McCune said. The organization also formed a working group, made up of 16 K-12 teachers across four grade spans (K-2, 3-5, 6-8, and 9-12) and AI experts from industry and academia, to help it develop the guidelines, she explained.

In addition, the organization has sought input on the process and standards at AI and education conferences, and it has reached out to teachers about how to integrate AI topics across the K-12 curriculum, she said.

AI Literacy

One of the biggest questions about teaching AI in K-12 schools is where it would fit during the school day.

Gardner-McCune says that AI4K12 wants to see the topic woven into lessons across different grade spans, in ways that are useful for students of different ages.

In the early grades, a big goal is to help students understand what AI is in relation to their own senses and their own ways of making sense of problems. They can also come to grasp that the forms of AI they’re already interacting with, such as Apple’s Siri or Amazon’s Alexa, are programs, not actual people responding to them, Gardner-McCune suggested.

By late elementary and middle school, students can begin early experiments in building AI-like technology and applications that are of interest to them, she said.

“We’re imagining it’s going to be integrated across the curriculum,” she said. “Once you get into high schools, we’d love for it to be integrated, but there’s also the opportunity to have dedicated classes [focused on it], as well.”

The rise of AI has stoked a lot of fear about whether the technology will end up displacing workers whose skills are made obsolete by machines.

The unease extends to K-12 schools, where some educators worry that AI could undermine traditional instruction, if schools were to seize on tech-based strategies for lessening their reliance on teachers.

Those questions aside, Gardner-McCune says the goal of schools and educators should be to create “AI-literate citizens,” who can reason about how the technology should be used.

That means understanding AI is about more than just “machine learning,” she said, and that it’s not about the dystopian view of “killer robots.”

The responsibility for designing AI systems that improve society, rather than harming it, falls on the human designers of the technology, said Gardner-McCune. Creating good standards can help ensure today’s students are up to the task.

“When you create an AI artifact—even something as small as a chatbot…You realize there’s a lot of choice people have in making those decisions,” she said, “and you realize that computers are still kind of dumb.

“It’s a risky thing to say. But they still take a lot of decisionmaking on the part of designers and developers to actually create the technology we’re using.”
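Her chatbot example can be made concrete with a small, purely hypothetical sketch in Python (not drawn from AI4K12’s work). In a rule-based bot, every response, including the fallback for messages it does not recognize, is a decision someone wrote down in advance.

```python
# A minimal rule-based chatbot, purely illustrative. Every behavior here,
# including the fallback reply, is a choice made by the person writing the rules.

RULES = [
    ("hello", "Hi there! What would you like to talk about?"),
    ("homework", "I can't do your homework, but I can point you to resources."),
    ("bye", "Goodbye!"),
]

def reply(message: str) -> str:
    text = message.lower()
    for keyword, response in RULES:
        if keyword in text:
            return response
    # The fallback is also a design decision: admit confusion rather than guess.
    return "Sorry, I don't understand that yet."

if __name__ == "__main__":
    for msg in ["Hello!", "Can you do my homework?", "What's the weather?"]:
        print(f"You: {msg}")
        print(f"Bot: {reply(msg)}")
```

Even this toy program makes her point: the machine does nothing its designers did not decide on ahead of time.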

This post has been updated. The backers of the AI4K12 effort describe its work as an effort to create “guidelines” for how to teach AI in schools, rather than academic standards.

Photo of Christina Gardner-McCune at ASU+GSV, April 2019, by Sean Cavanagh.


7 thoughts on “What Should K-12 Students Know About Artificial Intelligence? New Guidelines Are in the Works”

  1. Why is it risky that there is still a lot of decisionmaking left? Because companies don’t want children to realize this is all a matter of choice? That AI is another tech product that people want to sell? I’m glad there are some questions built in here, for example, on whether “data should be collected at all.” There are, however, far too many assumptions being taught about AI as a necessary future.
