

Teens Need Parent Permission to Use ChatGPT. Could That Slow Its Use in Schools?

By Jeffrey R. Young     Nov 2, 2023



Since the release of ChatGPT nearly a year ago, teachers have debated whether to ban the tool (over fears that students will use it to cheat) or embrace it as a teaching aid (arguing that the tool could boost learning and will become key in the workplace).

But most students at K-12 schools are not old enough to use ChatGPT without permission from a parent or guardian, according to the tool’s own rules.

When OpenAI released a new FAQ for educators in September, one detail surprised some observers. It stated that kids under 13 are not allowed to sign up (which is pretty typical, in compliance with federal privacy laws for young children), but it also went on to state that “users between the ages of 13 and 18 must have parental or guardian permission to use the platform.”

That means most students in U.S. middle and high schools can’t try ChatGPT without a parent’s sign-off, even if their schools or teachers want to embrace the technology.

“In my eighteen years of working in education … I’ve never encountered a platform that requires such a bizarre consent letter,” wrote Tony DePrato, chief information officer at St. Andrew's Episcopal School in Mississippi, in an essay earlier this year.

In a follow-up interview this week, DePrato noted that one likely reason for the unusual policy is that “the data in OpenAI cannot easily be filtered or monitored yet, so what choice do they have?” He added that many schools have policies requiring them to filter or monitor information seen by students to block foul language, age-restricted images and video, or material that might violate copyright.

To Derek Newton, a journalist who writes a newsletter about academic integrity, the policy seems like an effort by OpenAI to dodge concerns that many students use ChatGPT to cheat on assignments.

“It seems like their only reference to academic integrity is buried under a parental consent clause,” he told EdSurge.

He points to a section of the OpenAI FAQ that notes: “We also understand that some students may have used these tools for assignments without disclosing their use of AI. In addition to potentially violating school honor codes, such cases may be against our terms of use.”

Newton argues that the document ends up offering educators, including those who teach students who aren’t minors (like, say, most college students), little concrete guidance on how to combat the use of ChatGPT for cheating. That’s especially true since the document goes on to note that tools designed to detect whether an assignment has been written by a bot have proven ineffective or, worse, prone to falsely accusing students who did write their own assignments. As the company’s own FAQ says: “Even if these tools could accurately identify AI-generated content (which they cannot yet), students can make small edits to evade detection.”

EdSurge reached out to OpenAI for comment. Niko Felix, a spokesperson for OpenAI, said in an email that “our audience is broader than just edtech, which is why we consider requiring parental consent for 13-17 year olds as a best practice.”

Felix pointed to resources the company created for educators to use the tool effectively, including a guide with sample prompts. He said officials were not available for an interview by press time.

ChatGPT does not check whether users between the ages of 13 and 17 have obtained the permission of their parents, Felix confirmed.

Not everyone thinks requiring parental consent for minors to use AI tools is a bad idea.

“I actually think it’s good advice until we have a better understanding of how this AI is actually going to be affecting our children,” says James Diamond, an assistant professor of education and faculty lead of the Digital Age Learning and Educational Technology program at Johns Hopkins University. “I’m a proponent of having younger students using the tool with someone in a position to guide them — either with a teacher or someone at home.”

Since the rise of ChatGPT, plenty of other tech giants have released similar AI chatbots of their own. And some of those tools don’t allow minors to use them at all.

Google’s Bard, for instance, is off limits to minors. “To use Bard, you must be 18 or over,” says its FAQ, adding that “You can’t access Bard with a Google Account managed by Family Link or with a Google Workspace for Education account designated as under the age of 18.”

Regardless of such stated rules, however, teenagers seem to be using the AI tools.

A recent survey by the financial research firm Piper Sandler found that 40 percent of teenagers reported using ChatGPT in the past six months — and plenty are likely doing so without asking any grown-up’s permission.
