Digital Learning’s Pioneers Are Cautiously Optimistic

By Marguerite McNeal     Jul 10, 2016

Is there life out there? It’s a question that’s existed since humans first turned their gaze toward the night sky. It’s also the driving force behind the work of thousands of students searching for signs of life beyond planet Earth. They’re looking for answers in rock formations in the Australian Outback; they’re analyzing data from a field of stars to see if conditions can support life. And they’re non-science majors taking an online class.

Welcome to Habitable Worlds, an introductory science course that Arizona State University professor Ariel Anbar has taught online for five years. It represents a category of edtech, called “digital courseware” by foundations and industry analysts, that’s changing the way online students learn and faculty teach. Small companies like Acrobatiq, OpenStax and Smart Sparrow—the platform behind HabWorlds—and edtech veterans including Pearson and McGraw-Hill are racing to develop complete courses that faculty can implement immediately.

These solutions promise to broaden access to quality education, engaging learners who might not make it through impersonal introductory lectures. Students receive individual feedback on their progress, and their learning experiences change based on their performance.

These neatly packaged courses also raise questions about the role of faculty, and who decides what students need to learn. The companies building courseware products are developing increasingly complex algorithms that track students’ progress and recommend next steps in their learning paths. Faculty, researchers and even edtech companies are wary that the technology could become a runaway train, one that uses student data to make decisions—for better or worse—about what and how students should learn.

Course of the Future

HabWorlds has become so popular that other universities are taking the entire course—the virtual field trips, simulations, videos—and teaching it to their students.

Anbar, who used to teach HabWorlds in face-to-face classes, developed the online course as a way to make it more interactive and accessible to more students. “Online I can forge personal connections with students more readily, by email and chat board discussions,” he says. “I get more insight into students.”

While Anbar admits the online shift changed his role from a lecturer to more of a coach, he’s adamant that he—not technology—maintains ownership over the class. And he’s working with a network of faculty who are using HabWorlds to teach science on their campuses. They have access to Smart Sparrow’s authoring tools, which let them edit the course, and they share their tweaks and updates with each other.

Complete courses like HabWorlds are a far cry from the one-way lectures that often stand between incoming college students and a degree.

Introductory college classes, especially in math and English, have earned the nickname “gatekeepers.” Students must pass these courses to earn their degrees, yet many arrive unprepared to succeed. Half of college students fail to pass algebra with a grade of C or above.

Re-taking classes or enrolling in remedial ones to catch up means a longer time to graduation and, in many cases, more student debt. That makes first-generation, low-income students especially vulnerable to dropping out.

Promises and Pitfalls

Complete digital courses offer a way to give more personal attention to students who might otherwise fall off track in large lecture classes.

“We know that personalized and adaptive approaches can be effective,” says Norman Bier, director of Carnegie Mellon’s Open Learning Initiative (OLI), which develops adaptive courses that use open educational resources and provide student feedback. In theory these technologies are a way to create the one-to-one tutoring model at the heart of Bloom’s 2 Sigma Problem.

But Bier cautions, “Building these kinds of solutions is not quick and easy.”

He adds that these products are in early stages of development and that they need more grounding in learning science research. That consideration doesn’t always fit into a business plan. “The intellectual property considerations of running an edtech startup mean moving at a pace and direction that is not always well-suited to the close partnership that’s necessary for academic and research involvement,” Bier says.

The small amount of research to date on how adaptive learning technology impacts student outcomes is inconclusive. An April 2016 study from SRI Education found that students in adaptive courses performed slightly better on assessments than those in a traditional lecture. But there was no change in their odds of successfully completing a course.

Without opening learning platforms up to the research community, it’s hard to understand if and how they’re improving learning outcomes. That’s one reason colleges haven’t exactly embraced adaptive learning technology, says Casey Green, director of The Campus Computing Project, an annual survey about the use of technology in higher education.

“A lot of the conversation about adaptive learning technology is still on the basis of opinion and epiphany as opposed to evidence,” Green says. “Where’s the evidence that this makes a difference?”

In the Campus Computing Project 2015 survey of more than 400 higher-ed CIOs and senior IT officers, 96 percent said that “adaptive learning technology has great potential to improve learning outcomes for students.” But participants estimated that just 4 percent of their institutions’ developmental and general education classes use adaptive software.

Students give these course packages mixed reviews. In a medical sociology class at the University of Georgia that used adaptive software from OpenStax, students with good discipline were irked they had to change their study habits. Others said they appreciated the nudges from the software to study in small chunks.

A Word of Caution

The algorithms built into adaptive learning solutions determine what content students see and how they perform on assignments. And like any algorithm, those in adaptive software come with biases. They reflect what the creators think students should learn, and how students can learn it most efficiently.
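
To make that concrete, here is a minimal sketch of the kind of mastery-threshold rule an adaptive platform might use to pick a student’s next activity. The threshold, the skill names and the skill-to-activity mapping are illustrative assumptions, not any vendor’s actual algorithm; the point is that each of those choices encodes a designer’s view of what students should learn and when they have learned it.

```python
# Illustrative sketch only: a toy "next activity" rule of the kind an adaptive
# platform might use. The 0.8 threshold and the skill-to-activity mapping are
# assumptions made for this example, not any vendor's actual algorithm.

MASTERY_THRESHOLD = 0.8  # designer's choice: what counts as "mastered"

# Designer's choice: which activity remediates which skill, and in what order.
REMEDIATION = {
    "stellar_habitability": "review_star_lifecycles",
    "exoplanet_detection": "transit_method_simulation",
    "biosignatures": "spectra_analysis_lab",
}

def next_activity(mastery_estimates: dict[str, float]) -> str:
    """Return the next activity for a student, given per-skill mastery estimates (0-1)."""
    for skill, activity in REMEDIATION.items():
        if mastery_estimates.get(skill, 0.0) < MASTERY_THRESHOLD:
            return activity  # remediate the first skill below the threshold
    return "advance_to_next_module"

# A student strong on detection but weak on biosignatures gets sent to the lab.
print(next_activity({"stellar_habitability": 0.9,
                     "exoplanet_detection": 0.85,
                     "biosignatures": 0.6}))
# -> spectra_analysis_lab
```

Every constant in that sketch, from the 0.8 cutoff to the ordering of skills, is a judgment call made by whoever built the course, which is exactly where those biases enter.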

George Siemens, a researcher and digital learning theorist, has a problem with that. “We have to build adaptive learners, not adaptive learning,” he says. Adaptive technology can help students master content more quickly, but what if they don’t need to know that material to succeed in school, work and life?

“If you have adaptive learning that takes all the choices away from the student, you’re stripping them of all of those skills that they need to function in today’s environment—critical thinking, goal setting, planning. Instead they click a button and get the next thing that pops up.”

Siemens thinks universities should have more control over how these solutions use their students’ data to shape learning experiences. “I’m genuinely concerned that university leadership is not sufficiently informed to be able to make the kinds of learning analytics decisions that they’re making when they’re buying products.”

It’s not too late for university leaders and faculty to play a more active role in developing the kinds of adaptive technologies they’re beginning to pilot. Siemens says that, before purchasing learning platforms, they should demand transparency into what kind of student data will be collected and how it will be used.

Students have a role to play, too, he adds. “I’d like to see more campus activism around data—students asking, ‘What are you doing with my data? Can I trust what’s happening with it? How am I being sorted?’”

At ASU, Anbar is interested in using the data that adaptive platforms capture to change the assessment model. Adaptive solutions have the ability to capture all kinds of student behaviors—how long they spend on activities, whether they use supplemental materials, what time of day they study. That’s information that Anbar says could better reflect student learning than tests and quizzes. “We aspire to teach reasoning and critical thinking, yet we’re still assessing based on content knowledge. Software can capture patterns of student thought. We could use that information to measure students’ progress and effort in a much richer way.”
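
As a rough illustration of what that richer picture might look like, the sketch below rolls the behavioral signals Anbar mentions (time on task, use of supplemental materials, time of day) into a per-student summary. The event format and field names are assumptions made for this example rather than a description of Smart Sparrow or any other platform.

```python
# Illustrative sketch only: aggregating the kinds of behavioral signals described
# above (time on task, supplemental-material use, time of day) into a per-student
# summary. The event shape and field names are assumptions for this example.

from collections import defaultdict
from datetime import datetime

# Each event: (student_id, activity, seconds_spent, used_supplement, timestamp)
events = [
    ("s1", "transit_method_simulation", 1260, True,  datetime(2016, 7, 1, 22, 15)),
    ("s1", "spectra_analysis_lab",       540, False, datetime(2016, 7, 2, 9, 30)),
    ("s2", "transit_method_simulation",  300, False, datetime(2016, 7, 1, 13, 5)),
]

def summarize(events):
    """Roll raw clickstream-style events up into per-student engagement features."""
    summary = defaultdict(lambda: {"minutes": 0.0,
                                   "supplement_uses": 0,
                                   "late_night_sessions": 0})
    for student, _activity, seconds, used_supplement, ts in events:
        features = summary[student]
        features["minutes"] += seconds / 60
        features["supplement_uses"] += int(used_supplement)
        features["late_night_sessions"] += int(ts.hour >= 21 or ts.hour < 5)
    return dict(summary)

print(summarize(events))
# {'s1': {'minutes': 30.0, 'supplement_uses': 1, 'late_night_sessions': 1},
#  's2': {'minutes': 5.0, 'supplement_uses': 0, 'late_night_sessions': 0}}
```

Whether patterns like these reflect reasoning or merely effort is, of course, the harder measurement question Anbar is raising.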
