Inside an Adaptive-Courseware Experiment, Glitches and All

By Jeffrey R. Young     Feb 7, 2017

Just before professor Barry Spieler enters the classroom at Montgomery College to teach an introductory statistics course, he looks at a data dashboard and gets a sinking feeling. He sees that few of the students in the course have bothered to do the interactive exercises they were assigned for homework.

He doesn’t sweat it, though, since he has a plan to get everyone back on track. In fact, he predicted that the changes he’s experimenting with this semester might be jarring to students accustomed to lecture classes. His course requires a “new workflow,” he explains, since it uses adaptive courseware as well as a flipped-classroom model, in which lectures are replaced with the interactive material.

What the students don’t know is that the course is part of an ambitious experiment that includes community colleges, Maryland’s flagship university, the nonprofit education consultancy Ithaka S+R, and a group called Transforming Post-Secondary Education in Mathematics, or TPSE Math. The mission? To explore how adaptive software can help change teaching styles, impact completion rates, and create a new model for synchronizing the curriculum between two- and four-year colleges. The students also don’t know that they are using a tool that suffered disruptive technical errors when it was first tried last year.

Spieler projects a playful, easygoing manner as he divides the students into pairs for an in-class project. He jokes that he usually uses an app on his smartphone to randomly assign groups, since all human decisions involve bias. But he forgot his phone in his office, so he asks each student to pick a number between one and ten, and then he pairs them based on the numbers. His strategy is that, with any luck, at least one student in each group will have enough prior knowledge of statistics to do the in-class project even if no one in the group did the required homework.
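For the curious, the randomization he describes is easy to automate. Below is a minimal sketch in Python of unbiased pair assignment; it is purely illustrative, with an invented roster, and is not the app Spieler actually uses.

import random

# A toy roster; the names are invented for illustration.
students = ["Aisha", "Ben", "Carla", "David", "Elena", "Frank"]

random.shuffle(students)  # random ordering, so no human bias in who works with whom
pairs = [students[i:i + 2] for i in range(0, len(students), 2)]

for pair in pairs:
    print(pair)  # e.g., ['Carla', 'Frank']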

The class takes place in a computer classroom, and he has given the students a spreadsheet of data from a restaurant whose owner tracked receipts from three servers over the course of two weeks. The variables include the amount of the total bill, the amount of the tip, the number of people in each party, and a few other details. Spieler asks each group to come up with three questions they might answer with the data set, relying on only one variable.
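To make the exercise concrete, a single-variable question such as “What is the typical tip?” can be answered with a few summary statistics. The numbers below are invented stand-ins for one column of the receipts spreadsheet, which is not public.

import statistics

# Invented tip amounts standing in for one variable in the data set.
tips = [3.50, 5.00, 2.75, 6.25, 4.00, 3.25, 7.50, 4.75]

print("mean tip:  ", round(statistics.mean(tips), 2))  # average tip
print("median tip:", statistics.median(tips))          # middle value
print("std dev:   ", round(statistics.stdev(tips), 2)) # spread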

The professor only “lectures” occasionally—for just a few minutes at a time—to define terms that students seem fuzzy on or to quickly show how to make charts on the statistics tool they’re using. Mostly he’s asking questions and listening to reports from each group.

“I think we waste a lot of class time talking at people,” he says of the traditional lecture model. “We make the mistake of thinking that what we say is what they learn,” he adds. “I’m looking for different ways of making them engage with things in class.”

‘At Your Own Pace’

The students who actually did Spieler’s homework experienced another trendy teaching approach: adaptive learning software. The Maryland project is working with Acrobatiq, a company that spun out of Carnegie Mellon University’s much-watched Open Learning Initiative.

The company’s introductory statistics courseware, which professors involved in the project have customized, is in many ways an interactive textbook. One key difference, says Acrobatiq CEO Eric Frank, is that it focuses on presenting complex problems and giving students a rich environment to show their work within the system. “They’re in a much more learn-by-doing mode than you will typically find in textbooks or the latest generation of interactive textbooks that are out there,” he says. The software is also designed to help students along if they get stuck, and to present different feedback to different students based on their behaviors. As a result, the data the software can send to a professor’s dashboard is richer than just revealing how much time students spent—it can give a pretty clear sense of which concepts students seemed to master (or not).
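Acrobatiq’s engine is proprietary, but the kind of per-concept mastery view Frank describes can be pictured as a simple aggregation. The sketch below, with invented concepts, attempts and scoring rule, shows the general idea rather than the company’s actual method.

from collections import defaultdict

# Each attempt records (concept, answered_correctly). All data invented.
attempts = [
    ("sampling", True), ("sampling", True), ("sampling", False),
    ("histograms", False), ("histograms", False),
    ("confidence intervals", True),
]

totals = defaultdict(lambda: [0, 0])  # concept -> [correct, attempted]
for concept, correct in attempts:
    totals[concept][0] += int(correct)
    totals[concept][1] += 1

# A dashboard could then flag concepts with low success rates.
for concept, (right, tried) in totals.items():
    print(f"{concept}: {right}/{tried} correct ({right / tried:.0%})")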

One of the students in the course, 19-year-old David Boakye, says he likes the software because he thinks it saves him time compared to listening to lectures or reading a textbook. “Working at your own pace, it makes things go a lot quicker, and in some ways you learn it a lot better,” he says. This isn’t a totally new experience for Boakye, who says he took a self-paced math course from the college last year.

Another student, Steve Moore, says he particularly likes that students are required to go through the courseware before coming to class, so that he can use his class time to make sure he understands the material and have the professor on hand to answer questions. He doesn’t know whether that flipped approach would work as well for him in other subjects, though.

Spieler says that flipping the classroom fits his personality and teaching style. (Lately he’s also been studying techniques of inquiry-based learning, some of which use no technology at all.) But he admits that the approach may not work for all professors, some of whom may be masters at engaging groups from the front of the room.

In fact, the group coordinating the teaching experiment, Ithaka S+R, made flipping the class optional for participating professors and colleges. In some courses trying out the statistics courseware, for example, professors still lecture as usual; the Acrobatiq material is assigned for homework afterward, in place of a textbook and problem sets. Still other courses in the pilot use an emporium model—meaning that students work through the courseware during class time, while the professor roams the room answering any questions that come up.

John Hamman, dean of mathematics and statistics at Montgomery College, a two-year college in suburban Washington, DC, says he doesn’t want to dictate teaching styles, and giving professors that choice makes adoption easier. The courseware could also save professors time, since it automatically grades assignments. These days, he adds, some students prefer that auto-grading as well—at least for courses in statistics. “When a computer is grading your work there is a certain amount of belief in the fairness of that,” he says, “and a much stronger belief that you can succeed.”
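Auto-grading in a statistics course is largely a matter of comparing numbers with a little slack for rounding. Here is a toy illustration; the tolerance and rubric are assumptions, not a description of how the Acrobatiq software actually grades.

# Accept a numeric answer if it falls within a small tolerance.
def grade_numeric(answer: float, correct: float, tol: float = 0.01) -> bool:
    return abs(answer - correct) <= tol

print(grade_numeric(4.63, 4.625))  # True: rounding is accepted
print(grade_numeric(5.10, 4.625))  # False: wrong value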

For Hamman, the biggest selling point to get involved with the project was the chance to sit down with counterparts at the University of Maryland at College Park, which has committed to trying the same courseware.

“What I like is there’s some real standard as far as learning objectives,” he says. Since many students at Montgomery College plan to transfer to College Park, “knowing that our students are seeing the same thing native to the University of Maryland is really helpful.”

That coordination was also a selling point for Brit Kirwan, executive director of TPSE Math and the former chancellor of the University of Maryland system. And in the long run he sees the technology as a key strategic move for colleges and universities. “Adaptive learning—drawing on intelligent software—I really feel has the potential to both address some of the cost issues but also improve completion rates,” he says.

But even Kirwan has some reservations. “I don’t have 100 percent confidence that we’re there yet,” he says. “That’s why we’re running this experiment.”

Will It Work?

The project has suffered a few setbacks. When the participating courses first used the Acrobatiq courseware, several glitches surfaced in the first couple of weeks, frustrating students and professors alike.

For Spieler, the professor at Montgomery College, the issues were serious enough that he sometimes used alternative assignments, and he felt he couldn’t trust the information in the data dashboard.

Acrobatiq’s Frank says the problems were not with the artificial-intelligence engine that makes the software unique. Instead, the issues were more basic—several charts and tables wouldn’t display, making it impossible for students to complete the assignment, and when students tried the courseware on smartphones, they were not given a second chance on problems, as was intended.

“We were on it immediately, and it was sort of an intensive effort to fix it,” he says. Still, the issues persisted for two weeks. And since those were the first two weeks of the class, says Spieler, it was very disruptive.

Frank says they haven’t had further issues. “Even though students were upset, when asked if they would like to use this kind of product again, the overwhelming response was favorable,” he says, citing surveys given at the end of last semester.

Spieler and Hamman say they are now confident the software is working. And they’re hoping that this semester will give them more-robust data on how well the courseware is working for students.

Hamman says his initial review suggests that students using Acrobatiq are doing at least as well as those in control group courses that still rely on a textbook. In some cases, the students using the adaptive courseware earned slightly better grades.

Kirwan says he doesn’t expect the software to replace professors, but simply to change the way they teach. “The real cost savings we’re looking at in this stage is the improved outcomes,” he says, such as retention. “The goal is to improve outcomes, which will ultimately save the institutions money.”
