EduStar Platform Promises Quick, Randomized Ed-Tech Trials


EduStar, a platform that conducts randomized-controlled online trials “in the background” as K-12 students use different apps, videos or games in classes, has run 77 trials engaging more than 10,000 students so far, with plans to stage many more, its co-founders say.

The technology platform is the product of a grant-funded collaboration between two professors and the nonprofit PowerMyLearning.

It runs trials on granular pieces of digital learning activities via the PowerMyLearning Connect platform, which is available free to schools.

This week, the professors—Aaron Chatterji of Duke University and Benjamin Jones of Northwestern University—discussed their results in a presentation at the Brookings Institution, as part of The Hamilton Project, which is housed at Brookings. The academics highlighted two of the trials, and made a case for EduStar to be used to evaluate K-12 apps, videos, and games using Consumer Reports-like ratings.

The researchers say the project is at an inflection point where it can be scaled up to run more trials in larger districts, while also involving more education researchers and attracting content developers to the EduStar platform.

“For companies that are interested in testing their products or doing research on granular content, we’ve now built a research engine that can be used for that,” said Elisabeth Stock, CEO and co-founder of PowerMyLearning, in an interview. She cautioned that all the details about how and when those companies can come on board have yet to be ironed out, because decisions have not been made about how to commercialize it.

The PowerMyLearning Connect platform, accessed by 40 partner schools, gets 8 million page views per month. It allows K-12 teachers to assign playlists of learning activities to their students, and allows students to direct their own instruction. Parents can also access it for use by their children.

“The reason we picked granular content—videos that might be two-and-a-half minutes long, short games, interactives, or simulations—is because there’s been big interest in granular content” from schools, districts and charter management organizations that want to see playlists with different learning activities, Stock said. For teachers, the EduStar trials can be run during a class period, over a short time.

What the Trials Showed

Chatterji, who is an associate professor at The Fuqua School of Business at Duke University, summarized the trials in a presentation of “Learning What Works in Educational Technology with a Case Study of EDUSTAR,” a report released by The Hamilton Project. Jones, who is a professor at the Kellogg Innovation and Entrepreneurship Initiative at Northwestern University, answered questions on the panel. (View their discussion at the link below.)

“Our guiding principles are that the evaluation has to be rigorous, continuous, conducted by a trusted adviser and built on an existing platform,” Chatterji said.

“We do what Google and Amazon do every day, when they do something called A/B testing,” he said. Those online platforms will run comparisons of one group against another and a control group, without any of the groups being aware that they are part of a test.

For EduStar, the trials are set up so that, when an educator is ready to teach a particular skill, he or she can choose a “mission” from PowerMyLearning Connect. The students in his or her class are randomly assigned to three different groups. All take a brief pre-exercise quiz to determine their level of knowledge of that skill before beginning. Then, one set of students does a prescribed sequence of practice exercises and digital learning activities to learn the skill; another group of students is assigned a different activity or video with the practice exercises to teach the skill; and a third set of students—the control group—does not have any learning activity to do. All three groups are given a “post-activity” quiz to see what they have learned, even though one group had no instruction. (The control group then is assigned one of the digital learning missions after the trial is over, so that the students have the same opportunity to learn the skill as their peers.)
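Conceptually, that workflow can be summarized in a short sketch. The example below (in Python, using hypothetical student IDs, quiz scores, and activity names; EduStar’s actual implementation is not public) shows the basic logic the article describes: randomly split a class roster into two treatment arms and a control, then compare average pre-to-post quiz gains by arm.

```python
# A minimal sketch of the three-arm trial design described above, not EduStar's actual code.
# Student IDs, quiz scores, and arm names are hypothetical placeholders.
import random
from statistics import mean

ARMS = ["activity_a", "activity_b", "control"]

def assign_arms(student_ids, seed=0):
    """Randomly split a class roster evenly across the three trial arms."""
    rng = random.Random(seed)
    shuffled = student_ids[:]
    rng.shuffle(shuffled)
    return {sid: ARMS[i % len(ARMS)] for i, sid in enumerate(shuffled)}

def learning_gains(assignments, pre_scores, post_scores):
    """Average post-quiz minus pre-quiz score for each arm."""
    gains = {arm: [] for arm in ARMS}
    for sid, arm in assignments.items():
        gains[arm].append(post_scores[sid] - pre_scores[sid])
    return {arm: mean(vals) for arm, vals in gains.items() if vals}

if __name__ == "__main__":
    roster = [f"student_{i}" for i in range(30)]
    arms = assign_arms(roster, seed=42)
    # Hypothetical quiz scores out of 10; a real trial would record these on the platform.
    pre = {sid: random.randint(2, 6) for sid in roster}
    post = {sid: pre[sid] + (2 if arms[sid] != "control" else 0) + random.randint(-1, 2)
            for sid in roster}
    print(learning_gains(arms, pre, post))
```

In this sketch, the control arm’s gain serves as the baseline against which the two learning activities are compared, mirroring how the post-activity quiz is given to all three groups even though one received no instruction.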

As the trials are underway, teachers can view reports that provide feedback as their students work through the various activities.

Basketball vs. Boredom: The Dividing Fractions Trial 

The first of the two trials asked whether sixth graders who are learning to divide fractions would do better with what Chatterji called “one boring digital learning activity … Dividing Fractions,” or another one that incorporates game-like features in a basketball theme that they gave the generic name “Basketball Dividing Fractions.”

“It turns out boring is better,” Chatterji said of the trial, which involved 544 student participants. Students spent almost five minutes more on the basketball game, but they learned less. That gave the “boring” activity the edge on two measures of quality: it taught more, and in less time.

Video Tutorial Face-Off

Another test pitted two digital tutorials by LearnZillion against each other, in three grade-level math trials taken by 1,457 participants in the sixth, seventh, and eighth grades.

The question examined was what happens when students are forewarned about common mistakes in a tutorial designed to teach a particular math skill. The first tutorial simply explained how to perform the skill; the second also pointed out common mistakes students make while learning it. In this trial, the tutorial that pointed out common mistakes was less effective than the one that simply explained how to accomplish the skill.

Beyond giving the EduStar team data to share with LearnZillion, the trial also offers insights that can be useful in classroom instruction, Stock said. “That’s what made it exciting for us,” she said, because the findings can be used to inform pedagogy going forward.

Findings from this trial would not be surprising to learning scientists, she added, but they might be to developers who have not consulted academic research before creating their educational apps.

Next Steps for EduStar

The development of the EduStar platform required significant planning and advisory work. The project has been underwritten by about $9 million in grants from the Laura and John Arnold Foundation, the Bloomberg Philanthropies/America Achieves, the Broad Foundation, and the Bill & Melinda Gates Foundation, according to Stock. (Education Week has received grant support from the Eli and Edythe Broad Foundation, as well as the Bill & Melinda Gates Foundation.)

Now, the partners are looking toward EduStar 2.0. Chatterji and Jones said they want to expand the user base to more schools and districts, and to expand and automate product evaluations, among other goals.

“Our vision is to continue providing EduStar as a free resource to schools and individual home users,” they said in their paper. Additional funds will be necessary to fulfill their vision.

“In the long term, though, revenue streams from software developers or researchers may make the platform self-sustaining,” they wrote. “We predict that, as EduStar expands in scope and automates more of its processing, the cost of RCTs [randomized-controlled trials] can fall to less than $1 per participant.”

You can see their presentation of the Hamilton Project paper here, beginning at the 9:55 point.


Follow @EdWeekMMolnar and @EdMarketBrief for the latest news on industry and innovation in education.
