
Colleges and universities that use sophisticated data systems to analyze and guide students must guard against the ethical dilemmas those systems can create, such as unfairly pigeonholing students or underestimating their abilities, according to a new report that analyzed common pitfalls of these systems.

These systems, known as predictive analytics, have the potential to reinforce disparities for disadvantaged students if used without proper training and consideration for how computers might shuffle students into categories, according to the report from New America, a Washington, D.C.-based think tank. The report was released this week at SXSWedu, a national education technology conference in Austin, Texas.

“We should always try to balance the potential of technology with its risks,” said Manuela Ekowo, an education policy analyst at New America, who co-authored the report with Iris Palmer, a senior policy analyst. “We shouldn’t let technology blindfold us to the ways it can do harm, particularly to students from underserved and under-represented backgrounds.”

Predictive analytics uses data to help inform instruction or academic guidance for students. It takes past performance, such as grades or other data points thought to be correlated with academic success, and attempts to distill that data into reports that can help ensure a student stays on track for graduation. It is a growing trend in higher education, but it is not ubiquitous: a report last fall from the same authors estimated that fewer than half of the higher education institutions surveyed use data this way.
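To make the mechanics concrete, here is a minimal sketch of the kind of risk scoring such systems perform. Everything in it – the features, the records, the threshold – is an illustrative assumption, not the model of any institution named in the report.

```python
# Minimal, hypothetical sketch of predictive analytics for retention.
# All features, records and thresholds here are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [GPA, credits completed, absences]
past_students = np.array([
    [3.6, 15, 1],
    [2.1,  9, 7],
    [3.0, 12, 3],
    [1.8,  6, 9],
])
graduated_on_time = np.array([1, 0, 1, 0])  # outcomes observed later

model = LogisticRegression().fit(past_students, graduated_on_time)

# Score a current student. The output is a correlation-based estimate,
# not a verdict on what the student can or will achieve.
current_student = np.array([[2.4, 10, 5]])
risk = 1 - model.predict_proba(current_student)[0, 1]
if risk > 0.5:  # arbitrary cutoff; in practice a prompt for human review
    print(f"Flag for an in-person advising meeting (estimated risk: {risk:.0%})")
```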

Some say these methods can help improve student outcomes. Georgia State University, for instance, uses this type of program to identify students from disadvantaged backgrounds, who are more likely to drop out, and steer them toward success, according to the report. Administrators there use data to alert them to pressure points for students. For instance, the system posts an alert if a student flunks a course or does not complete a required class on time. And, significantly, the university hired 42 new advisors – human beings, not the computer version – and reported that in one year they used data to schedule 43,000 in-person meetings between students and advisors.
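Alerting of that kind is simple in principle. Here is a minimal sketch – with made-up course records and rules, not Georgia State's actual system – of how such flags might be raised:

```python
# Hypothetical sketch of rule-based early alerts: flag a student who fails
# a course or has not completed a required class on schedule.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class CourseRecord:
    course: str
    grade: Optional[str]   # None means the course has not been taken yet
    required_by_term: int

def alerts(records: List[CourseRecord], current_term: int) -> List[str]:
    flagged = []
    for r in records:
        if r.grade == "F":
            flagged.append(f"{r.course}: failing grade")
        elif r.grade is None and current_term > r.required_by_term:
            flagged.append(f"{r.course}: required class not completed on time")
    return flagged

# Example: one failed course, one required course never taken
records = [
    CourseRecord("MATH 101", "F", required_by_term=1),
    CourseRecord("ENGL 102", None, required_by_term=2),
]
print(alerts(records, current_term=3))
# ['MATH 101: failing grade', 'ENGL 102: required class not completed on time']
```

Each flag, in the report's telling, routes to a human advisor rather than to an automated decision.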

In other words, a key to the success story was human intervention, not a computer cure-all.

Even the best-intentioned programs can fail if educational institutions do not consider, and plan for, ethical dilemmas, according to the new report. Schools might use data to justify enrolling more affluent students and fewer poor students, for example, because people from poorer families tend to face more challenges that make them less likely to graduate on time.

“You have to make sure you are not baking in historic inequalities,” Palmer said.
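A toy demonstration, with entirely invented numbers, of how that baking-in happens: if historical records show lower on-time graduation for low-income students (because of outside pressures, not ability), a model fit to those records will score a low-income applicant below an identical high-income one.

```python
# Toy demonstration (all numbers invented) of a model reproducing
# a historical inequality that exists in its training data.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented history: [high school GPA, income bracket (0 = low, 1 = high)],
# where on-time graduation tracked income, not grades.
X = np.array([[3.5, 0], [3.5, 1], [3.4, 0], [3.6, 1], [3.3, 0], [3.4, 1]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)

# Two applicants with identical grades get very different scores,
# purely because the model learned the historical income gap.
twins = np.array([[3.5, 0], [3.5, 1]])
print(model.predict_proba(twins)[:, 1])  # low-income twin scores far lower
```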

The authors suggest five steps that schools should take as they roll out these programs. Among the suggestions: train staff, and include students in the decision-making process by giving them access to their own data and teaching them to use it.

Often, students are not given access to their own data or the systems that predict outcomes. Some students say they worry that they will be unfairly sidelined if a computerized system says they are not cut out for college work.

Students are key to ensuring the systems in use are accurate, and they must be empowered to challenge the system, Ekowo said. Faculty and staff, for their part, need to know that the predictions reflect correlation, not causation, Palmer said. The data is a guide, not a map that must be followed to its conclusion.

“You need to train people who are using the data so they know these are not predictions – especially when we call it predictive analytics,” she said. “Really, it’s misnamed.”

This story was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.
