Privacy push must not prevent personalized learning


Jun 18, 2015

We can all remember getting tests back with a big grade at the top and “X’s” marked next to the problems we got wrong. If we were lucky, a teacher, tutor, or parent was willing to go over those mistakes with us and fill in the gaps where we had struggled. If we weren’t so lucky, or frankly, if we had no real incentive to do so, we probably shoved the test into our backpacks and moved on to trying to get a handle on the next batch of material on which we would be tested in the coming weeks.

This often-broken cycle of addressing what students don't know is not the fault of teachers or students. Classrooms simply were not designed to pinpoint each individual student's understanding with the precision we might hope for.

Education technology, however, is shifting what’s possible—and the benefits for students and teachers could be tremendous, so long as recent attempts to protect student privacy—itself an important goal—don’t get in the way.

Using online learning, assessment and data analysis, we can more precisely pinpoint what students know and where they are still struggling at the moment they are struggling. We can then use that information to drive better learning.

Teach to One, New Classrooms' instructional model for math, showcases the opportunity. Central to the Teach to One model is an online program that helps create an individualized learning experience for each student. The software algorithm identifies exactly what a student does and doesn't understand on a daily basis and directs each student to online and offline experiences that meet his or her needs. Teachers in a Teach to One classroom have up-to-date information on how every student is progressing through the material and can act accordingly.

Software innovations like these stand to offer far more granular and accurate information than a weekly or even monthly pen-and-paper test that teachers grade by hand. Blended learning, the mix of online learning in brick-and-mortar schools, can shift how teachers allocate their time by allowing them to work directly with students based on individual students' needs, rather than simply lecturing to an entire class that may have vastly different levels of understanding.

Linda Howard, for example, a sixth-grade English teacher at Morton Middle School in Fall River, Mass., describes her experience teaching in a model in which students rotate between working online and face-to-face learning: "I get to work with small groups a lot more. I understand my kids so much better now. Working with them individually and having their data from i-Ready [one of the school's digital content providers] has really opened my eyes to each kid's strengths and weaknesses."

Howard is not alone. In a recent survey by the Bill & Melinda Gates Foundation, the majority of teachers reported that data and digital tools make them better teachers. In the same survey, nearly seven in 10 teachers (69 percent) said that tailoring instruction to meet the needs of individual students is required to improve student achievement.

But some student privacy policies under consideration in Congress, and in numerous state capitals, threaten to stymie these efforts. Although teachers would still be able to make use of digital tools, new privacy laws could place onerous reporting and disclosure requirements on technology vendors regardless of their size, as well as restrictions on researchers' ability to study tools' effectiveness over time and on vendors' ability to evolve their products based on student performance data. This in turn stands to have a chilling effect on the boom in education technology entrepreneurship and private investment currently bringing new tools into classrooms.

The most restrictive of these bills, which Sen. David Vitter (R-La.) introduced recently, would dramatically constrain teachers’ and administrators’ ability to use technology to personalize learning. The bill threatens to limit the creation of state longitudinal data systems that could show student growth over time, the use of so-called “psychological testing” (which, in this case, means assessments of non-cognitive or social and emotional learning), and the use of personal computing devices and video in classrooms (including for teacher evaluation).

The fierce debate about student privacy risks overshadowing the goal of ensuring that all students benefit from the enormous breakthroughs that technology makes possible in 21st-century schools. Fears that data will be leaked, or worse, sold, are real concerns with which all of the bills under consideration grapple. Companies should do their share to allay these concerns. Rather than simply comply with congressional requirements around privacy, EdTech companies ought to lead the way by proactively adding more security features and better privacy policies in order to win the trust of students, parents, and teachers.

But we should be cautious about myopic legislation when it comes to affording teachers and students the ability to reap the incredible benefits that technology can bring in helping all students learn. We certainly don't want bad actors to know what our students do and don't know, but we desperately need to arm teachers with the tools that give them exactly that insight.

Michael is a co-founder and distinguished fellow at the Clayton Christensen Institute. He currently serves as chairman of the Clayton Christensen Institute and works as a senior strategist at Guild Education.

Julia is the director of education research at the Clayton Christensen Institute. Her work aims to educate policymakers and community leaders on the power of disruptive innovation in the K-12 and higher education spheres. Be sure to check out her book, "Who You Know: Unlocking Innovations That Expand Students' Networks" https://amzn.to/2RIqwOk.