
Automated Proctors Watch Students. Now Senators Are Watching These Companies.

By Tony Wan     Dec 8, 2020

Companies offering proctoring tools that monitor students as they take online exams are now being watched themselves—by Democratic senators.

Led by U.S. Senator Richard Blumenthal (D-CT), a group of six Democratic senators sent letters last week to three proctoring companies—ExamSoft, Proctorio and ProctorU—inquiring about the technologies they use to monitor users, how they ensure accuracy and what steps they take to protect students’ privacy.

Their concerns come amid a growing number of reports that these tools, which increasingly rely on algorithms and automated technology, not only wrongly accuse students of cheating, but also discriminate against certain students based on their skin color, facial attire and appearance.

“As we have seen far too often, students have run head on into the shortcomings of these technologies—shortcomings that fall heavily on vulnerable communities and perpetuate discriminatory biases,” the senators write. They add: “It is critical that bias, including racial and gender disparities, be addressed expeditiously to ensure that our students of color are not facing additional barriers in their fields.”

Along with Blumenthal, the letter’s signatories include Cory Booker (D-NJ), Chris Van Hollen (D-MD), Tina Smith (D-MN), Elizabeth Warren (D-MA) and Ron Wyden (D-OR).

“We are concerned that the software has not been designed to be inclusive and mindful of all students’ needs and proctors are not getting the training or information they need to adequately work with and oversee students taking the exams,” the senators write.

A New York Times article referenced in the letters details accounts from students who say the identification systems used in proctoring tools had trouble recognizing their darker skin tones, blocking them from accessing their exams, and penalized students with special needs who had requested and received accommodations.

Other outlets have shared worries from students and privacy advocates over how much information these tools collect through features like eye-tracking technology and requirements that students show their entire room and provide personal information during sessions that are often recorded.

“While all this information can be useful for maintaining integrity in testing and ensuring that student needs are being met, questions remain about where and how this data is being used before, during, and after tests, by both your company, the virtual proctors, and testing administrators. Students relying on your software to further their education have put a great deal of trust in you to preserve their privacy. You must be able to demonstrate that you are respecting students’ privacy,” the letter reads.

The remote proctoring industry has been around for decades. But as companies shift away from relying on humans who watch test-takers through webcams to algorithms and artificial intelligence, they have attracted scrutiny from those who point to studies showing that general facial-recognition software can be rife with bias and false positives.

Still, business is booming as more schools and colleges rely on these tools while students attend class and take tests from home during the pandemic. Proctoring companies say they have seen usage skyrocket this year, although this has been accompanied by growing outcry. More than 60,000 students nationwide have petitioned their colleges to stop using Proctorio, an earlier EdSurge report found.

Developers of proctoring tools say that although their software relies on artificial intelligence to flag suspected instances of cheating, it is ultimately up to a human, usually the instructor, to review flagged cases and make the final call.

“Ultimately, edtech vendors and school leaders should ask not just ‘Can we do this?’ but also ‘Should we do this?’” said Cody Venzke, a policy counsel for the Student Privacy Project at the Center for Democracy and Technology, in an email interview. “Use of technology such as remote proctoring software and facial recognition requires thinking about ethical data use by engaging students and their families, limiting the types of data gathered and its subsequent use and retention, and ensuring that it does not have a disparate impact on students of color, students with disabilities, and transgender students,” he added.

In their letters, the senators ask the three proctoring companies to outline what steps they have taken to ensure the accuracy of their tools, accommodate individuals with special needs, train human proctors, comply with student privacy laws and respond to student complaints, among other questions. They asked for responses by December 17.

This is the latest Senate inquiry into the data collection practices of education technology companies. Last year, Blumenthal also co-authored a letter sent to over 50 companies inquiring about the data they collect and how that information is safeguarded or shared.

While these inquiries raise public awareness of companies’ practices, Venzke added that “it’s important to remember that the law is the floor, not the ceiling. Although certain uses of data and technology may be legal, they might not be in the interest of students’ safety and well-being, which is ultimately what we should be concerned about.”

“We look forward to cooperating with the Senators’ inquiry, and we appreciate the opportunity to share information with policymakers about the important role we play in helping millions of students and professionals meet their learning goals,” ProctorU CEO Scott McFarland said in a statement to EdSurge.
