Automated surveillance spread during the pandemic. It’s likely to stay, despite worries
The most controversial new technological tool for colleges since the start of the pandemic is automated proctoring, which aims to detect cheating on online exams using algorithms that watch students through their webcams, check for suspicious behavior, and often send clips of questionable moments to professors for later review.
In the past few months alone, a law student has sued an automated proctoring company, students have complained about the tools in student newspaper editorials, and professors have compared them to Big Brother.
These complaints add to earlier pushback, which included petition campaigns that drew tens of thousands of student signatures against the approach, a statement from the University of Michigan at Dearborn that the institution would not use automated proctoring tools, and even a retreat by one proctoring company, ProctorU, which decided to stop selling software that uses algorithms to detect cheating, although it still sells services that employ remote human proctors to do that work.
Despite all of this opposition, and even as colleges return to in-person teaching, sales of proctoring software have remained strong. A recent Educause study found that 63 percent of colleges and universities in the United States and Canada mention the use of remote proctoring on their websites.
And some analysts watching the edtech space expect colleges to continue signing up for the services to make them an option for faculty to use.
“As far as I know, the business is holding up,” says Trace Urdan, managing director of Tyton Partners, an education consulting firm. “The story with a lot of edtech is that the pandemic has catalyzed a lot of growth, and adoption holds up even once in-person [teaching] returns.”
One of the reasons colleges are clinging to monitoring tools, Urdan adds, is that many colleges plan to expand their online course offerings even after campus activities return to normal. And the pandemic has also seen the rapid growth of another tech trend: students using websites to cheat on exams.
“There is a lot of concern in higher education about Chegg and Course Hero,” says Urdan, “which everyone recognizes as cheating tools.”
The makers of Chegg and Course Hero, for their part, argue that their services are not intended as cheating tools, and they point to acceptable-use policies and other efforts meant to discourage cheating. But corporate marketing language promises easy answers to struggling students, and many students say the services have a reputation as cheating aids. Professors, meanwhile, blame these companies for starting the arms race that created the automated proctoring market in the first place.
Rethinking the test
Those who oppose automated surveillance cite several objections.
Some say the systems often produce false positives, add stress to the testing process, and invade privacy. And darker skin tones can prove particularly tricky for the algorithms, raising concerns about the fairness of the technology. Still others point out that savvy students can always find ways around the surveillance software.
The controversy has led some professors to advocate designing assignments whose answers are harder for students to find online, such as project-based work. And others have worked to protect academic integrity without using proctoring tools at all.
Professors from the University of Maryland, Baltimore County presented one such approach at the recent Educause edtech conference in Philadelphia.
They used a feature of the Blackboard Learning Management System to randomize exam questions in an introductory chemistry class.
“We randomized the students into four groups,” says Tara Carpenter, a UMBC instructor who taught the course. “We used the Blackboard settings to say that group 1 will start with [questions in] group A,” she adds, noting that there were four sets of questions and that each group’s questions were presented in random order.
“We were trying to do everything we could so that if two students sat down together thinking they were going to take the exam at the same time, it wouldn’t help them at all,” she adds.
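The scheme Carpenter describes can be sketched in a few lines of code. This is a minimal illustration, not Blackboard’s actual mechanism; the function and variable names here are hypothetical:

```python
import random

def assign_exams(students, question_pools, seed=None):
    """Give each student one of several question pools, in a shuffled order.

    students: list of student IDs
    question_pools: dict mapping pool name (e.g. "A"-"D") to a list of questions
    Returns a dict: student ID -> (pool name, shuffled list of questions)
    """
    rng = random.Random(seed)  # seedable for reproducibility
    pool_names = list(question_pools)
    assignments = {}
    for student in students:
        pool = rng.choice(pool_names)        # randomize students across pools
        questions = question_pools[pool][:]  # copy, so the master pool is untouched
        rng.shuffle(questions)               # randomize order within the pool
        assignments[student] = (pool, questions)
    return assignments
```

The effect is the one Carpenter describes: two students sitting side by side are unlikely to draw the same pool, and even if they do, they see its questions in a different order.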
Despite all of these efforts, a few students still used Chegg to cheat, posting exam questions on the site and having a paid expert answer them (the site guarantees answers within half an hour, according to Carpenter).
“After each exam, we would check Chegg to see if anyone had posted,” she says, and when they found a couple, they asked Chegg to reveal the identities of the students who had posted the questions. “Getting the information from Chegg takes a waiting period,” she adds. But she says they could often tell who had posted the questions just by noting which question was posted at what time. “We often found out who the cheater was before Chegg came back to us.”
Most of the students who used Chegg to cheat did so out of “desperation” because they were not passing the course going into the final, says Sarah Bass, another UMBC lecturer who helped develop the randomized chemistry exam. She points out that most students are honest, but that instructors always want to make the process as fair as possible.
Carpenter agrees. “There is a mindset among some professors who think the default is that students want to cheat,” she says. “In reality, based on my experience, it is a very small fraction of students who intend to cheat.”
The professors initially tried remote proctoring software, adopting a system from Respondus that monitors students and their activity and lets instructors lock student browsers remotely so they cannot open other windows.
But they abandoned that approach when they found that many students couldn’t use the software because it wasn’t compatible with Chromebooks. And some students complained about having to install the software on their computers. “Students rightly have their own concerns about having to download and use this software on their personal devices,” Bass says.
The professors decided it was worth the extra effort to avoid the proctoring software. “One of the things that we’re very passionate about is fairness for students,” says Carpenter.
One question is whether other instructors will make this kind of effort or choose the often easier option of remote proctoring software.
At the University of Wisconsin at Madison, officials renewed their contract with an automated proctoring provider even after more than 2,000 people on campus signed a petition calling for the technology to be banned. A spokesperson for the university told the student newspaper that the number of professors using the tool has “drastically decreased” since the spring semester.