The COVID-19 pandemic changed the landscape of education in a matter of weeks when nearly every student in the US was forced to transition to emergency remote learning (ERL). Schools did the best they could to ensure children had access to lessons: providing laptops and broadband internet for homes without them, introducing dozens of new educational technology (EdTech) tools to engage students and track their progress, and even giving out teachers' personal phone numbers. Policies shifted, sometimes daily, to address new issues that arose from all the changes. Unfortunately, in the wake of a full-blown crisis, important safety measures for privacy and security took a backseat to simply making sure schools could function.
Researchers from the AIR lab at the University of Chicago's Department of Computer Science, led by associate professor Marshini Chetty, and the University of Maryland took the opportunity to study how remote learning impacted teachers, parents, and PreK-6 students with regard to privacy and security. The research team's goal is to better prepare decision makers to tackle these problems before another crisis strikes. Through a series of 29 interviews with teachers and parents, they were able to identify key tensions and breakdowns that negatively influenced the remote learning environment during the pandemic. Their paper detailing the qualitative research will be presented at the 26th ACM Conference on Computer-Supported Cooperative Work and Social Computing, a top venue for social computing scholarship.
One of the focal points of the paper is how privacy and security concerns surrounding emergency remote learning are not purely a technical issue, but rather grounded in a “contingent sociotechnical system,” a term the paper contributes and defines as an assemblage of technologies and social actors that exist in a particular unforeseen state due to a crisis or emergency event.
“We want to look at both the social and technical aspects of privacy and security,” said third-year PhD student and lead researcher Kelly Wagman. “A lot of prior work in computer science surrounding topics of privacy and security tends to be very technical, where they are just looking at it as a technological problem. Here we wanted to highlight how it’s actually this really intertwined, messy problem, where we need to look at improving both sides if we are going to make a difference.”
Wagman and her team found that several of the breakdowns occurred within the sociotechnical infrastructure of emergency remote learning. The first was a lack of attention to privacy and security issues as teachers and parents hastily constructed a suite of tools needed to make ERL work. Across all 29 interviews, participants named over 80 different tools used for things like video calling, grading, learning activities, and general communication. The sheer amount of new technology being used made it difficult to answer questions from parents about how to adapt. Some parents and most elementary school children don't have access to email, leaving teachers to make the tough decision about what other forms of contact they were comfortable sharing. In order to prioritize community and mitigate challenges, some teachers chose to give out their personal phone numbers, something they might not have chosen to do before the pandemic for privacy and safety reasons.
Most of these EdTech tools require some sort of authentication to protect the privacy and security of students. However, students in Pre-K and elementary school do not usually have email accounts or the ability to remember multiple passwords. This caused additional challenges for everyone involved, especially when administrators weren't clear on an alternative method of authentication aside from email addresses.
The ambiguous and rapidly shifting school policies didn't make things easier. As issues arose, administrators would disseminate new policies to teachers and parents. Oftentimes there would be gray areas in a policy, leaving individuals to interpret what to do when it came to things like camera usage, data sharing, and checking for engagement or attendance. Some schools required teachers to record their lessons and send them out to students after class. The recordings would have children's names, faces, and voices on them, causing data privacy concerns for teachers who didn't have explicit instruction on what to do with those recordings after they were sent out.
Requiring students to have their cameras on opened the door for other students to see what was happening in the privacy of their peers’ homes as well. Sometimes that included inappropriate behavior, or even child abuse.
“More than one person talked about seeing inappropriate things, like child abuse,” Wagman stated. “That was tough to hear. Teachers have a responsibility when they are in the classroom to report those types of things if they suspect it. So it felt like, on the social side, we really need to think through and be clear on what teachers should be doing in this uncharted territory.”
There was an overall blurring of the boundaries between home and school during ERL that created tensions around the respective roles of parents and teachers. EdTech tools like GoGuardian allowed teachers to view the entire screen of a student, as well as close out tabs or lock students out. Schools would equate this to a similar form of redirection in the classroom when teachers saw a student disengaged, but some parents and students felt this was not only an invasion of privacy but an overstepping of authority. A 2020 report supported this fear, revealing that teachers could initiate calls to student webcams through GoGuardian that were automatically answered, providing a window into students' homes without consent.
What schools were doing with the data they gathered from monitoring student devices was an additional concern for parents. Teachers were often left to carry the burden of making decisions about student data collection because administrators and policy makers weren't doing it preemptively. That takes both time and resources to manage successfully, something teachers don't usually have.
“Given that teachers already have a full plate, coming up with ad-hoc policies during an already stressful time in the world was tough,” said Chetty. “It also speaks to the need for more proactive approaches to privacy and security policies given that the technologies used during the pandemic are still very much being used in learning today, albeit to a lesser extent. Short of providing more funding for teachers, having schools think more clearly about these issues before disaster strikes would go a long way to capitalize on lessons learned from ERL and associated privacy and security havoc.”
Wagman reiterated that the responsibility for putting policies in place should be shared across the state, school, and classroom levels.
“I think of it as an interconnected web more than a top down or bottom up system. You’ve got different actors, including parents, teachers, administrators, and state policy makers. They’re all connected to each other, and it shouldn’t be the responsibility of one single actor to make decisions. We need to be asking how they are connected, how technology is embedded in that web, and how can each group contribute in a meaningful way?”
There is no magic fix for the problems facing education, but the research team suggests reframing the approach to privacy and security for young children as a labor of care. This framing implies a continuous maintenance process that requires ongoing attention and dedicated resources.
“The framing of care elevates privacy and security from a secondary need to a primary component,” said Chetty. “Teachers and parents have to think about the wellbeing of kids in the 21st century, which consists of both the real and online world.”