In today’s classroom, educational technologies are as ubiquitous as pencils, rulers, and notebooks. Platforms such as Google Classroom and Blackboard, devices such as iPads and Chromebooks, and — during the COVID-19 pandemic’s shift to remote learning — video conferencing tools such as Zoom have become essential, day-to-day tools for students and teachers. But like all commercial and web-based technologies, these tools carry security and privacy risks, made all the more complicated by their use by young children and teens.
To date, little research has been conducted on these issues, including what data these devices and software collect on schoolchildren and how that information is subsequently used. With a new five-year NSF CAREER grant, UChicago CS assistant professor Marshini Chetty will conduct a large-scale study of educational technologies used by primary and middle school students, using approaches from human-computer interaction (HCI) such as user-centered design.
The project will catalog the educational technology landscape, assess the risks of these tools, survey parents, students, and teachers about their experiences with educational tech, and develop new tools that protect student privacy and online security.
“We need to understand how these educational technologies affect kids’ privacy over time,” said Chetty, whose research focuses on making the internet more trustworthy and inclusive. “We need to know the answers to some of these questions so that we can update regulations and ensure that everyone has a good understanding of what data is being collected, for what purpose, and make sure that the school has what it needs while students’ privacy is protected.”
The project originated long before the pandemic and remote schooling put educational technologies in the spotlight, Chetty said. In previous work at Princeton University and the University of Maryland, Chetty and her collaborators studied how schools teach children about privacy and security risks when using the Internet, both inside and outside the classroom. But during that work, many teachers expressed concern about the technologies used in school and their potential for misuse, such as the ability to remotely control a school laptop’s camera or collect data on student activities, educational or otherwise.
Subsequently, Chetty found that existing regulations and laws, such as the Children’s Online Privacy Protection Act (COPPA) and the Family Educational Rights and Privacy Act (FERPA), do not address many of these concerns around educational technologies. In many cases, so little is known about what data the technologies collect — and what companies do with it — that it would be difficult to regulate, she said.
“When you have lots of technologies being adopted by school districts, the school has really good reasons for it: often, it's cheap and bundled, so you get a whole host of services for teachers and students,” Chetty said. “Where it becomes problematic is that, while the school gets some data from the company, we don't really have a good understanding of what the company itself is using in terms of data collection. Is it being collected for the company to profit from eventually? How long is it retained? Who has access to the data? How much control do parents and teachers and kids have? These are all open questions for both science and policy.”
To fill in these unknowns, Chetty will lead an effort examining educational technologies currently used by children in grades 3-7 in two of the largest American school districts: Chicago and Houston. The study will survey parents, children, and teachers about their use of educational technologies before, during, and after the pandemic, with a focus on their knowledge and concerns around data collection. Simultaneously, the team will assess the 100 most-used educational technologies for their data collection practices, privacy policies, and other relevant features.
The researchers will then conduct “co-design” efforts with parents, teachers, and children, developing new approaches and technologies that protect student privacy when using these tools. Drawing upon the interdisciplinary approaches of human-computer interaction, co-design could create privacy protections ranging from plugins that inform a student when data is being collected and how it will be used to low-tech solutions such as a physical “shutter” for webcams. Previously, Chetty has conducted similar policy-oriented research for online protections against manipulative “dark patterns” on shopping sites and undisclosed paid endorsements on social media.
“The idea with co-design is to work with each of these stakeholder groups and try to come up with something that could be either integrated into what's already there, or pave the way for something new that could make these tools more transparent and controllable,” Chetty said.
The project will also work with UChicago CS student organization compileHer and the University of Chicago Office of Special Programs’ college readiness program on outreach to local schools and students, and hold seminars to inform parents of students about privacy and data collection. More broadly, the project’s findings could help policymakers draft new regulations and protections for student privacy, and help educators and school systems make decisions and create procedures that balance the promise and risks of educational technologies.
“It would be a huge achievement if I could actually affect some of these regulations, or even have some new regulations come out,” Chetty said. “I'd love for us to be able to impact schools on a larger scale, helping schools to understand the privacy implications of choosing certain technologies for learning.”