In its three decades of existence, the internet has been put to countless uses. But sometimes, it feels like the internet is using us. Deceptive or hidden advertising, security flaws and data breaches, and other predatory practices can make the web feel like unsafe territory for even experienced users. Using the multidisciplinary toolbox of human-computer interaction, new UChicago CS faculty member Marshini Chetty wants to even the scales.
“I basically study how people use the internet and I try to empower them with information and control of the different aspects of internet use,” said Chetty, who started this summer as an assistant professor in the department. “I try to answer fundamental questions and then develop systems to help people actually manage or understand more about their usage in some way.”
It’s a pursuit that led Chetty from South Africa — where she grew up and started her education — to the United States, passing through Georgia Tech, the University of Maryland, and Princeton on her way to UChicago. Her current projects seek to understand and develop tools for a broad range of internet users, from young children tapping a tablet for the first time to underserved and marginalized communities to system administrators for large companies and organizations.
A research paper on the latter group, co-authored with Frank Li, Lisa Rogers, Arunesh Mathur and Nathan Malkin, recently took home the Distinguished Paper Award at the 2019 USENIX Symposium on Usable Privacy and Security (SOUPS). System administrators and how they implement software updates were put, uncomfortably, in the spotlight after the 2017 Equifax data breach, which was blamed on the failure to patch a well-known vulnerability.
Unlike casual computer users facing the occasional software update request, system administrators for corporations and government agencies must manage hundreds or even thousands of machines. Chetty’s study surveyed 102 sysadmins and found a lack of standardized procedures in the field for how to learn about new vulnerabilities and patches, when to install updates, and even whether to deploy every patch supplied by software developers.
“No matter the size of the organization, whether it was just five people or thousands of people, it was surprising to me that not everyone has a defined policy,” Chetty said. “Often we’d hear of very ad hoc policies of, ‘maybe we’ll do this update and we won’t do this other update.’”
Dark Patterns and Deceptive Practices
The process behind the SOUPS paper followed a familiar pattern for Chetty’s research: talk to a set of users about the problems they encounter, investigate the technical or human causes of those issues, and develop software or policy solutions with partners. Other recent projects have applied these methods to misleading online information, including disguised advertisements on social media and coercive “dark patterns” that shopping websites deploy to influence potential customers.
By law, social media users who are paid to promote products or participate in “affiliate” programs with online merchants must disclose their financial relationship with “#ad” or a similar disclaimer. But Chetty and her collaborators found that many do not follow this rule: more than 90 percent of the YouTube and Pinterest posts they examined either omitted the disclosure or buried it in confusing language. To address the problem, they developed AdIntuition, a browser extension that alerts YouTube viewers when the video they’re watching likely contains an undisclosed advertisement; the tool is now in user testing.
For the dark patterns study, the team examined 11,000 online shopping sites for common tricks, such as imposing deadlines (“Offer ends in 15 minutes!”), emphasizing scarcity (“Only 3 left in stock!”), or applying social pressure (“14 people in your area have purchased this item!”). The survey found these practices on more than 1,200 sites and determined that over 200 of those were deceptive, providing false information to users to stimulate buying.
In both cases, Chetty and her collaborators have taken the results to the Federal Trade Commission (FTC), which oversees and regulates deceptive advertising.
“The goal of that work is to try to influence design and to try and influence policymakers,” Chetty said. “We make our findings available to regulators such as the FTC so that they can actually see what this is like as an ecosystem, to help them identify which websites are using deceptive practices so that they know which cases to prosecute.”
Protecting Vulnerable Populations of All Ages
While these deceptive practices may affect a broad cross-section of internet users, Chetty also focuses on empowering and protecting particularly vulnerable populations. With her University of Maryland colleagues Tammy Clegg and Jessica Vitak and funding from Google, she’s explored the internet usage of young children, studying how they are taught — and what they understand — about online security and privacy.
“I noticed that a lot of the online safety resources were focused at older kids, kids aged 11 and older,” Chetty said. “I found that surprising, because now kids are basically almost born with a tablet in their hands. So I was curious about how you would get younger kids to start understanding the basics of internet safety before they get to the time where they are using the internet and social media regularly.”
Chetty also returns to South Africa regularly to study technology in resource-constrained areas, conducting fieldwork with her students on topics such as how low-income users handle security and privacy on mobile phones. This general process of using HCI as a framework for making technology safer and more accessible will be the subject of Chetty’s first course at UChicago in winter quarter, Inclusive Technology: Designing for Underserved and Marginalized Communities. Chetty believes that her multidisciplinary approach to HCI and computer science will thrive at the University of Chicago, and she looks forward to forming partnerships both on and off campus.
“Because the university is situated in a city and has a lot of professional schools, there's a lot more opportunity for really interesting collaborations that have a broader impact,” Chetty said. “Since my research is really inspired by empowering people through technology, this seems like a really great place to do that.”