I am a PhD student at the University of Maryland, advised by Michelle Mazurek. My research focuses on the intersection of computer security and everyday people. In particular, I am interested in populations whose circumstances or demographics create specific security concerns that do not fall easily under the label of “general users”.
BA in Computer Science, 2018
Oberlin College
BM in Organ Performance, 2018
Oberlin Conservatory
At-risk users are people who experience risk factors that augment or amplify their chances of being digitally attacked and/or suffering disproportionate harms. In this systematization work, we present a framework for reasoning about at-risk users based on a wide-ranging meta-analysis of 95 papers. Across the varied populations that we examined (e.g., children, activists, people with disabilities), we identified 10 unifying contextual risk factors — such as marginalization and access to a sensitive resource — that augment or amplify digital-safety risks and their resulting harms. We also identified technical and non-technical practices that at-risk users adopt to attempt to protect themselves from digital-safety risks. We use this framework to discuss barriers that limit at-risk users’ ability or willingness to take protective actions. We believe that researchers and technology creators can use our framework to identify and shape research investments to benefit at-risk users, and to guide technology design to better support at-risk users.
End users learn defensive security behaviors from a variety of channels, including a plethora of security advice given in online articles. A great deal of effort is devoted to getting users to follow this advice. Surprisingly, then, little is known about the quality of this advice: Is it comprehensible? Is it actionable? Is it effective? To answer these questions, we first conduct a large-scale, user-driven measurement study to identify 374 unique recommended behaviors contained within 1,264 documents of online security and privacy advice. Second, we develop and validate measurement approaches for evaluating the quality — comprehensibility, perceived actionability, and perceived efficacy — of security advice. Third, we deploy these measurement approaches to evaluate the 374 unique pieces of security advice in a user study with 1,586 users and 41 professional security experts. Our results suggest a crisis of advice prioritization. The majority of advice is perceived by most users to be at least somewhat actionable and somewhat comprehensible. Yet both users and experts struggle to prioritize this advice. For example, experts perceive 89% of the hundreds of studied behaviors as being effective, and identify 118 of them as being among the “top 5” things users should do, leaving end users on their own to prioritize and take action to protect themselves.
We are researchers at the University of Maryland, looking to speak with researchers and professionals who have worked in journalism or media.
Participants must fulfill the following eligibility requirements:
If you are interested, please fill out the form here: Link to Calendly scheduler