Beyond the fine print
CyLab research reimagines how we make online privacy choices
Michael Cunningham
Nov 14, 2025
Lorrie Cranor, CyLab director, moderates a panel discussion at the 2025 CyLab Partners Conference
Every day, people around the world click “agree” without thinking twice.
When using digital platforms, the average user encounters countless requests to share personal information, often through interfaces that are dense, confusing, or intentionally designed to rush decisions.
What many individuals don’t realize, according to new CyLab research, is that the way these choices are presented may be quietly influencing how much privacy they actually have.
Led by Lorrie Cranor, CyLab director, a team of researchers recently set out to explore how digital platforms can better empower users to make informed privacy decisions. Their study, “Interface Design to Support Informed Choices When Users Face Numerous Privacy Decisions,” examines how subtle design decisions—like grouping settings, spreading them across multiple screens, or preloading certain options—affect what users choose and how well they understand those choices.
The researchers designed a fictional social media platform called RightNow to test how interface features shape behavior. More than 500 participants were asked to configure 17 privacy settings using one of seven interface versions. Some participants saw one setting per screen, while others configured multiple settings on a single screen.
Some versions bundled settings by function—like notifications or ads—while others grouped them by how invasive they were. Still others included “presets”: ready-made configurations that automatically applied a set of privacy options with a single click, effectively a “shortcut” to save time.
The results were striking. Participants who used presets were significantly more likely to choose less privacy-protective options than those who selected settings individually. For example, 57 percent of preset users agreed to have their posts used in advertisements, compared to just 10 percent of users who made each choice themselves.
Moreover, participants who relied on presets were less likely to correctly answer questions about the settings they had selected, suggesting that the shortcuts may have reduced their actual understanding of what they had agreed to.
The findings suggest that efforts to make digital experiences easier can backfire, contributing to “privacy fatigue” and leading people to relinquish control over their personal information.
“People want to make smart choices about their data,” said Cranor. “But when interfaces are confusing or use shortcuts like presets, users can end up giving away more information than they intended.”
While features like presets had a major effect, other aspects of design, such as the number of screens or how settings were grouped, had only a modest impact on behavior. Still, user preferences were clear: most participants favored seeing between three and seven settings per screen, and about 70 percent preferred to configure their privacy options at the beginning of their experience with the app rather than later.
“These insights point to a key challenge for designers: creating systems that balance clarity, convenience, and control without overwhelming the user,” said Cranor.
For policymakers, the study carries equally important implications. As digital privacy continues to move to the center of legislative debates in DC and beyond, CMU’s research underscores how much influence interface design can have on user autonomy.
The researchers also advocate for user studies to become standard practice in privacy interface design. Just as clinical trials test the safety of new medicines, usability testing should evaluate whether digital privacy tools truly support informed decisions. Cranor noted that the research team had intended their preset design to be neutral and had not realized that it might be deceptive until they saw the user study results.
“Something as simple as the words that label a button or a default setting can have a huge effect on what people do,” said Cranor. “We need to make sure those effects are understood and disclosed so that consent is genuine and informed.”
Finally, the study points to the importance of timing and context in privacy decisions. While most participants preferred to make their choices upfront, others might benefit from a “just-in-time” approach, where privacy settings are presented at the moment they’re relevant, such as when a user is about to share a photo or turn on location tracking. Contextual prompts could help reduce fatigue while still providing meaningful control to users.
By exposing how design choices can subtly shape behavior, Cranor and her colleagues are helping policymakers and industry leaders rethink what informed consent really means in the digital age.
Co-authors of “Interface Design to Support Informed Choices When Users Face Numerous Privacy Decisions”:
- Eman Alashwali, former CyLab postdoc
- Andrea Miller, CMU Software and Societal Systems Department (S3D) Ph.D. student
- Alexandra Nisenoff, S3D Ph.D. student
- Kyerra Norton, former CyLab undergraduate research assistant
- Ellie Young, former S3D Ph.D. student
- Prerna Bothra, former CyLab research assistant
- Mandy Lanyon, CMU Social and Decision Sciences Research Associate