Finding privacy choices on websites is hard for average users.
Don’t blame them; experts also find it hard.
Daniel Tkacik
Jun 11, 2020
In a study published last year, a group of CyLab researchers found that many websites make it difficult for people to find privacy settings or opt out of targeted advertising. Those findings were based on expert opinion, so the researchers wondered: how hard is it for real users to access these choices?
The answer is obvious, but worth documenting in a scientific manner: very hard.
“It’s what I call a scavenger hunt,” said one participant in a new study presented at this year’s ACM CHI conference.
In their study, the researchers invited 24 people (13 females and 11 males with a variety of educational backgrounds and occupations) to their lab and asked each of them to find privacy choices in the account settings and privacy policies of popular websites, including nytimes.com and foodandwine.com. The tasks were presented to participants as the following scenarios:
- You just got the tenth update email from [website] today, and now you want to stop receiving them.
- You’ve been seeing advertisements on [website] for a pair of shoes that you searched for last month, and now you want to stop seeing them.
- You’re uncomfortable with [website] keeping a record of your location, and want to remove all of your data from the company’s databases.
Opting out of email marketing proved to be a relatively easy task for participants, as most of them looked for or used an unsubscribe link in an email sent by the website. This may be thanks to the Controlling the Assault of Non-Solicited Pornography and Marketing (CAN-SPAM) Act of 2003, which requires companies to provide clear explanations of how to opt out of email marketing.
“We see regulation having a really great impact on usability, since it standardized the location and look of email opt-outs,” says Hana Habib, a Societal Computing Ph.D. student and lead author of the study. “Since the law is now 17 years old, people have formed expectations.”
That’s about where the good news ended, and the scavenger hunt began. Almost all participants required assistance finding privacy choices in websites’ account settings and privacy policies.
“People really struggled,” says Habib. “But it would be wrong to blame the participants themselves.”
On one website’s help page devoted to helping people delete their data, a box labeled “Delete your data” appeared clickable, but it wasn’t.
“People tried clicking the box but were left confused when nothing happened,” says Habib. “Much of the difficulty in making these privacy choices is due to poor design and formatting.”
Many participants gave up and resorted to visiting the website’s help page, scrolling through support pages related to their task. Some even contacted the website directly via email or chat to ask, “How would I go about this?”
Overall, participants had an easier time finding privacy choices in websites’ account settings than in the usually long, jargon-filled privacy policies. But easier does not mean easy.
Based on these findings, the researchers offer companies a set of recommendations for making privacy choices easier to find and use. Just as “unsubscribe” links were standardized to appear in the same location in email footers, privacy choices could appear in standard locations on websites. They also recommend multiple paths from different parts of a website that all lead to a standard location for those privacy choices.
“The pressure is building up for companies to make these choices easier for people to use,” says Habib. “It’s up to us as academics to communicate our findings with external stakeholders like regulators or companies themselves.”
This study was conducted as part of the Usable Privacy Policy Project. Other authors on the study included:
- Sarah Pearman, Ph.D. student in Societal Computing
- Jiamin Wang, undergraduate student in Electrical and Computer Engineering and Engineering and Public Policy (EPP)
- Yixin Zou, Ph.D. student at the University of Michigan (U-M)
- Alessandro Acquisti, professor in the Heinz College
- Lorrie Cranor, director of CyLab, professor in EPP and the Institute for Software Research (ISR)
- Norman Sadeh, professor in the School of Computer Science
- Florian Schaub, assistant professor at U-M