Opting out of data use is hard, but it doesn’t have to be

Daniel Tkacik

Aug 15, 2019

Many websites share your browsing data, such as the links you click, with advertisers who can then show you more personalized ads. This kind of practice is the rule, not the exception. And while websites often offer the chance to opt out of this kind of data use, good luck figuring out how to do that.


Source: Carnegie Mellon University's College of Engineering

Co-author and University of Michigan Ph.D. student Yixin Zou presents their study at the Symposium On Usable Privacy and Security in Santa Clara, Calif.

That’s the gist of a recent study by researchers from Carnegie Mellon University and the University of Michigan.

“With new regulatory pressures and consumer awareness, there has been a lot of pressure for companies to start offering privacy choices, especially privacy choices that people can actually use,” says CyLab’s Hana Habib, a Societal Computing Ph.D. student and the lead author on the study. “We wanted to learn more about the practices that websites currently use to offer different types of privacy choices.”

The study is being presented at this week’s Symposium On Usable Privacy and Security in Santa Clara, Calif.

The researchers focused on three main privacy choices: opt-outs for email marketing, opt-outs for targeted advertising, and data deletion. They evaluated a sample of 150 English-language websites spanning a wide range of popularity.

“On a positive note, the privacy choices we focused on were pretty prevalent across the whole range of popularity,” says Habib. “However, the ways in which they were offered were inconsistent from website to website.”

For example, some websites offered targeted advertising opt-outs only in account settings, while others offered them in their privacy policies. Even within privacy policies, the researchers found no standard terminology or placement for these choices.

“The terms being used by these websites were very inconsistent,” says Habib. “If someone did manage to take a look at the privacy policy and find a privacy choice, that knowledge wouldn’t really transfer to another website.”

The researchers also recorded the number of "user actions" – clicks, hovers, form fields, radio buttons, or check boxes – required to exercise each privacy choice, starting from the website's home page. Opting out of targeted advertising required the fewest (3.16 user actions, on average), while opting out of email marketing and requesting data deletion required the most (5.32 user actions, on average).

The most extreme case was observed in the New York Times’ data deletion request form, which required 38 user actions to complete.

These results may leave readers scratching their heads. Why are companies making it so hard for users to opt out? Some may argue that companies lose money when users opt out of targeted advertising, but Habib says that’s not necessarily true. 

“There isn’t a whole lot of evidence that targeted advertising results in higher profits than contextual, non-targeted advertising,” says Habib.

The researchers offer several design and policy recommendations, such as using standardized terminology in privacy policies and placing privacy choices in a centralized, predictable location. Given how cumbersome exercising these privacy choices can be, they also recommend simplifying the process by offering users a prominent “one-click” opt-out.

Other authors on the study included University of Michigan (UM) Ph.D. student Yixin Zou, CyLab graduate researcher Aditi Jannu, CyLab research intern Neha Sridhar, CyLab research associate Chelse Swoopes, Heinz College professor Alessandro Acquisti, CyLab director Lorrie Faith Cranor, Institute for Software Research professor Norman Sadeh, and UM professor Florian Schaub.