Understanding the evolutionary roots of privacy
Caitlin Kizielewicz
Feb 22, 2022
As people’s activities and interactions continue to expand into the digital realm, concerns about digital privacy and its invasions, often expressed in terms of data rights and internet surveillance, abound. A new analysis explores privacy not as a modern anomaly but as a concept that spans time and space, social class, and degrees of technological sophistication, with an eye toward informing the future of digital privacy.
The analysis, by researchers at Carnegie Mellon University (CMU), the University of Arizona, and Stanford University, is published in Science.
“The history of privacy, as well as research from history, anthropology, and ethnography, tells us that the drive to maintain a private space is universal,” explains CyLab’s Alessandro Acquisti, professor of information technology and public policy at CMU’s Heinz College, who coauthored the analysis. “That drive is partly cultural and partly linked to visceral roots. As private spaces become harder for individuals to manage, understanding and accounting for those ancestral roots of privacy may be critical to securing its future.”
The authors delve into the evolutionary roots of privacy. From Chinese thinkers in the 3rd century BC to Javanese people in the 1950s, what society refers to today as privacy may have evolved from physical needs for security and self-interest, as well as the desire to ensure survival and evolutionary success. According to this account, people today experience a privacy gap, or mismatch: individuals who engage digitally lack the sensory cues that signal the presence of others, leaving them less equipped to make informed decisions about their digital privacy.
“A privacy mismatch implies that the ability to regulate privacy boundaries—such as choosing when to share and when to protect—is in an individual’s best interest, and when that evolutionarily rooted ability is impaired, individuals become more vulnerable online to new threats,” notes Laura Brandimarte, assistant professor of management information systems at Arizona, who coauthored the analysis.
While surveillance risks offline and online differ in numerous ways, the privacy mismatch has implications not just for personal privacy but also for privacy at the societal level: the collective ramifications of individual disclosures become more evident, spurring filter bubbles, amplifying disinformation, and increasing political polarization, with consequences for democratic elections and even societal safety.
An evolutionary account of privacy may explain the hurdles in protecting privacy online and the seemingly careless online behaviors of individuals who claim to care about their privacy. It also helps explain why a dominant approach to privacy management in the United States, notice and consent, has failed to address these problems.
While notice and consent is popular with the private sector, an overreliance on such mechanisms is ineffectual and can backfire because people may be wired to react to privacy invasions viscerally, not just deliberatively. Asking people to become responsible for problems they did not create and cannot control generates unequal burdens, disadvantaging certain groups more than others.
“To maintain privacy, society must embed it into the fabric of our digital systems,” says Jeff Hancock, professor of communication in the Stanford School of Humanities and Sciences, who coauthored the analysis. “Any approach that places not just the ability to choose but also the responsibility to protect on individuals themselves will likely fail as privacy mismatches continue to rise in frequency and significance.”
Attempts to reproduce online the visceral cues of the physical world are unlikely to solve the problem on their own. What is needed is a combination of technology and policy interventions that embed safeguards into information systems themselves, as has been done, for instance, in the auto industry. Policy changes can also foster the use of technologies that make those safeguards possible without undermining modern society’s reliance on data.
The authors call for regulations that account for what we know about privacy and foster the mass-scale use of privacy technology. Such efforts could include mandating compliance with user-centered privacy technologies in products and services, as well as privacy-preserving algorithms among data holders and corporate practices that minimize user burden and the likelihood of coercion and manipulation. These goals can be achieved by setting standards, coordinating research and development efforts, leveraging incentives, and instituting penalties and fines for noncompliance.
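The article describes “privacy-preserving algorithms” only in general terms and does not prescribe a specific technique. As one hedged illustration of what such an algorithm can look like in practice, the sketch below uses differential privacy, a widely deployed approach in which a data holder adds calibrated random noise to an aggregate statistic so that the published result reveals almost nothing about any single individual’s record. The function names, the example data, and the epsilon value here are all illustrative assumptions, not drawn from the paper.

```python
import random

def laplace_noise(scale: float) -> float:
    # Illustrative sketch: a Laplace(0, scale) sample is the difference of
    # two independent exponential samples, each with mean `scale`.
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one person's
    # record changes the true count by at most 1. Adding Laplace noise with
    # scale sensitivity/epsilon yields epsilon-differential privacy; a
    # smaller epsilon means stronger privacy but a noisier published result.
    true_count = sum(1 for record in records if predicate(record))
    return true_count + laplace_noise(scale=1.0 / epsilon)

# Hypothetical use: publish an approximate count over a small dataset of
# ages without letting the output pin down any one person's record.
ages = [23, 35, 41, 29, 67, 52, 38, 45, 31, 59]
print(private_count(ages, lambda age: age > 40, epsilon=0.5))
```

Note the design point this example shares with the article’s argument: the safeguard sits with the data holder, embedded in the system, so the guarantee holds without the individual user having to configure or understand anything.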
The analysis was funded by the National Science Foundation, the Alfred P. Sloan Foundation, and the Carnegie Corporation of New York.
Paper reference
“How privacy’s past may shape its future,” published in Science
- Alessandro Acquisti, Carnegie Mellon University
- Laura Brandimarte, University of Arizona
- Jeff Hancock, Stanford University