About Lorrie Cranor

Lorrie Cranor is Associate Research Professor of Computer Science and Engineering & Public Policy at Carnegie Mellon University. Her current research focuses on usable privacy and security. Lorrie also works on privacy-enhancing technologies, as well as Internet policy issues. She is working on applications of the Platform for Privacy Preferences (P3P), anti-phishing, and privacy and security policy management, among other things. Lorrie chaired the P3P Specification working group and designed the Privacy Bird P3P user agent.


CyLab Chronicles

Q&A with Lorrie Cranor

posted by Richard Power

CyLab Chronicles: Tell us about CUPS. What was its origin? What is its mission?

CRANOR: I founded the Carnegie Mellon Usable Privacy and Security Laboratory, or CUPS, when I joined the faculty about four years ago. I had started designing privacy tools for end users a few years earlier at AT&T Labs, and I realized there just wasn't a whole lot known about how to make security and privacy tools usable. When I decided to come to CMU it seemed like a great opportunity to build a research program in this area, taking advantage of all the CMU expertise in privacy, security, and human-computer interaction. I have been able to collaborate with students and faculty from all of these areas -- plus public policy, economics, and social and decision sciences. We now have a large number of research projects in usable privacy and security, an active seminar series, and a usable privacy and security course.

CyLab Chronicles: Tell us about SOUPS. What was its origin? What can people look forward to coming out of it this year?

CRANOR: I organized a workshop on usable privacy and security in 2004, and the people who attended were eager for more. I realized there was a growing community interested in this research area and a need for a high-quality peer-reviewed conference. So, I decided to start the Symposium On Usable Privacy and Security. The conference program includes technical paper presentations, workshops, a keynote speaker, and a poster session. This year's keynote speaker is Ross Anderson, who will be talking about the economics of security.

We also have parallel "discussion sessions" that allow for interactive discussions on topics of interest to attendees. People always tell me how much they enjoy exchanging ideas with other attendees at the discussion sessions and poster session, as well as at the SOUPS meals and social events. This year we will have a wonderful dinner overlooking the tropical rainforest at the Phipps Conservatory. On the first day of SOUPS attendees can choose between two workshops -- one on usable security management and one on accessible privacy and security. SOUPS attracts a nice mix of academic researchers, as well as practitioners from industry and government. It is also a great networking opportunity for students.

CyLab Chronicles: What aspects of your work would you like to highlight?  What new avenues are you pursuing? What is particularly important right now?

CRANOR: I have eight PhD students and thus lots of different projects going on right now. The projects are grouped roughly into three overlapping areas: anti-phishing filtering and education, privacy decision-making, and user-controllable privacy and security.

In the anti-phishing area, we have developed an online game called Anti-Phishing Phil (http://cups.cs.cmu.edu/antiphishing_phil/) and an "embedded training" system that teaches users how to protect themselves from phishing after they fall for a fake phishing email (http://phishguru.org/). We have had interest in both of these training programs from a large number of companies and the US military. Over 60,000 people have played Anti-Phishing Phil on our web site.

In the privacy area we are trying to build tools that make it easier for people to control the release of their personal information online, and we're trying to understand how people make decisions about their privacy. We're currently trying to develop a "nutrition label" for privacy that will allow people to easily understand and compare web site privacy policies just as they compare nutrition information on cans of soup. We developed a search engine (http://privacyfinder.org/) that annotates search results with privacy meter icons so that people can find the web sites that will best protect their privacy. We're conducting studies to understand how people make use of this information. We have some very exciting results that demonstrate that people may actually be willing to pay a little more to make purchases from web sites that will protect their privacy.
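
[Editor's note: As a rough illustration of the idea, the Python sketch below shows one way a privacy-meter score could be computed by matching a site's machine-readable privacy practices against a user's stated preferences. The data model, flag names, and scoring rules are simplified assumptions made for this example; they are not Privacy Finder's actual algorithm or the P3P matching rules.]

# Hypothetical sketch: score a site's (simplified) privacy practices against
# user preferences to drive a privacy-meter icon. Illustrative only.

from dataclasses import dataclass

@dataclass
class SitePolicy:
    shares_with_third_parties: bool   # data shared beyond the site and its agents
    uses_for_marketing: bool          # data used for telemarketing or profiling
    allows_opt_out: bool              # user can opt out of secondary uses
    retention_limited: bool           # data kept only as long as needed

@dataclass
class UserPreferences:
    object_to_sharing: bool = True
    object_to_marketing: bool = True
    require_opt_out: bool = True

def privacy_meter(policy: SitePolicy, prefs: UserPreferences) -> int:
    """Return a 0-4 score; higher means the site better matches the preferences."""
    score = 4
    if prefs.object_to_sharing and policy.shares_with_third_parties:
        score -= 2
    if prefs.object_to_marketing and policy.uses_for_marketing and not policy.allows_opt_out:
        score -= 1
    if prefs.require_opt_out and not policy.allows_opt_out:
        score -= 1
    if not policy.retention_limited:
        score -= 1
    return max(score, 0)

policy = SitePolicy(shares_with_third_parties=False, uses_for_marketing=True,
                    allows_opt_out=True, retention_limited=True)
print(privacy_meter(policy, UserPreferences()))  # 4: most privacy-protective icon

[In a real deployment the inputs would come from parsing the site's published machine-readable policy and the user's stated preferences, and the resulting score would select which icon appears next to the search result.]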

In the user-controllable area we are building tools that allow end users to specify privacy- and security-related policies. We've developed a visual interface for setting Windows file permissions that we've demonstrated is much easier to use than the native Windows interface and prevents users from making common errors. This has led to an interesting exploration of the presentation versus the semantics of policy languages and policy authoring interfaces.
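
[Editor's note: To make the presentation-versus-semantics point concrete, here is a minimal, hypothetical sketch of the kind of rule an interface has to convey: an effective-permission check in which an explicit deny overrides any allow, similar in spirit to (but much simpler than) Windows ACL evaluation. The types and evaluation order are assumptions for illustration, not the interface or model described above.]

# Hypothetical, simplified effective-permission check with a deny-overrides rule.
# Loosely in the spirit of Windows ACLs, but not the real semantics.

from typing import Iterable, NamedTuple

class AceEntry(NamedTuple):
    principal: str    # user or group name
    permission: str   # e.g., "read", "write"
    allow: bool       # True = allow entry, False = deny entry

def effective_permission(user: str, groups: Iterable[str], permission: str,
                         acl: Iterable[AceEntry]) -> bool:
    principals = {user, *groups}
    relevant = [e for e in acl if e.principal in principals and e.permission == permission]
    if any(not e.allow for e in relevant):   # one explicit deny beats any number of allows
        return False
    return any(e.allow for e in relevant)    # otherwise, any allow grants access

acl = [
    AceEntry("staff", "write", allow=True),    # group-level allow
    AceEntry("alice", "write", allow=False),   # explicit deny for the user
]
print(effective_permission("alice", ["staff"], "write", acl))  # False: the deny wins

[The subtlety a user has to grasp -- that membership in an allowed group does not help once an explicit deny applies -- is exactly the kind of semantics a good policy-authoring interface needs to surface.]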

We're also conducting a series of interviews to understand the needs of security administrators and learn where the access control management tools they currently use to manage files and access to physical spaces fall short. We are leveraging the Grey system -- a smartphone-based system used to unlock doors in CyLab -- as a way of further studying access control usability issues. In addition, we're working on developing a system to allow people to exercise fine-grained control over the release of their location information in a friend finder application. Finally, we're collaborating with the Parallel Data Lab on developing usable access control for home storage systems.

An overarching goal of all of these projects is to discover design patterns and generalizable approaches to usable privacy and security. I have recently proposed a "framework for reasoning about the human in the loop" that helps enumerate all of the things that have to happen to get a human to successfully complete a security-critical task. This can be used as a way of identifying all of the things that can go wrong if you expect a human to pay attention to a security pop-up warning or comply with a corporate security policy or choose a good password. Now that we have a pretty good list of what can go wrong, the next step is to figure out how to mitigate these problems. We have some preliminary ideas, but there are a lot of wide-open research questions here, and I think this problem will keep us busy for some time to come.

CyLab Chronicles: Privacy in the 21st Century seems radically different from privacy in the 20th Century. It seems as if it is under so much pressure from different vectors -- commercial, governmental, criminal, social, etc. What should people know? What should people be doing to protect their own privacy and the privacy of their loved ones? Are different strategies required for one's financial identity, one's online identity, one's medical history, one's professional life? What is the important personal takeaway for an individual reading this interview?

CRANOR: It has become very easy to expose your personal information without even realizing you are doing it. And once that information is exposed, it may be archived, copied, sold, and combined with all sorts of other information. It's really hard to anticipate what might happen to your personal information. I see a lot of students who are posting information about themselves on their web sites or in blogs or social networking sites that they may not want their parents or prospective employers to see. We talk about this in the privacy class I teach, and some of the students wake up and say they are going to be more careful. But other students argue that there will be unflattering information about everyone available online, so it won't really matter to employers or anyone else if there are crazy drunken photographs of you online.

I think to some extent people will become more accepting of these things if the skeletons in all of our closets are indexed and searchable online. But I don't think that means we should give up on privacy. As more data becomes available online, or in corporate or government databases, the risk of abuse increases as well. People should be careful about what they reveal online, and in what forums they identify themselves with their true identity. And when businesses ask for information that you don't think they really need, you should push back on giving it to them and ask about opting out of having it used or shared.

Unfortunately, in the United States, we often don't have much of a choice about giving out our information if we want to do many types of transactions. But there are times you can push back successfully. For example, many grocery stores ask for Social Security numbers or driver's license numbers when you request a frequent shopper card. But unless you want to use the card as a check-cashing card, they don't actually need that information. If all you want is the in-store discounts, they don't need any of your personal information.

