Design Methodologies for Security Applications

Researcher: Roy Maxion

Cross-Cutting Thrusts: Formal Methods


Design Methodologies for Security Applications Targeting Ambient Intelligence

User interfaces may be the Achilles heel of security-critical systems; if the interface inadvertently induces the administrator to make an undetected mistake, unintended security settings could leave the system open to attack or to malicious insider activity. Such mistakes were first demonstrated in a well-known article entitled “Why Johnny Can’t Encrypt,” in which users were misled by the interface into committing errors as serious as exposing a private key. Similar exposures have also been demonstrated in our CyLab-sponsored work on the Windows XP security interface. Because of the widespread use of XP, this is potentially a far more serious problem than not being able to encrypt.

This is a proposal for renewal/continuance of the same project from last year. Progress over the course of the project so far has been substantial, including the design, implementation, and initial testing of a prototype interface-measurement workstation. This prototype gathers precisely timed and synchronized multidimensional data from a user session, and plays it back for human and machine analysis to determine the loci and causes of user error. This monitoring and analytical approach is unprecedented in the history of user-interface evaluation, and it has already led us to discover critical operational flaws in the Windows XP interface for setting security and file-permission bits. We have run experiments to validate our hypothesis that the XP interface violates the principle of out-of-sight-out-of-mind; this is the principle that accounts for loss of awareness of complex rules that must be applied at times when knowledge of the rules is critical to correct operation. This phenomenon has been demonstrated in the XP security interface, using the software system developed so far under last year’s award. A technical report is available.

Under a renewed award we plan to extend the measurement workstation to include more sensors, as well as an automated system for detecting user difficulties. This automated system will use some of the anomaly detection algorithms that we’ve developed already for other purposes; they will be retargeted for automated detection of user errors. The result will be that a user-interface session can be automatically analyzed to determine the places where users had problems, so that remedial measures can be taken. A further goal is to be able to assess a user-interface design, a priori, to determine its flaws in advance of implementation. In addition, we plan to explore automated false-alarm mediation on the basis of an evidential calculus that will enable the system to automatically rule out detections of events that do not represent user error. With the kinds of instrumentation being developed under CyLab sponsorship, these capabilities are closer to reality than ever before.
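To make the idea concrete, one simple form of anomaly detection over the session data described above would be to flag unusually long pauses between interface events as candidate problem loci. The following sketch is purely illustrative: the event names, the latency feature, and the z-score threshold are our own assumptions for exposition, not the project's actual detectors.

```python
# Hypothetical sketch: flag likely user-error loci in a recorded session
# by treating unusually long pauses between UI events as anomalies.
# Event names and the threshold are illustrative assumptions only.
from statistics import mean, stdev

def anomalous_events(events, z_threshold=1.5):
    """Return events preceded by a statistically unusual hesitation.

    events: time-ordered list of (timestamp_seconds, event_name) tuples.
    A latency more than z_threshold sample standard deviations above the
    mean latency is taken as a sign the user hesitated at that step.
    """
    latencies = [t2 - t1 for (t1, _), (t2, _) in zip(events, events[1:])]
    if len(latencies) < 2:
        return []
    mu, sigma = mean(latencies), stdev(latencies)
    if sigma == 0:
        return []
    return [events[i + 1][1]
            for i, lat in enumerate(latencies)
            if (lat - mu) / sigma > z_threshold]

# A toy session: the long pause before "apply-permissions" stands out.
session = [(0.0, "open-dialog"), (1.0, "select-user"), (2.1, "check-box"),
           (3.0, "uncheck-box"), (14.0, "apply-permissions"), (15.0, "close")]
print(anomalous_events(session))  # → ['apply-permissions']
```

In practice the workstation's multidimensional data (keystrokes, mouse paths, screen state) would feed much richer features than inter-event latency, but the structure — score each point in the session, flag outliers, hand the flagged loci to an analyst — is the same.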

Plan of work.

  • Develop an error taxonomy that predicts where users will have problems with interfaces. This builds on a growing understanding of the kinds of mistakes users make, and the conditions under which such mistakes are made (e.g., out-of-sight-out-of-mind errors, among others).
  • Develop an automated system (based on anomaly detection) for locating user-interface problem areas; this technique analyzes the multidimensional user data already captured by the workstation built so far in the project.
  • Develop and test, under controlled conditions, alternative interfaces that obviate entire classes of user error. Demonstrate that such interfaces not only can be built, but that they are not hard to build and are more effective than current interfaces.
  • Initiate a first cut at an evidential calculus that can be used to mitigate false detections.
  • Produce a technical report; present the results at an international workshop on dependable interfaces.
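The false-alarm mediation step in the plan above can be sketched in its simplest rule-based form: each detection is kept only if no piece of contextual evidence explains it away as benign. This is a minimal sketch under our own assumptions; the detection fields and the example rule are invented for illustration and do not come from the project.

```python
# Hypothetical sketch of evidence-based false-alarm mediation: a
# detection survives only if no evidence rule explains it as benign.
# Detection fields and the example rule are illustrative assumptions.
def mediate(detections, evidence_rules):
    """Drop any detection that some evidence rule rules out as benign."""
    return [d for d in detections
            if not any(rule(d) for rule in evidence_rules)]

# Toy detections from a session analysis (fields are illustrative).
detections = [
    {"event": "long-pause", "during_help_dialog": True},   # user reading help
    {"event": "repeated-click", "during_help_dialog": False},
]

# Example rule: a pause while a help dialog is open is not a user error.
rules = [lambda d: d["event"] == "long-pause" and d["during_help_dialog"]]
print(mediate(detections, rules))  # only "repeated-click" survives
```

A fuller evidential calculus would weigh and combine multiple pieces of evidence rather than applying hard rule-outs, but the filtering structure shown here is the starting point.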

The work will result in a prototype suite of tools with which experimenters can conduct controlled empirical evaluations of user interfaces on various applications. To our knowledge, no comparable tool suite exists anywhere else in the country.