Title: Beyond Output Voting: Detecting Compromised Replicas using Behavioral Distance
Authors: Debin Gao, Michael K. Reiter, Dawn Song
Publication Date: December 2, 2006
Many host-based anomaly detection techniques have been proposed to detect code-injection attacks on servers. The vast majority, however, are susceptible to “mimicry” attacks, in which the injected code masquerades as the original server software (including returning the correct service responses) while conducting its attack. In this paper we present a novel architecture to detect mimicry attacks using “behavioral distance”, by which two diverse replicas processing the same inputs are continually monitored to detect divergence in their low-level (system-call) behaviors, and hence potentially the compromise of one of them. We detail the design and implementation of our architecture, which takes advantage of virtualization to achieve its goals efficiently. We apply our system to implement intrusion-tolerant web and game servers, and through trace-driven simulations demonstrate that our approach can achieve low false-alarm rates and moderate performance costs even when tuned to detect stealthy mimicry attacks.
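To make the behavioral-distance idea concrete, the following is a minimal sketch of the monitoring loop: two diverse replicas process the same input, their system-call sequences are collected, and an alarm is raised when the distance between the sequences exceeds a tuned threshold. The use of plain Levenshtein edit distance and the specific threshold value here are illustrative assumptions, not the paper's exact distance construction.

```python
# Illustrative sketch (not the paper's exact metric): compare the
# system-call sequences emitted by two replicas for the same request
# and flag divergence beyond a threshold.

def edit_distance(a, b):
    """Levenshtein distance between two system-call sequences."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            cur[j] = min(prev[j] + 1,          # deletion
                         cur[j - 1] + 1,       # insertion
                         prev[j - 1] + cost)   # substitution
        prev = cur
    return prev[n]

def divergence_alarm(trace1, trace2, threshold=2):
    """Alarm when the replicas' low-level behaviors diverge too far.

    The threshold trades false alarms against sensitivity to stealthy
    mimicry attacks; its value here is an arbitrary example.
    """
    return edit_distance(trace1, trace2) > threshold

# Hypothetical traces: two benign (but diverse) replicas serving the
# same request produce similar sequences, while a compromised replica
# emits extra calls (e.g., socket/exec) that widen the distance.
benign_a      = ["open", "read", "mmap", "write", "close"]
benign_b      = ["open", "read", "read", "write", "close"]
compromised_a = ["open", "read", "socket", "connect", "write", "exec", "close"]
```

A real deployment would compare traces per request pair and learn the threshold from benign training runs, but the core decision rule is the same.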
Full Report: CMU-CyLab-06-019