Evaluating Intrusion Detection Systems: The 1998 DARPA Off-line Intrusion Detection Evaluation

Richard P. Lippmann, David J. Fried, Isaac Graf, Joshua W. Haines, Kristopher R. Kendall, David McClung, Dan Weber, Seth E. Webster, Dan Wyschogrod, Robert K. Cunningham, and Marc A. Zissman
The 1998 DARPA Off-line Intrusion Detection Evaluation, conducted by MIT Lincoln Laboratory, aimed to evaluate the effectiveness of intrusion detection systems (IDS) using a comprehensive test bed that simulated a government site with hundreds of users and thousands of hosts. The evaluation involved more than 300 instances of 38 different automated attacks over seven weeks of training data and two weeks of test data. Six research groups participated in a blind evaluation, focusing on probe, denial-of-service (DoS), remote-to-local (R2L), and user-to-root (U2R) attacks.

The results showed that the best systems detected old attacks with moderate detection rates, ranging from 63% to 93% at a false alarm rate of 10 false alarms per day. However, detection rates were significantly lower for new and novel R2L and DoS attacks: the best systems failed to detect about half of these attacks, including damaging attacks that gave remote users root-level privileges. This suggests that current rule-based approaches need further development before they can reliably detect new attacks.

The evaluation also highlighted the importance of ROC (receiver operating characteristic) analysis in assessing IDS performance, as it provides a comprehensive view of the trade-off between detection rates and false alarm rates. The study concluded that future research should focus on developing techniques to detect new attacks rather than extending existing rule-based methods.
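To make the ROC methodology concrete, the sketch below shows one simple way to derive ROC-style operating points from scored IDS output: sweep a detection threshold over per-session alert scores and report, for each threshold, the detection rate against false alarms per day. This is a minimal illustration, not the evaluation's actual scoring procedure; the toy data, the per-session scoring model, and the `test_days` normalization are all assumptions introduced here.

```python
"""Minimal sketch of ROC-style scoring for an off-line IDS evaluation.

Assumptions (not from the paper): each session carries one numeric alert
score, labels mark attack sessions, and the test corpus spans `test_days`
days, which is used to normalize false alarms to a per-day rate.
"""


def roc_points(scores, labels, test_days):
    """Return (false_alarms_per_day, detection_rate) pairs, one per threshold.

    scores:    alert score per session (higher = more suspicious)
    labels:    1 for attack sessions, 0 for normal background traffic
    test_days: duration of the test data in days
    """
    n_attacks = sum(labels)
    points = []
    # Sweep a threshold over every distinct score; a session is flagged
    # as an attack when its score meets or exceeds the threshold.
    for thresh in sorted(set(scores), reverse=True):
        flagged = [score >= thresh for score in scores]
        detections = sum(f and l for f, l in zip(flagged, labels))
        false_alarms = sum(f and not l for f, l in zip(flagged, labels))
        points.append((false_alarms / test_days, detections / n_attacks))
    return points


if __name__ == "__main__":
    # Toy data: 4 attack sessions and 6 normal sessions over a 2-day test.
    scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.5, 0.3, 0.2, 0.2, 0.1]
    labels = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
    for fa_per_day, det_rate in roc_points(scores, labels, test_days=2):
        print(f"{fa_per_day:.1f} false alarms/day -> {det_rate:.0%} detected")
```

Reading a detection rate off the resulting curve at a fixed operating point, such as 10 false alarms per day, is how figures like the 63% to 93% range quoted above are obtained.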
[slides and audio] Evaluating intrusion detection systems: the 1998 DARPA off-line intrusion detection evaluation