Abstract
User testing is an integral component of user-centered design, but it has rarely been applied to visualization for cyber security applications. This paper describes a comparative evaluation of a visualization application and a traditional interface for analyzing network packet captures, conducted as part of the user-centered design process. Participants completed both structured, well-defined tasks and exploratory, open-ended tasks with each tool. Accuracy and efficiency were measured for the well-defined tasks, the number of insights was measured for the exploratory tasks, and user perceptions were recorded for each tool. The results demonstrated that, with the visualization tool, users performed significantly more accurately on the well-defined tasks, discovered a higher number of insights, and expressed a clear preference for that tool. The study presented here may be useful to future designers of network security visualization evaluations. Some of the challenges and lessons learned are also described.