Modern security operations centers (SOCs) rely on human operators and a tapestry of diverse logging and alerting tools with large-scale collection and query capabilities. Yet SOC investigations are tedious because they rely on manual effort to query diverse data sources, overlay related logs, correlate the data into actionable information, and then document results in a ticketing system. Security orchestration, automation, and response (SOAR) tools are a relatively new technology that promises, with appropriate configuration, to collect, filter, and display the needed diverse information; automate many of the common tasks that unnecessarily consume SOC analysts’ time; facilitate SOC collaboration; and, in doing so, improve both the efficiency and consistency of SOCs. SOAR tools have never been tested in practice to evaluate their effects and to understand them in use. In this paper, we design and administer the first hands-on user study of SOAR tools, involving 24 participants and 6 commercial SOAR tools. Our contributions include the experimental design, an itemization of six characteristics of SOAR tools, and a methodology for testing them. We describe the configuration of a cyber range as the test environment, including network, user, and threat emulation; a full SOC tool suite; and the creation of artifacts enabling multiple representative investigation scenarios. We present the first research results on SOAR tools. We found that SOAR configuration is critical to success, as it requires creative design for data display and automation and should involve iteration with users and vendors. We also found that SOAR tools increased efficiency and reduced context switching during investigations, although ticket accuracy and completeness (potential indicators of investigation quality) decreased with SOAR use.
Our findings indicate that user preferences from usability studies are slightly negatively correlated with performance with the tool; overautomation was a concern of senior analysts, and SOAR tools that balanced automation with assisting the user in making decisions were preferred. We also found that SOAR tools’ dependence on constant internet connectivity varies widely by tool. Finally, we release a public, user- and tool-anonymized and -obfuscated version of the study data to assist future research.
Published: April 18, 2023
Bridges, R. A., A. E. Rice, S. Oesch, J. A. Nichols, C. Watson, K. Spakes, S. Norem, et al. 2023. “Testing SOAR Tools in Use.” Computers & Security 129. PNNL-SA-176831. doi:10.1016/j.cose.2023.103201