Investigations within Security Operations Centers (SOCs) are tedious, relying on manual efforts to query diverse data sources, overlay related logs, correlate the data into actionable information, and then document results in a ticketing system. Security Orchestration, Automation, and Response (SOAR) tools are a relatively new technology that promises, with appropriate configuration, to collect, filter, and display the diverse information needed; automate many of the common tasks that unnecessarily consume SOC analysts’ time; facilitate SOC collaboration; and, in doing so, improve both the efficiency and consistency of SOCs. No prior research has tested SOAR tools in practice; hence, understanding and evaluation of their effects are nascent and needed. In this paper, we design and administer the first hands-on user study of SOAR tools, involving 24 participants and six commercial SOAR tools. Our contributions include the experimental design, an itemization of six defining characteristics of SOAR tools, and a methodology for testing them. We describe the configuration of a cyber-range test environment, including network, user, and threat emulation; a full SOC tool suite; and the creation of artifacts enabling multiple representative investigation scenarios for testing. We present the first research results on SOAR tools. Concisely, our findings are that: per-SOC SOAR configuration is extremely important; SOAR tools increase efficiency and reduce context switching, though potentially at the cost of ticketing accuracy and completeness; users’ preference for a tool is slightly negatively correlated with their performance with it; internet dependence varies widely among SOAR tools; and senior participants prefer a balance of automation with assistance for decision making. We deliver a public, user- and tool-anonymized and -obfuscated version of the data.