Abstract

Many hospitals have replaced their legacy anesthesia information management systems with an enterprise-wide electronic health record system. Integrating the anesthesia data within the context of the global hospital information infrastructure has created substantive challenges for many organizations. A process to build a perioperative data warehouse from Epic was recently published by the University of California Los Angeles (UCLA), but the generalizability of that process is unknown. We describe the implementation of their process at the University of Miami (UM). The UCLA process was tested at UM, and performance was evaluated following the configuration of a reporting server and transfer of the required Clarity tables to that server. Modifications required for the code to execute correctly in the UM environment were identified and implemented, including the addition of locally specified elements in the database. The UCLA code to build the base tables in the perioperative data warehouse executed correctly after minor modifications to match the local server and database architecture at UM. The 26 stored procedures in the UCLA process all ran correctly using the default settings provided and populated the base tables. After modification of the item lists to reflect the UM implementation of Epic (eg, medications, laboratory tests, physiologic monitors, and anesthesia machine parameters), the UCLA code ran correctly and populated the base tables. The data from those tables were used successfully to populate the existing perioperative data warehouse at UM, which housed data from the institution's legacy anesthesia information management system. The time to pull data from Epic and populate the perioperative data warehouse, measured over 100 consecutive days, was 197 ± 47 minutes (mean ± standard deviation [SD]) on weekdays and 260 ± 56 minutes on weekend days. The longer times on weekends reflect the simultaneous execution of database maintenance tasks on the reporting server. The UCLA extract process has been in production at UM for the past 18 months and has been invaluable for quality assurance, business process, and research activities. The data schema developed at UCLA proved to be a practical and scalable method for extracting information from the Epic electronic health record database into the perioperative data warehouse in use at UM. The process developed at UCLA to build a comprehensive perioperative data warehouse from Epic is extensible, and other hospitals seeking more efficient access to their electronic health record data should consider implementing it.
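As a concrete illustration of the extract mechanism summarized above, the sketch below shows, in generic T-SQL, the kind of step each stored procedure in such a process performs: copying case-level rows from a Clarity table on the reporting server into a perioperative data warehouse base table. It is a minimal sketch under stated assumptions, not the UCLA code; the procedure, table, and column names (dw.load_case_base, dw.case_base, clarity_rpt.dbo.CASE_LOG, LOG_ID, and so on) are hypothetical placeholders, since the actual Clarity schema and item lists differ across Epic installations.

-- Hypothetical example only: object and column names are placeholders,
-- not the UCLA specification or the Epic Clarity schema at any site.
CREATE OR ALTER PROCEDURE dw.load_case_base
    @start_date DATE,
    @end_date   DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Copy cases in the requested date range from the Clarity-style
    -- source table into the warehouse base table, skipping rows that
    -- were already loaded by an earlier run.
    INSERT INTO dw.case_base (log_id, surgery_date, room_id, anes_type_c)
    SELECT s.LOG_ID,
           s.SURGERY_DATE,
           s.ROOM_ID,
           s.ANES_TYPE_C
    FROM   clarity_rpt.dbo.CASE_LOG AS s
    WHERE  s.SURGERY_DATE >= @start_date
      AND  s.SURGERY_DATE <  @end_date
      AND  NOT EXISTS (SELECT 1
                       FROM   dw.case_base AS t
                       WHERE  t.log_id = s.LOG_ID);
END;
GO

A site adapting the process would replace these placeholders with its own server, database, and item identifiers, as the abstract describes for medications, laboratory tests, physiologic monitors, and anesthesia machine parameters.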
