Abstract

As the public health enterprise embarks on a Data Modernization Initiative (DMI), a national effort designed to modernize public health surveillance and related information systems,1 a brief review of the recent history of public health informatics, with attention to leadership lessons, is in order. In this review, we highlight a series of initiatives designed to improve the capacity of the public health system to manage information. By reflecting on these periods of innovation, we share a few guiding principles and best practices that are directly relevant to challenges faced by public health leaders today and provide a look into the future.

The 1970s and 1980s—the Early Years

The health care system led the way in this period, with a few public health innovators beginning to move into the informatics arena.2 Major vendors emerged by automating financials and later expanded into automating clinical processes. Then technology exploded: minicomputers and personal computers led to early, primitive clinical systems for order entry and documenting medical records. Although mainframe computing supported the automation of hospital financial and logistics processes, innovations using mini- and microcomputers accelerated automated support of the processes of care across the spectrum of patient care. At this stage of evolution, the best that automation could do was provide departmental patient data repositories that, when interfaced with other departmental systems, could give clinicians a more comprehensive set of data on a patient. To achieve even that, hospitals had to build one-to-one interfaces between separate clinical units (eg, laboratory, pharmacy). This process was tedious and costly to maintain because any change to a system on either end of an interface meant the interface had to be modified or rebuilt. Within the public health realm, a few innovations emerged. Practice-oriented thought leaders at the Centers for Disease Control and Prevention (CDC) developed the Epi Info system, which gave epidemiologists a field-ready tool that is still used across the world today.3 Also at the CDC, development of the CDC WONDER system made common CDC data sets accessible to the public health practice community; it, too, remains in use today.4 Engagement by public health innovators with standards-setting groups (eg, ANSI, HL7) also began, leading to the realization that public health needed to get into the health informatics game or risk having the health care delivery world dictate what data we would get, how they would be defined, and how we could access them.

The 1990s—an Era of Public Health Informatics Innovation

Following the publication of the Future of Public Health report5 in 1988, the CDC, under the leadership of Director Dr Bill Roper, championed the priority of strengthening the nation's public health system.6 This challenge required that the public health enterprise define itself, its core functions, and its goals, all of which are precursors to effective use of information systems. At that point, we began to ask how public health could use data better and transform data into information and knowledge to serve public health practice.7 In the 1990s, the bar of acceptable response to policy makers and the public began to shift toward faster, more localized, and more relevant data.
As browsers opened the Internet to the world, public health needed to gain the capacity to participate in what came to be called the "information superhighway." Within that context, the CDC-led Information Network for Public Health Officials (INPHO) initiative represented the CDC's first major effort to enhance the nation's public health information infrastructure, with a particular emphasis on strengthening state and local health agencies' informatics capabilities.8 The INPHO initiative focused on 3 goals: (1) connectivity (ie, providing Internet access for these agencies), (2) information access (eg, creating online information resources through Web sites), and (3) data exchange (eg, surveillance data).

The Comprehensive Child Immunization Act of 1993 assured long-term funding to states to vaccinate all children routinely and to create an electronic infrastructure capable of documenting every child's immunization history. Innovators at The Task Force for Global Health implemented the Robert Wood Johnson Foundation (RWJF) All Kids Count initiative to help states join together in common purpose to define, standardize, and implement electronic registries in every state,9 ultimately paving the way for Immunization Information Systems (IIS). The IIS remain today the singular standout: an interoperable public health information system linked to the health care sector and employing uniform national standards voluntarily adopted across all states.

Within the health care services sector, pressure for cross-provider exchange of patient data accelerated the move toward nationally unifying standards to guide the creation of Health Information Exchanges (HIEs), such as the Indiana Health Information Exchange (IHIE). These exchanges showed that the public health enterprise could benefit if public health agencies linked into their regional HIE. During this period, true institution-wide clinical automation began to take hold in health care, along with the trend toward corporate health care and the consolidation of community hospitals into systems of care that required integrated data environments.

The 2000s—Collaboration, Legislation, and Increased Investment

The 2000-2010 period was all about public health trying to catch up: to get connected to the Internet and to explore how to link with the emerging eHealth enterprise, which was adopting electronic health record (EHR) systems at a rapid pace. Major public health advances included automation in public health laboratories and the creation of the Public Health Informatics Institute, which pioneered a range of best practices in public health informatics (eg, the Collaborative Requirements Development Methodology, the Infolinks initiative, and the Common Ground initiative), some of which related to the growing need to enhance public health preparedness.10 At this stage, public health agencies came to recognize that, although each exists within a separate and unique legal envelope, their path to inter- and intrajurisdictional interoperability in adopting and using information systems rested in understanding that all agencies and all programs share the majority of their business processes and can therefore benefit from stressing commonalities rather than minor differences. This insight, coupled with a collaborative approach to defining the end technical solution, empowered the IIS enterprise, as it did the public health laboratory enterprise.
Then, the terrorist attacks of September 11, 2001, catalyzed the efforts of the public health enterprise to improve its capacity for rapid communications and information sharing. In October 1998, the CDC had initiated the Health Alert Network initiative,11 which built on the conceptual and operational successes of the earlier INPHO program to further enhance state and local public health agency informatics capacity. On September 11, 2001, the first Health Alert message was issued using the Health Alert Network infrastructure that had been developed over the previous months. As additional resources became available in subsequent months, the Health Alert Network program fulfilled its 3 goals: (1) ensure robust electronic communications at all full-function local health jurisdictions, (2) ensure capacity at every local health jurisdiction to receive distance learning offerings from the CDC, and (3) ensure a comprehensive capacity across all levels of the public health system to receive and broadcast urgent health alerts.11 Other innovations included the Connections community of practice, aimed at helping states integrate child health systems by learning from one another about which approaches to cross-program data integration worked for different agencies. This initiative was the first public health effort to directly involve medical practitioners and health care finance entities.12

In 2002/2003, the Department of Health & Human Services (HHS) created the Office of the National Coordinator for Health IT (ONC). At its inaugural 2003 conference, delegates from health care organizations, academic informatics, EHR companies, health insurers and payers, and public health informatics joined to declare a framework for advancing the adoption of EHRs and regional health information exchanges. This framework called for the creation of a certifying body that could measure each EHR product against a set of essential functional capabilities and, where applicable, standards. Thus, the Certification Commission for Health Information Technology (CCHIT) was born to provide objective, expert determinations of each product that sought CCHIT certification. CCHIT certification quickly became a necessary label of excellence demanded by prospective EHR clients. The CCHIT included public health representatives who could provide the public health perspective on relevant aspects of EHRs.

The 2009 American Recovery and Reinvestment Act (ARRA) established the Health Information Technology for Economic and Clinical Health (HITECH) Act. Through this legislation, the Centers for Medicare & Medicaid Services (CMS) provided incentive payments to health care providers to become "meaningful users" of EHRs. Public health needed to benefit from this massive investment, but to do so it needed to present a national "one public health" vision at the HHS implementation hearings because the HITECH Act did not specifically call out public health. To ensure that public health was included, the major public health associations joined together under the auspices of the Joint Public Health Informatics Taskforce to craft a unified argument and presentation to the CMS. Their successful work led to HITECH including funding for IIS, electronic laboratory reporting, and syndromic surveillance.
Absent public health unifying its message and request under the Joint Public Health Informatics Taskforce, it is likely that public health would have been ignored completely in deference to the dramatic need to automate health care to drive improved outcomes at lower cost.

The 2010s—Public Health Preparedness: Linking Health Care and Public Health

Between 2010 and 2019, public health was challenged with marshaling an effective response to 2 major outbreaks: the H1N1 influenza pandemic13 and the 2014 Ebola virus outbreak in West Africa that led to cases in the United States.14 The response to these outbreaks demonstrated the continuing deficits of public health surveillance systems. While US health care was rapidly adopting EHRs, public health lagged because of staffing and budget cuts stemming from the Great Recession of 2008-2010. In 2009, the novel influenza A (H1N1) virus was first detected in the United States and spread quickly across the country and the world. Throughout 2010, the United States mounted a complex, multifaceted, and long-term response to the pandemic. Among these efforts was the need for the CDC, its supporting partner organizations, and all public health agencies to rapidly define and implement emergency department data reporting to local, state, and federal authorities. These outbreaks demonstrated to all that public health's surveillance capabilities and related informatics infrastructure needed much more attention.15 The Ebola outbreak in particular demonstrated the fragility of handling outbreaks of national consequence in the traditional, manual data-reporting manner.

Furthermore, the public health workforce was ill-equipped to design, develop, and manage the new information systems needed to support outbreak response. To address this need, the Public Health Informatics Institute, with support from the de Beaumont Foundation, created the Public Health Informatics Academy to bring practitioner-relevant, low-cost informatics education to frontline public health agencies. The Academy demonstrated that reasonably priced, high-value informatics education will attract an audience. By 2015, informatics leaders in clinical medicine and public health had joined forces with the Robert Wood Johnson Foundation to launch the Digital Bridge, an initiative that forged collaborative governance over data exchange between health care provider organizations and public health. Digital Bridge demonstrated the truth that "data moves at the speed of trust."16 By the close of the decade, most public health leaders at the local, state, and federal levels understood that initiatives such as the Digital Bridge, and future innovations like it, would be essential to their ability to acquire needed data.

The 2020s—the COVID Pandemic: Unprecedented Opportunity for Informatics Innovation

The SARS-CoV-2 virus indelibly stamped its name on 2020. The COVID-19 pandemic demonstrated the massive challenge of providing accurate, jurisdiction-specific data on new infections, hospitalizations, and deaths in a form that local, state, and federal health authorities and policy makers could use to make difficult judgments. The federated public health system struggled, in part, because of the lack of federal requirements for sharing data with the CDC. Data-sharing agreements speak directly to the sociology of informatics, a topic that has been left to crises to resolve. Significant unfinished business remains on that data-sharing front.
As we have moved into 2022, the Biden Administration has secured infrastructure funding that includes a new initiative aimed at data modernization (DMI). The DMI will require national agreement on a framework for surveillance and, with it, eventually a collaboratively defined, endorsed, and implemented set of technology standards that will finally position public health as a 21st-century data enterprise.17 The HHS DMI speaks mostly, perhaps exclusively, to public health surveillance systems, which ultimately tie to broader health information systems. Now the public health community must proceed to create a framework that unites the states under a common understanding of not only the who, what, when, and why of surveillance but, most importantly, the how. The "how" will consist of the essential systems architecture (ie, data and transmission standards, transfer protocols, data management protocols, etc).

Lessons Learned From the Past

From these experiences, we offer a few guiding principles to foster a mindset for future innovation. Public health leaders need to know that investments in informatics, whether in personnel, tools (systems, apps, etc), or innovation, should be logically associated with end health outcomes and health impact. Furthermore, having a complete and comprehensive map of the work of public health (ie, business process workflows) can lead to improved efficiency and effectiveness while also making clear how much improvement in current workstreams should result from an informatics investment.10 These are, therefore, investments for which leaders should demand a clear return on investment.

A Look Into the Future

Recently, public health practice has moved from a mindset of rigid categoricalism to necessary systematization. A history of rigid categorical funding established highly functional single-purpose programs (aka silos) and prevented more logical cross-functional system development. As a result, the public health system now faces a new decade of investment in public health data and information infrastructure, beginning with a laser focus on essential surveillance systems.1 Leaders should make the most of it by being a constructive part of the national partners coalition formed by the CDC and by helping their health agencies work toward forging, accepting, and adopting standards-based systems and cloud-based solutions that their IT departments do not own and control. Public health leaders should embrace and lobby on behalf of cloud-based solutions that will be vastly more powerful yet less costly. The future of public health rests in its use of data, not in its management of hardware technologies. That future will likely include a shared data analytics capability wherein a jurisdiction's data can be held confidential, subject to its control, while the jurisdiction also benefits from the power that major cloud computing platforms bring. Clearly, for many states this may require legislative rethinking about how a state's data assets are managed. To make the DMI investment pay off, states will need to adopt many more common methods and standards. To be very clear, the most important of all investments must be those in the workforce. The public health workforce of the 2020 to 2050 time frame must be capable of delivering near real-time, highly granular, and rapidly trending facts and analysis to the public across the range of environmental, weather-related, infectious, and noninfectious threats to health.
Failure to do this well will relegate public health to the side bench and allow information providers, whose interests may not lie in assuring improved health for all people, to have a controlling voice in shaping health policy.
