Few statements in the IT era have provoked as much opposition. If the expansion of electronic health records (EHRs) under the 2009 Recovery Act's Health Information Technology for Economic and Clinical Health (HITECH) provisions turns McNealy's position from a cynical extreme into a systemwide norm, patient privacy advocates predict a backlash that will entangle IT's potential clinical benefits in a thicket of perverse incentives.

On July 13, the Department of Health and Human Services (DHHS) Centers for Medicare & Medicaid Services announced regulations defining the "meaningful use" required for HITECH incentive payments.2 The conditions include specific core objectives, along with a menu of additional objectives from which providers must implement at least 5 during the first 2 years. Privacy protection and timely patient access to records are included in the core set, as outlined by National Coordinator for Health IT David Blumenthal, MD, MPP.3 The Office of the National Coordinator works with Centers for Medicare & Medicaid Services to develop standards and procedures for implementation and certification.

With both IT and federal regulations, however, the devil is inevitably in the details of implementation.
The core measure for the patient access objective, for example ("More than 50% of requesting patients receive electronic copy within 3 days"), is relatively undemanding and gives no guarantee to an individual patient. And a line buried in the comments section of the meaningful-use statement left privacy advocates4 with the impression that Centers for Medicare & Medicaid Services was missing an opportunity to go beyond the Health Insurance Portability and Accountability Act (HIPAA) and modernize the privacy rules: "We do not see meaningful use as an appropriate regulatory tool to impose different, additional, and/or inconsistent privacy and security policy requirements from those policies already required by HIPAA."5

The Murky Tides of Data and Cash

Among the acronymic terms permeating the IT world, "garbage in, garbage out" (GIGO) expresses the principle that a processing system requires purposefully structured input data, subjected to scrupulous quality control, to produce useful results. A more recent variant of GIGO, "garbage in, gospel out," expresses the tendency of technophiles in any position to place inordinate trust in computer-generated output, sometimes for no better reason than its orderly, authoritative appearance.
A further permutation foreseeable in the EHR era might invert that formula as "gospel in, garbage out": information that benefits patients and clinicians while it is well organized and contained may become a kind of toxin, harming a patient irreversibly if it is contaminated, scrambled, or directed into the wrong hands.

All 3 iterations of GIGO threaten to derail the EHR bandwagon. For health IT to fulfill its potential as an aid to clinical practice, patients and physicians need an information flow they can trust: from initial input to all potential outputs, data must be accurate and must end up in the right places. In an EHR-intensive atmosphere, emergency physicians, who may treat a patient only once, will be unusually dependent on the quality of data entered into records by others. Although the time dedicated to record creation as a distinct operation (see parts I and II of this series) may be minimal in emergency department settings, emergency physicians logically have an incentive to be strong advocates for data integrity safeguards.

The most appropriate party to control the quality and security of protected health information (PHI), many commentators believe, is the individual patient. If, as another familiar cyberspace slogan has it, "information wants to be free"—ie, tends to get loose—the person most directly affected by its movements has the strongest incentive to oversee it. A few patients have exerted extraordinary effort to manage, protect, and upgrade their own records. "To get good-quality data into a system, you need first of all good rigorous controls on how the data gets in there," says Dave deBronkart, Jr, whose struggles with both kidney cancer and an error-riddled EHR have made him a nationally recognized spokesman for patients' rights and responsibilities.
"Those controls absolutely do not exist in health care."

Patients such as deBronkart have drawn attention to the problem, but to date they are exceptions, swimming against a powerful tide of institutional incentives. The traffic in personal data for targeted marketing, not limited to the medical realm,6 is lucrative; one estimate predicts that the clinical component of the data-mining industry will reach $5 billion by 2020.7 The appearance of EHR vendor contracts specifying ownership, exclusive access, and sales rights over medical data8 implies that some organizations view this information as a profit generator and relegate privacy concerns, with McNealyesque fatalism, to the dustbin of history. One irony in that scenario is that advanced technologies and practices capable of protecting PHI—not with absolute certainty, but well enough to establish a trustworthy balance in which EHRs' benefits decisively outweigh their risks—already exist.

Paper Records Aren't Airtight Either

Security breaches involving PHI, of course, acquired a high profile long before health IT raised the stakes. The 1972 presidential campaign is a case in point: after a leak of Senator Thomas Eagleton's mental health records, amid the strong stigma attached to psychiatric treatment at the time, Democratic candidate Senator George McGovern dropped him as a running mate after 18 days.
Ginger McCall, staff counsel with the Electronic Privacy Information Center in Washington, DC, points to the recurrent problem of corruptible hospital employees selling celebrities' medical records to tabloids, the Farrah Fawcett case9 being a well-reported recent example. Such temptations are independent of physical format.

Personal privacy in general, one should also note, is a chronically contested legal area. Privacy protection laws covering library information (generally state statutes) have been on the books since the McCarthy era, McCall observes, when Communist-hunting prosecutors used book withdrawal records to imply that certain citizens were ideologically suspect. Some electronic-age privacy laws extend that principle to the federal level, but in unusual, perhaps ungeneralizable factual contexts. "The Video Privacy Protection Act10 was actually passed in reaction to the disclosure of Supreme Court nominees' video rental records," McCall says. (The specific nominee in question, Robert Bork, had claimed11 that the Constitution gives no privacy guarantees except for specific rights conferred by legislation, rejecting the interpretation of "penumbral" privacy support that Justice William O. Douglas found throughout multiple Bill of Rights amendments in Griswold v Connecticut.12 Publishing Bork's video records in the Washington City Paper in 1987,13 reporter Michael Dolan claims, was a semischolarly prank turning Bork's own opinion of privacy against him.)

Rather than an either/or, pure-or-porous model, it may be more realistic to view PHI privacy as a matter of degrees, contingencies, and tradeoffs. The same patient who prefers to conceal references to past treatment of a sexually transmitted disease during routine office visits may also expect a first responder or emergency physician to have access to a full medication history in a time-sensitive situation in which drug reactions or interactions could amplify hazards. "It's a matter of risk," says Robert M. Kolodner, MD, Dr. Blumenthal's predecessor as National Coordinator. "I need to understand what the benefit is to me of sharing the information; what the risk is [to] my health if I don't; and … that there is a nonzero risk of it being discovered by somebody, if I'm really worried about that. Many people aren't worried about that, and it comes down to personal choice at this point in time."

A psychiatrist in the Veterans Administration system since 1978 and a participant in the development of both the Veterans Health Information Systems and Technology Architecture and the online personal health record (PHR) system known as My HealtheVet, Dr. Kolodner is now working with the nonprofit group Open Health Tools to develop open-source, cross-platform interfaces and infrastructural elements that different IT vendors can use to "share those costs, those pain points" and streamline operations. Fine-tuning access permissions is a critical challenge; too many current EHR systems, he finds, have "an all-or-nothing kind of release that [makes it] hard to slice and dice certain content out of records."

Because financial information has been transmitted electronically for years, some EHR advocates see that field as a case in which technology eventually earned popular trust.
"We depend on electronic devices for our entire defense establishment, for almost everything you do, our utilities grids, our whole banking system," notes David Mechanic, PhD, of the Institute for Health, Health Care Policy, and Aging Research at Rutgers University. "The notion somehow that we can't provide reasonable protections for EHRs, I think, is one I just don't buy into."

"Each of us made our decision when we were ready to have our information flow over the Internet," says Dr. Kolodner. "Financial information, credit card, or others. And there are still some people who don't put any credit cards on it, and there's some other people who really have no particular concern; they figure they can contain the cost. In health care, it's going to be similar, because there's a cost to not having your information available." He looks to EHR advances as ways to put more choice and control in patients' hands: "The question is, how do we help the industry to mature to the next stage, where it's really a user-driven industry and not a vendor-driven industry?"

Dr. Kolodner notes that paper records in certain respects are less secure than EHRs, at least against unauthorized access to a single record. "I'm not saying it's still this way, but in most hospitals if you put on a white coat, and you act authoritative enough, you can often just walk into a nursing station, or into a record room, and get a record. The other is that you can pay somebody on the staff to get that record, and there's no trace as to how that got released." The auditing functions of EHRs can track who has obtained access to a record (or tried to), decreasing the likelihood that insiders will agree to participate in such a scheme.

Auditability is one of several strategies that EHR developers are using to improve privacy performance.
Metadata tagging allows sorting of information into more granular categories; combined with role-based access and content retrieval filtering, this segmentation can let patients designate which providers may view sensitive PHI (genetic, psychiatric, or gynecologic history; information related to sexually transmitted diseases or substance abuse; or conditions the patient believes may affect employment or insurability). For undesignated providers, this information is suppressed—what the psychiatrist and the emergency physician may know, the podiatrist need not—but categories of broad clinical importance, such as allergies, remain accessible. Anonymization strips personal identifiers out of records for research or biosurveillance applications. In emergency care, a "break-the-glass" function creates an audit trail specifying the reasons and conditions, as well as the personnel involved, notifying institutional privacy officers about each override request.

This past summer, at meetings of the DHHS Privacy and Security Tiger Team,14 a work group organized by the Office of the National Coordinator and representing industry, hospitals, academia, patient organizations, and others concerned with the privacy implications of system design, the technologic feasibility of privacy protection was on public display in fine-grained detail. Firms such as e-MDs, Private Access, Health Information Protection and Associated Technologies, and Tolven Healthcare Innovations demonstrated consent-management systems designed to interface with health information exchanges in the National Health Information Network, under development with Office of the National Coordinator guidance since 2008.
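The segmentation pattern described above (sensitive categories suppressed for undesignated providers, broadly important categories always visible, and a break-the-glass override that leaves an audit trail) can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual API; the category sets, role names, and `visible_records` function are assumptions invented for the sketch.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Sensitive categories a patient may restrict to designated providers only.
SENSITIVE = {"psychiatric", "genetic", "substance_abuse", "std"}
# Categories of broad clinical importance that stay visible to any treating provider.
ALWAYS_VISIBLE = {"allergies", "medications"}

@dataclass
class Record:
    category: str
    content: str

@dataclass
class AuditEvent:
    user: str
    category: str
    reason: str
    timestamp: datetime = field(default_factory=datetime.now)

# Overrides land here for review by the institutional privacy officer.
audit_log: list[AuditEvent] = []

def visible_records(records, user, allowed_users, break_glass_reason=None):
    """Return the records `user` may see under the patient's consent directive.

    `allowed_users` maps each sensitive category to the set of providers the
    patient has designated. Supplying `break_glass_reason` overrides the
    suppression but appends an AuditEvent for each sensitive record exposed.
    """
    result = []
    for r in records:
        if r.category in ALWAYS_VISIBLE or r.category not in SENSITIVE:
            result.append(r)
        elif user in allowed_users.get(r.category, set()):
            result.append(r)
        elif break_glass_reason:
            audit_log.append(AuditEvent(user, r.category, break_glass_reason))
            result.append(r)
        # otherwise the record is suppressed for this user
    return result
```

In this toy model, a podiatrist not designated for psychiatric content sees only the allergy and podiatry entries, while an emergency physician invoking break-the-glass sees everything, with each override logged for later review.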
In some of these systems, rules and metadata for authentication and access control attach to EHRs and PHRs much as digital rights management code piggybacks on audio and video files. Pilot projects in the Netherlands and Singapore are troubleshooting such systems before testing occurs on larger scales. Speakers recurrently noted that demand for these PHI-protective advances is not yet driving high-volume adoption, but realistic technologies exist to serve the US market once it matures.

Recourse Once the Horses Escape the Barn

A layperson observing the Tiger Team sessions could plausibly infer that PHI protection is a readily soluble problem, like the high-level encryption that (usually) guards financial data. Not all types of information, however, behave identically within networked environments. Drawing inferences from other categories to PHI is difficult because PHI breaches can involve unique issues of scalability, irreversibility, and personal consequences that are immeasurable in a literal sense. "If your record with your credit card number gets out," Electronic Privacy Information Center's McCall says, "you can get another credit card and cancel that one, but if a record with a list of your medications gets out, that's pretty personal information, and you can't get it back."

Calculating financial damages from lost privacy is difficult enough, she notes, that the Video Privacy Protection Act establishes a statutory damage scheme, an approach she believes should also extend to sanctions for unauthorized PHI disclosure. Writing statutory damages into protective laws "sets up a scheme for people to enforce their own rights," she adds. "One of the things to think about with medical privacy and genetic privacy … is that whenever you have a federal law, it should be a floor and not a ceiling.
The federal government, if it creates a law to protect consumers, should also allow the states to create even higher protections for consumers."

Settlements in large class action suits,15 McCall comments, are an inadequate remedy for privacy breaches. One recent case involving Facebook and Blockbuster Video, she says, is "a perfect example of why these rights sometimes can't work out, because there you have a couple of plaintiffs' attorneys who have proposed this settlement … where Facebook essentially sets up a $9 million foundation that Facebook helps to run; the attorneys get a lot of money, the named plaintiffs get a little bit of money, and everyone else in the class gets nothing."

Some information scientists also contend that anonymization cannot be made foolproof. Working with nonmedical data sets from social networks16 and film ratings from the Netflix Prize competition for improving recommendation algorithms,17 Arvind Narayanan and Vitaly Shmatikov at the University of Texas have developed systems capable of reidentifying individuals by matching patterns against publicly posted information, including personal identifiers.
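The Texas results rest on linkage: rows in an "anonymized" release can be matched to public profiles through combinations of quasi-identifiers that are individually harmless but jointly near-unique. The toy sketch below (entirely hypothetical data, and far simpler than the Narayanan-Shmatikov algorithms, which tolerate noisy and partial matches statistically) illustrates the principle.

```python
# Toy illustration of linkage re-identification: an "anonymized" release keeps
# quasi-identifiers (ZIP code, birth year, sex) that, in combination, are
# nearly unique. All data here are invented for the example.

anonymized_release = [
    {"zip": "03060", "birth_year": 1950, "sex": "M", "diagnosis": "renal cell carcinoma"},
    {"zip": "03060", "birth_year": 1982, "sex": "F", "diagnosis": "asthma"},
]

# Publicly posted information (e.g., a voter roll or social-network profile).
public_profiles = [
    {"name": "D. Example", "zip": "03060", "birth_year": 1950, "sex": "M"},
]

def reidentify(release, profiles, keys=("zip", "birth_year", "sex")):
    """Link each public profile to release rows matching on every quasi-identifier."""
    matches = []
    for p in profiles:
        hits = [r for r in release if all(r[k] == p[k] for k in keys)]
        if len(hits) == 1:  # a unique match defeats the anonymization
            matches.append((p["name"], hits[0]["diagnosis"]))
    return matches

# reidentify(anonymized_release, public_profiles)
# → [('D. Example', 'renal cell carcinoma')]
```

The defense is to ensure no combination of released attributes is unique, which is exactly what becomes hard as data sets grow sparse and high-dimensional.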
According to these authors, such information, though relatively innocuous or trivial in itself, can expose users of the systems to "targeted de-anonymization" (stalking), abusive marketing, phishing, spamming, and surveillance. A fortiori, the implications for the security of PHI, which is distinctly more valuable than the data used in the Texas studies, are alarming.

Deborah C. Peel, MD, a psychiatrist who founded the national organization Patient Privacy Rights in 2004, looks askance at current federal PHI protection measures.18 Dr. Peel is no Luddite; she strongly favors smart EHR and PHR systems that include sophisticated granularity and enable patients to customize consent directives. "This is a moment where doctors can really be their patients' advocates," she says, "because a system where the patient is in control of the data really is a system where the patient can trust the doctor. If not, the doctor is the agent of giant corporations and data thieves and the government, and people are not going to want to see you if you have these bad products."

Recent history, she says, is not encouraging. "The right of consent was eliminated from the [HIPAA] privacy rule in 2002. They finally now say this publicly, but it's not on the DHHS Web site," she notes. "The Web site's very misleading … . We have pretty much been singlehandedly the ones that have been telling everybody, the privacy rule isn't a privacy rule, it's actually a disclosure rule. It's a data miner's dream, because it lets all of the covered entities decide when to use your information for treatment, payment, or health care operations—not you! And you can't refuse, and you can't object. You can beg them; you have the right to beg the covered entities to not disclose your data. But guess what?
They can say 'No, we're going to do it anyway.'" (Former president George W. Bush, Dr. Peel says, did initially implement the HIPAA rule requiring individual consent for disclosure in 2001;19 when DHHS reversed this provision in 2002,20 she recalls, "we never were able to figure out if Bush knew or he didn't know what was going on down below.") Dr. Peel's group also points out21 that since the Gramm-Leach-Bliley Financial Services Modernization Act22 of 1999 broke down the Glass-Steagall firewalls separating the financial, securities, and insurance sectors, the channels through which unprotected and personally destructive information can now silently travel are unprecedentedly broad.

At the June 29 Tiger Team session on protective technologies, where Dr. Peel was an invited panelist, moderators used a World Cup–style yellow card warning system to confine discussion to technical details rather than policy. Speaking afterward—having snuck in a late half-minute of comment on who should have decisive authority over privacy controls, drawing a yellow card—Dr. Peel bridled at this limitation of debate and noted that the panel was stacked with industry representatives.

In many cases, she says, physician-driven EHR products that allocate decisions to patients are less costly than unreliable, ungranular, billing-oriented systems.
She finds that most discussion "has been co-opted by those who want records to be totally open and accessible to all, not so that doctors can treat people, but so that these stakeholders can get the records and use them against people to discriminate and all the rest. Most people would freely be open with their physicians if the information didn't leak out and prevent them from getting jobs, promotions, and coverage."

Emergency scenarios, she also states, sometimes appear in anticonsent arguments, with little reference to important contextual facts such as how often patients actually present in uncommunicative condition. "The data mining industry, including the insurers, all used that issue—that 'What if you're unconscious in Alaska, how are you going to get your data?'—to say that's why your data should be open to all doctors all the time. And it's just a damn lie … . I know that as a [resident] physician in the ER, we would violate the privacy of the unconscious person every time to save their lives … it's a matter of ethics; the highest ethic is, save a life. Privacy goes out the window. So I don't believe there's a conflict there at all. The problem is, that issue, 'Emergency doctors need to know everything if you're unconscious,' has been used to justify the fact that they never built these systems in accord with our rights to control our information at all. It's a great flag to wave to make it seem like you're going to put yourself in danger unless you let everybody and their dog see your medical records at all times."

Dr. Peel bases her watchdog activities on her observations, in 35 years of psychiatric practice, of patients' dread of PHI disclosure. "People used to say, 'Oh, Deborah, you're a fear monger; all this stuff is theoretical.'
Well, I found, and use in my presentations now, slides from DHHS findings in 2000 that about 60,000 people a year refuse to get early diagnosis and treatment for cancer, because they know that [the information] won't stay private. And about 2 million a year, the same thing for mental illness treatment. And then I also use a slide from a RAND Corporation study that found there's 150,000 Iraq vets with PTSD, and because there's no privacy [in] mental health treatment for the military, at least active duty, these soldiers, our current soldiers, don't get treatment, and so as a consequence we have the highest rate of suicide among active-duty military personnel in 30 years. So I'm able to say that, look, this isn't theoretical. These are real problems. People refuse to get treatment when they think it's going to harm them. You shouldn't have to choose between a job and health care, but we do today."

If poor IT choices breed distrust, she cautions, the backlash will hit physicians, not vendors. "People are not thinking about how the technology can be used to improve and strengthen the doctor-patient relationship, and I believe that people are going to get very, very angry at their physicians when they realize that so many of the EHRs sell data … . I mean, how many people are going to be happy when they find out their doctor sold all this sensitive information that could cause generations of discrimination without their knowledge? So I think [with] a lot of the defects of the health IT products that are out there today, the public's never going to go to Kansas City and knock on Neal Patterson's door, the CEO of Cerner, and say 'What did you do to my life, you jerk?' They're going to go to Dr.
Peel, or to you, and say, 'You destroyed my life!'"

Glaring GIGO Even at the Leading Edge

"E-Patient Dave" deBronkart, a technology marketing executive in Nashua, NH, who was unexpectedly diagnosed with stage IV, grade 4 renal cell carcinoma in January 2007, found out the hard way that EHRs can be contaminated with surprising amounts and forms of garbage. As a longtime early technology adopter and Internet enthusiast, deBronkart responded to his diagnosis by seeking as much information as he could find, including his own EHRs at Boston's Beth Israel Deaconess Medical Center. He also became active in the online community e-patients.net and the "participatory medicine" movement.

The good news in deBronkart's case is the clinical course: given a median survival estimate of 24 weeks at diagnosis, he underwent successful laparoscopic excision, then investigative treatment with high-dosage interleukin-2 in a clinical trial. His last treatment took place in July 2007, and his remaining lesions have continued to shrink. He and his physicians have declared victory.

In the course of reinventing his professional life as a prominent e-patient, however, he also made alarming discoveries. Transferring his records from Beth Israel Deaconess to Google Health, a voluntary PHR system, he found a proliferation of howling errors: miscoding, upcoding, misdated reports and alarms, documents not dated at all, and his misidentification as "a 53-year-old woman" on a 2003 lung radiograph (his age, at least, was accurate). Medication for chemotherapy-induced emesis resulted in a record giving deBronkart an "anxiety disorder." An entry for volvulus, entirely fallacious, remains unexplained. A "history of aortic aneurysm" was apparently upcoded from a minor and transient observation. As he recalls, "it turns out, in one of my scans when I was sick, it reported a 1/4-inch enlargement in the base of my aorta.
Now, to any thinking physician, that's no big deal, especially since it wasn't present in the next scan. But to a billing clerk, who is encouraged to submit the highest-priced thing they can legitimately charge for, bingo! Enlarged aorta: that qualifies as an aortic aneurysm."

Another dramatic error was a finding of brain and spine metastases. He didn't have them, but his physicians had performed magnetic resonance imaging to rule them out. "There's no way in the billing data to say 'and it came back negative,'" he discovered; for insurance purposes, what matters is procedures performed, not results. Sins of omission appeared as well, serious ones: there was no medication history at all (not even the interleukin-2), and the section on drug sensitivities included no warning about steroids, which would interfere with his lifesaving immune treatment and are thus permanently contraindicated for him.

His blog entry about the experience23 led to a Boston Globe feature24 and multiple invitations to address physicians, technologists, and others about the problem. At the heart of these errors, he explains, lies the structural and conceptual mismatch between International Classification of Diseases, Ninth Revision billing codes, with their built-in incentives and lack of nuance, and subtler, more precise diagnostic information. "I had no idea this issue existed until I tried to move my data over," he says, "and then I got schooled pretty quickly by some friends. Billing data is categorically wrong for use as a proxy."
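deBronkart's point that "there's no way in the billing data to say 'and it came back negative'" reflects a structural fact: a claim line records a procedure and the diagnosis code that justified ordering it, with no field for the result. A naive import that treats billed diagnosis codes as clinical history therefore converts every rule-out study into an apparent disease. The field names below are hypothetical, chosen only to illustrate the mismatch; the ICD-9 code shown is the real code for secondary malignant neoplasm of the brain and spinal cord.

```python
# Why billing data fails as a clinical proxy: a claim line carries the
# procedure and the diagnosis code that justified it, but no result field.
# Field names are hypothetical, for illustration only.

claim = {
    "procedure": "MRI brain with contrast",
    "diagnosis_code": "198.3",  # ICD-9: secondary malignant neoplasm, brain/spinal cord
    # note: the claim has no "result" field at all
}

def import_as_history(claims):
    """A naive PHR import that treats every billed diagnosis as a condition."""
    return [c["diagnosis_code"] for c in claims]

history = import_as_history([claim])
# The rule-out scan, which came back negative, now reads as brain metastases.
```

A safer import would require a source field distinguishing confirmed diagnoses from codes used to justify orders, which is precisely the nuance claims data cannot supply.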

References

2. Centers for Medicare & Medicaid Services, Office of Public Affairs. Secretary Sebelius announces final rules to support "meaningful use" of electronic health records. Press release, July 13, 2010. https://www.cms.gov/apps/media/press/release.asp?Counter=3786
3. Blumenthal D, Tavenner M. The "meaningful use" regulation for electronic health records. N Engl J Med. 2010;363:501-504. http://healthcarereform.nejm.org/?p=3732
4. McGraw D. HHS releases rules for electronic health records. Center for Democracy and Technology, July 14, 2010. http://www.cdt.org/blogs/deven-mcgraw/hhs-releases-rules-electronic-health-records
5. Department of Health and Human Services, Centers for Medicare & Medicaid Services. 42 CFR Parts 412, 413, 422, and 495. Medicare and Medicaid programs; electronic health record incentive program. Fed Reg. 2010;75:44369. http://edocket.access.gpo.gov/2010/pdf/2010-17207.pdf
6. Angwin J. The Web's new gold mine: your secrets. Wall Street Journal. July 30, 2010.
7. Singer N. When 2+2 equals a privacy question. New York Times. October 17, 2009:BU4. http://www.nytimes.com/2009/10/18/business/18stream.html?_r=1
8. Zetter K. Medical records: stored in the cloud, sold on the open market. Wired. October 19, 2009. http://www.wired.com/threatlevel/2009/10/medicalrecords/
9. Ornstein C. Fawcett's cancer file breached. Los Angeles Times. April 3, 2008:B1. http://articles.latimes.com/2008/apr/03/local/me-farrah3
10. Wrongful disclosure of video tape rental or sale records. The Video Privacy Protection Act of 1988, 18 US Code § 2710 (2002).
11. Bork R. The Tempting of America: The Political Seduction of the Law. New York, NY: Simon and Schuster; 1990.
12. Griswold v Connecticut, 381 US 479 (1965).
13. Dolan M. The Bork tapes saga. http://www.theamericanporch.com/bork2.htm
14. Privacy and Security Tiger Team: past meetings. Materials archived by HHS. http://healthit.hhs.gov/portal/server.pt?open=512&mode=2&objID=2833&PageID=19477
15. Eg, Lane et al v Facebook, Inc et al, US District Court for the Northern District of California, San Jose Division, over privacy breaches by the Facebook Beacon feature. McCall's organization, the Electronic Privacy Information Center, is among the organizations filing complaints about both the now-defunct feature and the settlement.
16. Narayanan A, Shmatikov V. De-anonymizing social networks. In: 30th IEEE Symposium on Security and Privacy; 2009:173-187. http://www.cs.utexas.edu/~shmat/shmat_oak09.pdf
17. Narayanan A, Shmatikov V. Robust de-anonymization of large sparse datasets. In: Proceedings of the 2008 IEEE Symposium on Security and Privacy; 2008:111-125. http://userweb.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf
18. Peel DC. Your medical records aren't secure. Wall Street Journal. March 23, 2010. http://online.wsj.com/article/SB10001424052748703580904575132111888664060.html
19. Department of Health and Human Services. Standards for privacy of individually identifiable health information. 65 Fed Reg 82462 (2001).
20. Department of Health and Human Services. Final amendments to Federal Privacy Rule. 67 Fed Reg 53182 (2002).
21. Patient Privacy Rights. Zones of privacy (graphic). http://patientprivacyrights.org/media/Zones_of_Privacy.pdf
22. The Gramm-Leach-Bliley Act (Financial Services Modernization Act of 1999), Pub L 106-102, 113 Stat 1338 (1999).
23. deBronkart D. Imagine someone had been managing your data, and then you looked. The New Life of e-Patient Dave (blog). April 4, 2009. http://patientdave.blogspot.com/2009/04/imagine-someone-had-been-managing-your.html
24. Wangsness L. Electronic health records raise doubt. Boston Globe. April 13, 2009. http://www.boston.com/news/nation/washington/articles/2009/04/13/electronic_health_records_raise_doubt/?page=full