The system in the United States for protecting human participants in research engages the earnest efforts of thousands of scientists, community volunteers, and administrators. Through untold hours of service on Institutional Review Boards (IRBs), they watch over the safety of human research subjects. Unfortunately, much of that effort is increasingly misdirected as the system succumbs to “mission creep” that could compromise its central goals. Our IRB system is endangered by excessive paperwork and expanding obligations to oversee work that poses little risk to subjects. The result is that we have simultaneous overregulation and underprotection.

IRBs were established after the 1979 Belmont Report from the Department of Health, Education, and Welfare, with the goal of protecting human subjects involved in potentially risky medical and behavioral research. But IRBs' burdens have grown to include studies involving interviews, journalism, secondary use of public-use data, and similar activities that others conduct regularly without oversight. Most of these activities involve minimal risks, surely less than those faced during a standard physical or psychological examination, the metric for everyday risk in the federal regulations. And IRBs are pressured to review an expanding range of issues, from research design and conflicts of interest to patient privacy. These are beyond the scope of research protection and are best left to others.

The IRB system is being overwhelmed by a focus on procedures and documentation at the expense of thoughtful consideration of the difficult ethical questions surrounding the welfare of human subjects, especially as complex clinical trials burgeon. IRBs' work is afflicted by unclear definitions of terms such as “risk,” “harm,” and “research.” Because ethical behavior is difficult to measure, many IRBs favor stylized documentation over substantive review, out of concern that one case in a thousand could slip through and generate bad publicity or penalties, or potentially shut down research. The result is that many protocols receive exaggerated review, and the paper piles up. Society loses as potentially productive research is discouraged or self-censored.

Ironically, this obsession with paperwork and mechanical monitoring may undermine protection of human subjects. IRB members spend too much time editing documents, marking typos, and asking for more details. One researcher, 10 years into a longitudinal study, was asked by an IRB to remove the term “anemia” from consent forms because participants might not understand it. Such actions, about which we hear frequently, carry a serious risk: They reduce trust in the guidance of IRBs and may alienate some researchers enough to turn them into scofflaws.

Oversight of the IRB process by federal agencies reinforces these tendencies. “Poor or missing ‘Standard Operating Procedures'” and “poor minute-keeping” account for about half of all U.S. Food and Drug Administration citations, and quorum failures account for another 13%, according to one review. In seeking compliance, universities have multiplied the number of IRBs, depleting the supply of willing and competent faculty. All this has generated a trend in which researchers increasingly think of IRBs as the “ethics police.” In fact, all researchers must take primary responsibility for professional, ethical conduct.
Our systems should reinforce that, not work against or substitute for it; the IRB should be a resource for, not the source of, ethical wisdom. All compliance systems require the buy-in and collaboration of the regulated, and it will be a sad day if scholars come to see human protection in research as the source of frustrating delays and expensive paperwork.

What can be done? Our University of Illinois white paper,[*][2] based on 2 years of study after an interdisciplinary conference of researchers and IRB leaders, addresses the problems of mission creep and offers possible solutions. Our recommendations include exempting from IRB oversight some activities that have ethical standards of their own, distinct from the biomedical tradition. We also support gathering information in a national clearinghouse that supports IRBs and researchers alike. This would provide examples of good and poor practices rooted in disciplinary standards, and it would help IRBs make priority determinations about what constitutes risk and harm in different human research settings.

The IRB system is in trouble, and that means trouble for the safety and efficacy of research on human subjects. We should refocus our efforts on the core issues and stop expanding the mission into less productive territory.

[2]: #fn-1