Abstract

The German National Academy of Sciences Leopoldina assessed the state of public health research in Germany (Leopoldina 2015). The widely welcomed report identified the need to strengthen structures, education, and research in public health (Anonymous 2015). The weaknesses identified for Germany can be generalized to other countries, including neighbouring Switzerland and Austria. Strengthening the infrastructure and funding schemes for public health research is necessary wherever the public health sciences have not received the level of attention they get elsewhere (e.g., in the US or UK). Given the importance of the Leopoldina report, IJPH will soon publish invited commentaries. This editorial looks, however, at the bibliometric study that Leopoldina commissioned as an input to the main assessment (Donner et al. 2014). That bibliometric study is misleading to a degree that warrants public debate.

The bibliometry was used to assess the scientific output (2000–2012) of public health research institutions in Germany. Although public health research draws on a broad range of methods, epidemiology is very often at the heart of the public health sciences; accordingly, the bibliometric study covered both "public health" and "epidemiology". Instead of measuring the scientific output of the German epidemiology and/or public health research community directly, the study defined a list of journals considered to reflect "public health" (N = 156) and/or "epidemiology" (N = 76) and counted the publications therein. However, the list failed to include the highest-ranking journals where public health scientists and epidemiologists place their most important work, such as the Lancet, NEJM, BMJ, JAMA, and Nature Genetics, as well as the top-ranking disease-oriented medical journals where epidemiologic and/or public health research is published on a regular basis. Even journals from one of the All Science Journal Classification's (ASJC) classic public health categories, "Public, Environmental and Occupational Health", were partly omitted, thereby overlooking some of the highest-ranking journals.

The consequence of this methodological decision is immediately evident in the report's summary table, which also appears as an annex in the full Leopoldina report (Leopoldina 2015), listing the "top ten" most productive German institutions. The highest-ranking institution published 154 articles during the 13-year study period, while the 10th-ranked institution published 73. Public health researchers immediately understand that these numbers are impossibly low: 154 articles over 13 years amounts to roughly a dozen per year, approximately the output of an average researcher rather than an entire institution. However, scientists and decision makers from other fields may not realize that the report grossly failed to meet its objective. As in all sciences, the methodology is a crucial determinant of failure or success; embarrassed by these misleading numbers, I decided to evaluate the method in three ways. First, as a public health researcher active in environmental epidemiology for the last 25 years, I evaluated the journal list from an environmental health perspective, one of the classic pillars of public health science. Very early in my career, an epidemiologic study published ground-breaking work in …

N. Künzli, Swiss Tropical and Public Health Institute, Socinstrasse 57, 4002 Basel, Switzerland. E-mail: nino.Kuenzli@unibas.ch
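The core methodological flaw, counting only articles that appear in a predefined journal whitelist, can be made concrete with a minimal sketch. The journal names and publication records below are invented for illustration and are not drawn from the report or its data; only the whitelist-counting logic mirrors the approach criticized above.

```python
# Illustrative sketch only: hypothetical records showing how a
# journal-whitelist count undercounts an institution's output when
# high-impact general medical journals are excluded from the list.

# Hypothetical whitelist, standing in for the study's
# "public health" / "epidemiology" journal list.
WHITELIST = {
    "International Journal of Public Health",
    "American Journal of Epidemiology",
}

# Hypothetical publication records for one institution.
publications = [
    {"journal": "International Journal of Public Health", "year": 2005},
    {"journal": "The Lancet", "year": 2006},                        # not on the list
    {"journal": "New England Journal of Medicine", "year": 2008},   # not on the list
    {"journal": "American Journal of Epidemiology", "year": 2010},
]

# The whitelist-based count silently drops every article placed
# outside the predefined journal list.
whitelist_count = sum(p["journal"] in WHITELIST for p in publications)
total_count = len(publications)

print(f"Counted via journal whitelist: {whitelist_count}")  # 2
print(f"Actual institutional output:   {total_count}")      # 4
```

In this toy example, half of the institution's articles simply vanish because they appeared in general medical journals rather than on the whitelist, which is exactly how an entire institution's measured output can shrink toward that of a single researcher.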
