Abstract

Over half of all patients admitted to intensive care units (ICUs) receive one or more antimicrobial agents, most of them for the treatment of community-acquired and/or nosocomial infections [1]. In a surveillance study, antibiotic use and related costs were prospectively analysed in a general ICU over a 1-year period. Antibiotics were prescribed in 61% of admissions. Categorised by indication, 59% of all antibiotic prescriptions were for bacteriologically proven infections, 28% for non-bacteriologically proven infections and 13% for prophylaxis [2]. In the absence of a precise diagnosis of infection, the modern practice of critical care medicine dictates that empirical antibiotic therapy be given [3]. Providing early antimicrobial therapy that is effective against the microorganisms responsible for infections in hospitalised patients is recognised as a crucial step in the treatment of acquired infections, along with drainage of infected fluid collections and debridement or removal of infected tissues or prostheses [4]. Promptly initiating appropriate empirical antimicrobial therapy, at the first sign of infection, in patients with life-threatening illness appears to be the most effective treatment approach [5]. Traditionally, empirical treatment has started with an inexpensive narrow-spectrum agent, with therapy broadened only if a multi-resistant pathogen is identified or the patient deteriorates. This approach may have served to continuously select, from a heterogeneous bacterial population, strains that are first-step resistant mutants, thus starting us on the slippery slope to resistance crises [3].
