Abstract

The evolution of antimicrobial resistance can be strongly affected by variations of antimicrobial concentration. Here, we study the impact of periodic alternations of absence and presence of antimicrobial on resistance evolution in a microbial population, using a stochastic model that includes variations of both population composition and size, and fully incorporates stochastic population extinctions. We show that fast alternations of presence and absence of antimicrobial are inefficient at eradicating the microbial population and strongly favor the establishment of resistance, unless the antimicrobial sufficiently increases the death rate. We further demonstrate that if the period of alternations is longer than a threshold value, the microbial population goes extinct upon the first addition of antimicrobial, unless it is rescued by resistance. We derive an expression for the probability that the population is eradicated upon the first addition of antimicrobial, assuming rare mutations. Rescue by resistance can happen either if resistant mutants preexist, or if they appear after antimicrobial is added to the environment. Importantly, the latter case is fully prevented by perfect biostatic antimicrobials that completely stop division of sensitive microorganisms. By contrast, we show that the parameter regime where treatment is efficient is larger for biocidal drugs than for biostatic drugs. This sheds light on the respective merits of different antimicrobial modes of action.

Highlights

  • Antibiotics and antivirals allow many major infectious diseases to be treated

  • Under what circumstances are microbial populations eradicated by antimicrobials, and when are they rescued by resistance? We address these questions employing a stochastic model that incorporates variations of both population composition and size

  • Fast alternations strongly select for resistance and are inefficient at eradicating the microbial population, unless the antimicrobial sufficiently increases the death rate

Introduction

Antibiotics and antivirals allow many major infectious diseases to be treated. However, with the increasing use of antimicrobials, pathogenic microorganisms tend to become resistant to these drugs, rendering them useless. The evolution of antimicrobial resistance often occurs in a variable environment, as antimicrobial is added to and removed from a medium or given periodically to a patient [3, 4]. To address how variations of antimicrobial concentration impact resistance evolution, we investigate theoretically the de novo acquisition of resistance in a microbial population in the presence of alternating phases of presence and absence of antimicrobial. This situation can represent, for example, a treatment where the concentration within the patient falls below the Minimum Inhibitory Concentration (MIC) between drug intakes [10], which is a realistic case [10, 11].
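The kind of dynamics described above can be illustrated with a minimal Gillespie-type simulation. The sketch below is not the paper's exact model; all parameter names and values (carrying capacity `K`, fitnesses `f_S`, `f_R`, death rates `g`, `g_drug`, mutation probability `mu`, half-period `T_alt`) are illustrative assumptions. It tracks sensitive and resistant counts in a population with logistic competition, where a biocidal antimicrobial raises the sensitive death rate during alternating half-periods:

```python
import random

def simulate(T_alt=50.0, t_max=200.0, K=100, f_S=1.0, f_R=0.9,
             g=0.1, g_drug=1.0, mu=1e-2, seed=0):
    """Illustrative Gillespie simulation (a sketch, not the paper's model).

    Two types: sensitive (S) and resistant (R). The antimicrobial is
    modeled as biocidal: during alternating half-periods of length T_alt
    (drug absent first), it raises the sensitive death rate from g to
    g_drug. Division rates carry logistic competition (1 - N/K), and each
    sensitive division yields a resistant mutant with probability mu.
    Phase boundaries are checked only at event times (a simplification).
    """
    rng = random.Random(seed)
    S, R, t = K // 2, 0, 0.0
    while t < t_max:
        N = S + R
        if N == 0:
            return t, S, R  # population extinct
        drug_on = int(t // T_alt) % 2 == 1
        comp = max(0.0, 1.0 - N / K)
        rates = [
            f_S * comp * S,                  # sensitive division
            f_R * comp * R,                  # resistant division
            (g_drug if drug_on else g) * S,  # sensitive death
            g * R,                           # resistant death
        ]
        total = sum(rates)
        t += rng.expovariate(total)          # time to next event
        r = rng.uniform(0.0, total)          # pick which event occurs
        if r < rates[0]:
            if rng.random() < mu:
                R += 1                       # mutation at division
            else:
                S += 1
        elif r < rates[0] + rates[1]:
            R += 1
        elif r < rates[0] + rates[1] + rates[2]:
            S -= 1
        else:
            R -= 1
    return t, S, R
```

Running `simulate` with a short half-period `T_alt` versus a long one gives a qualitative feel for the regimes discussed in the paper: with fast alternations the sensitive population can recover between drug phases, while a long first drug phase tends to drive it extinct unless resistant mutants have arisen.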
