Abstract

Whole-genome sequencing (WGS) data generated by high-throughput DNA sequencing technologies remain an increasingly discussed but largely unexplored resource in the public health domain of quantitative microbial risk assessment (QMRA). This is due to challenges including the high dimensionality of WGS data and the heterogeneity of microbial growth phenotype data. This study provides an innovative approach for modeling the impact of population heterogeneity in microbial phenotypic stress response and integrates it into predictive models that take high-dimensional WGS data as input for more precise exposure assessment, using Listeria monocytogenes as an example. Finite mixture models were used to distinguish the number of sub-populations for each of the stress phenotypes: acid, cold, salt and desiccation. Machine learning predictive models were selected from six algorithms, with WGS data as input, to predict the sub-population membership of new strains with unknown stress response data. An example QMRA was conducted for cultured milk products using the strains of unknown stress phenotype to illustrate the significance of the findings of this study. Increased resistance to stress conditions leads to increased growth, a higher likelihood of exposure and a greater probability of illness. Neglecting within-species genetic and phenotypic heterogeneity in microbial stress response may over- or underestimate microbial exposure and the eventual risk in QMRA.
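As a rough illustration of the mixture-modeling step described above, the sketch below fits Gaussian finite mixture models with one to five components to a single stress phenotype and selects the number of sub-populations by BIC. The simulated phenotype values and the use of scikit-learn's GaussianMixture are assumptions for illustration only, not the study's exact implementation.

```python
# Minimal sketch: choosing the number of phenotypic sub-populations with a
# Gaussian finite mixture model and BIC. Data values are simulated placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Hypothetical phenotype: e.g., log10 reduction of each strain under acid stress.
phenotype = np.concatenate([rng.normal(1.0, 0.2, 60),   # sensitive sub-population
                            rng.normal(3.0, 0.4, 40)])  # tolerant sub-population
X = phenotype.reshape(-1, 1)

# Fit mixtures with 1-5 components and keep the BIC for each.
bic = {}
for k in range(1, 6):
    gmm = GaussianMixture(n_components=k, n_init=5, random_state=0).fit(X)
    bic[k] = gmm.bic(X)

best_k = min(bic, key=bic.get)
best_model = GaussianMixture(n_components=best_k, n_init=5, random_state=0).fit(X)
labels = best_model.predict(X)          # sub-population membership per strain
print(f"Selected {best_k} sub-populations; means: {best_model.means_.ravel()}")
```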

Highlights

  • Microbial risk assessment (MRA) has been adopted as a framework to enable weighing of options for public health protection and mitigation of the impact of exposures to microbial hazards [1,2]

  • The matrix of percent similarity between the 7343 genes in the pangenome and the assembled L. monocytogenes genomes was generated for use as input to the machine learning predictive models (see the sketch after this list)

  • We found that an increase in the proportion of tolerant L. monocytogenes resulted in a stronger association between the estimated number of cases per million and the increase in pathogen concentration during consumer storage
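The sketch below illustrates, under assumed data shapes, how a gene percent-similarity matrix of the kind described above could be used to train a classifier that assigns new strains to phenotypic sub-populations. The random forest shown here is only one plausible candidate among the six algorithms compared in the study, and the data are simulated placeholders.

```python
# Illustrative sketch (not the study's exact pipeline): predicting sub-population
# membership from a pangenome percent-similarity matrix.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_strains, n_genes = 200, 7343                         # one column per pangenome gene
similarity = rng.uniform(0, 100, size=(n_strains, n_genes))  # percent similarity values
subpop = rng.integers(0, 2, size=n_strains)            # known sub-population labels

clf = RandomForestClassifier(n_estimators=500, random_state=0)
scores = cross_val_score(clf, similarity, subpop, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")

# Fit on all labelled strains, then assign new strains with unknown phenotype.
clf.fit(similarity, subpop)
new_strains = rng.uniform(0, 100, size=(5, n_genes))
print(clf.predict(new_strains))
```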


Introduction

Microbial risk assessment (MRA) has been adopted as a framework to enable weighing of options for public health protection and mitigation of the impact of exposures to microbial hazards [1,2]. If the MRA is conducted based on available consumer-level food samples, direct assay of microbial concentration is possible at the point of consumption. This is often not the case, however, and it becomes expedient to model and project the impact of changes in conditions that may influence growth and inactivation of the microorganisms, starting from the concentration determined in food samples from other farm-to-fork steps. Lag phase is an adaptation period during which bacterial cells adjust to a new environment, after which they grow exponentially at the maximum growth rate (μmax) until growth reaches a plateau at the maximum population density, referred to as the stationary phase [4]. These post-contamination changes in microbial concentration, whether growth or inactivation, are influenced by food processing and storage environment conditions such as pH, organic acids, water activity (influenced by desiccation and salt concentration) and temperature. The impact of these conditions on microbial growth and/or inactivation can be described by secondary models (Figure 1) [4].
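To make the primary/secondary modeling idea concrete, the sketch below combines a simple lag-exponential-stationary primary model on the log10 scale with a Ratkowsky-type square-root secondary model linking μmax to storage temperature. The parameter values are placeholders for illustration, not estimates from this study.

```python
# Sketch of the primary/secondary modelling idea: a three-phase-style primary
# model (lag, exponential, stationary) driven by a square-root secondary model.
# Parameter values (b, Tmin, lag, Nmax) are placeholders, not fitted values.
import numpy as np

def sqrt_secondary(temp_c, b=0.023, t_min=-1.18):
    """Ratkowsky-type secondary model: sqrt(mu_max) = b * (T - Tmin); returns 1/h."""
    return (b * (temp_c - t_min)) ** 2

def primary_growth(t_h, n0_log, mu_max, lag_h, nmax_log=9.0):
    """Log10 concentration over time: no growth during lag, then exponential
    growth at mu_max, capped at the maximum population density nmax_log."""
    growth = mu_max / np.log(10) * np.maximum(t_h - lag_h, 0.0)
    return np.minimum(n0_log + growth, nmax_log)

# Example: growth during 7 days of consumer storage at 8 degrees C.
mu = sqrt_secondary(8.0)
times = np.linspace(0, 168, 8)   # hours
print(primary_growth(times, n0_log=1.0, mu_max=mu, lag_h=24.0))
```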
