- Research Article
- 10.3389/fams.2025.1694710
- Dec 9, 2025
- Frontiers in Applied Mathematics and Statistics
- Lauren L Luciani + 4 more
Children are uniquely susceptible to severe influenza infection, with one million children experiencing life-threatening disease each year. However, there is little evidence that an underdeveloped immune system or differences in viral load are responsible, implicating the host inflammatory response as the driver of increased lung injury in juveniles. Here, we used mechanism-based mathematical modeling, age-specific lung immune data from influenza-infected mice, Bayesian statistics, and rigorous Monte Carlo-based methods to identify immune mechanisms that may be differently regulated in juvenile animals. We hypothesized that the immunological mechanisms in juvenile and adult mice are largely conserved, and that differences in the immune response arise from a minimal set of parameter differences. First, we developed an ordinary differential equation (ODE) model of the innate immune response to influenza infection and identified parameter bounds that capture the dynamic changes of select parameters. Using publicly available juvenile and adult murine data, we then conducted a computational screen of age-specific model scenarios and evaluated them with the Akaike Information Criterion (AIC) and Bayesian Information Criterion (BIC) to select the optimal scenario. These results suggest that the production rate of JAK–STAT pathway activators, such as type I IFNs and IL-6, is age-specific. Preconditioned Monte Carlo (PMC) analysis revealed that JAK–STAT activator production is higher in juveniles than in adults. Additional simulations suggest that antiviral therapeutics may be more effective in juvenile populations. While not significantly suppressing virus replication, age-specific IFN or IL-6 production may be responsible for the increased inflammation, lung injury, and mortality observed in juvenile influenza infection.
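Under the usual Gaussian-error assumption, the AIC and BIC used to rank such model scenarios reduce to simple functions of each fit's residual sum of squares and free-parameter count. A minimal sketch with invented scenario names and numbers (not the paper's actual fits):

```python
import math

def aic_bic(ssr, n, k):
    """Gaussian-error information criteria from a least-squares fit:
    ssr = residual sum of squares, n = number of data points,
    k = number of free parameters."""
    aic = n * math.log(ssr / n) + 2 * k
    bic = n * math.log(ssr / n) + k * math.log(n)
    return aic, bic

# Hypothetical screen: each scenario frees a different subset of
# age-specific parameters (k) and yields a fit quality (ssr).
scenarios = {
    "shared_params":    (12.4, 40, 6),
    "age_specific_IFN": (7.1,  40, 7),
    "age_specific_all": (6.8,  40, 12),
}
scores = {name: aic_bic(ssr, n, k) for name, (ssr, n, k) in scenarios.items()}
best = min(scores, key=lambda s: scores[s][1])  # select by BIC
```

The penalty term k·ln(n) is what lets a scenario freeing a single age-specific parameter beat a fully age-specific scenario of similar fit quality, which matches the minimal-differences hypothesis above.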
- Research Article
- 10.3389/fams.2025.1692890
- Nov 25, 2025
- Frontiers in Applied Mathematics and Statistics
- Olivier White + 3 more
- Research Article
- 10.3389/fams.2025.1698123
- Nov 25, 2025
- Frontiers in Applied Mathematics and Statistics
- Maeti Antoinette George
Global climate change affects human populations as well as aquatic and terrestrial ecosystems, highlighting the interconnected nature of intervention strategies that seek to improve human health. Statistical applications and mathematical modeling are pivotal for quantifying future outcomes and understanding the effects of climate change on disease. This study aimed to identify the extent to which predictive modeling has been utilized in Lesotho in relation to the impact of drought on the spread of disease in humans. It is a systematic review of the literature focused on projections and predictions of impacts on ecosystems and biodiversity in vulnerable communities, and ultimately on human health. We also reviewed the extent of research on the use of models to curb the spread of climate-related diseases, and on their effectiveness, so that countries can respond promptly. We conclude that predictive modeling has not been applied in Lesotho, and that this risks crop failure, disease spread, anxiety, and mental health problems for the affected communities. If applied to the interconnected threats brought by drought, statistical and predictive disease modeling would enable an understanding of how and when diseases may spread and how their spread can be controlled. The review recommends that southern African countries develop predictive models using available hydrological, meteorological, and disease data. Decision-makers should also use climate and disease forecasts, provided these are supported by available climate and health data.
- Research Article
- 10.3389/fams.2025.1660916
- Nov 6, 2025
- Frontiers in Applied Mathematics and Statistics
- Shamsul Rijal Muhammad Sabri + 1 more
Modeling income distributions is crucial for understanding inequality and providing evidence-based policy support. A key challenge, however, lies in evaluating the extent to which household income inflates over time. While income is inherently random, it exhibits a persistent upward trend, and fitting income distributions using conventional models often leads to inconsistent parameter estimates. This highlights the necessity of explicitly incorporating inflation-adjusted scaling to preserve proper statistical properties. To address this gap, we introduce the Scale-Inflated Gamma (SIG) distribution, which extends the standard Gamma distribution by including an inflation-adjusted scale parameter (δ), thereby providing greater flexibility in capturing heterogeneous income dynamics. Standard models such as the Lognormal, Pareto, or Generalized Beta of the Second Kind (GB2) systematically underestimate upper-tail incomes and fail to capture inflation-adjusted heterogeneity across subgroups (B40, M40, T20). The SIG model, in contrast, strikes a balance between parsimony and flexibility by directly adjusting for inflationary scale shifts. For instance, while the Gamma distribution underestimates the 95th percentile by 10%–12% in 2019, the SIG model reduces this bias to approximately 3%, accurately reflecting income dynamics across B40, M40, and T20 groups. We develop the theoretical foundations of the SIG distribution by deriving its probability density function (PDF), cumulative distribution function (CDF), and moments. Parameters are initially estimated using the method of moments and then refined through maximum likelihood estimation (MLE). To assess estimator precision, we derive the Fisher information matrix, using the inverse Hessian to approximate the variance–covariance matrix, thus ensuring reliable inference. A Monte Carlo simulation study is conducted to evaluate the consistency and efficiency of the estimators under various sample sizes. 
The SIG model is subsequently applied to Malaysian Household Income Survey (HIS) data spanning the period from 2007 to 2022. Results demonstrate that the SIG distribution offers a superior fit for modeling income inequality and upper-tail behavior compared to conventional models. Overall, the study establishes the SIG distribution as a theoretically robust and policy-relevant framework for analyzing income patterns in inflation-sensitive and structurally diverse economies.
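The abstract does not spell out the SIG parameterization. One plausible reading is a Gamma density whose scale grows geometrically with time through the inflation parameter δ; the sketch below is a hedged guess under that assumption, not the authors' exact PDF:

```python
import math

def sig_pdf(x, alpha, theta, delta, t):
    """Hypothetical Scale-Inflated Gamma density: a Gamma(alpha, scale)
    whose scale grows geometrically over time t via the inflation
    parameter delta. One plausible reading of the abstract, not the
    authors' exact parameterization."""
    if x <= 0:
        return 0.0
    scale = theta * (1.0 + delta) ** t  # inflation-adjusted scale
    return (x ** (alpha - 1) * math.exp(-x / scale)
            / (math.gamma(alpha) * scale ** alpha))
```

Setting delta = 0 recovers the standard Gamma density, which is the nesting that comparisons against the plain Gamma model rely on.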
- Research Article
- 10.3389/fams.2025.1673247
- Nov 4, 2025
- Frontiers in Applied Mathematics and Statistics
- Oksana Pavlenko
To model potential structural shifts in data that depend on their historical values, different smooth transition autoregressive (STAR) models are constructed and compared for changes in the unemployment rate among 15–75-year-old residents of Latvia, including the popular LSTAR, ESTAR, and LSTAR2 models, as well as the recently introduced ASTAR model with an asymmetric transition function. For estimation, we use special modifications of the tsDyn R package's function for the classical logistic smooth transition autoregressive (LSTAR) model, the only such function available. The constructed models are also compared with a linear autoregressive (AR) model, an autoregressive model with Generalized Autoregressive Conditional Heteroscedastic (GARCH) errors, and a self-exciting threshold model. The first lag of the dependent variable and the inflation rate are used as threshold variables. LSTAR2 with the first lag as the threshold variable provides the best fit to these data among the constructed models. However, other STAR models may provide a significantly better out-of-sample forecast: in terms of RMSE, the ASTAR out-of-sample forecasts perform better across different horizons. Using the inflation rate as an external threshold variable does not improve the model. The study indicates that the new R functions may be useful for economic data analysis.
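The STAR variants compared here differ mainly in their transition functions. For orientation, the standard first- and second-order logistic transitions behind LSTAR and LSTAR2 can be sketched as follows (the ASTAR asymmetric transition is not reproduced here):

```python
import math

def G_lstar(s, gamma, c):
    """First-order logistic transition: smooth switch from regime 1
    (G ~ 0 for s well below c) to regime 2 (G ~ 1 for s well above c);
    gamma controls the smoothness of the switch."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c)))

def G_lstar2(s, gamma, c1, c2):
    """Second-order logistic transition (LSTAR2): the outer regime is
    active for both low and high values of the threshold variable s,
    the middle regime between c1 and c2."""
    return 1.0 / (1.0 + math.exp(-gamma * (s - c1) * (s - c2)))
```

The threshold variable s would be the first lag of the unemployment rate (or the inflation rate) in the models above; as gamma grows, both functions approach the sharp switching of a threshold autoregression.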
- Research Article
- 10.3389/fams.2025.1670077
- Nov 4, 2025
- Frontiers in Applied Mathematics and Statistics
- Jacques Demongeot + 3 more
Introduction The epidemic transition that took place in Europe and North America during the twentieth century, with the historical decline of infectious disease epidemics, gradually diverted physicians' attention from the world of "microbes." However, recent epidemics have made the surveillance of new microorganisms, particularly viruses, in the general population a new public health priority. Methods Most of the highly sophisticated mathematical models currently in use have failed to accurately predict and describe the latest emerging epidemics (mad cow disease, H1N1 swine flu, Covid-19, etc.). Predicting the occurrence of an epidemic remains almost as challenging today as it was in 1760, when D. Bernoulli defined the notion of endemicity and successfully proposed his famous SI equation to describe epidemic dynamics, applying it to smallpox epidemics. It may therefore be more fruitful to return to the historical, more pragmatic approach, especially in a context of uncertainty, by favoring simpler but robust mathematical models that better reflect the basic principles governing the interactions of microorganisms with their hosts under given environmental and exposure conditions. For this reason, we use the Bernoulli model together with parameters derived from the empirical distribution of new daily or weekly cases observed. Results Using the empirical distribution of new cases and the revisited SI model, we studied the predictive power of the dispersion index of new cases. The applications proposed to illustrate our approach concern the Covid-19 epidemic in various developed and developing countries, as well as the dengue epidemic in the French Antilles. The results show that, except where vaccination reduces its anticipatory capacity, the dispersion index has predictive power for the occurrence of epidemic peaks.
Discussion One limitation of this study is that it relies on official data that are sometimes affected by changes in health policies (recommendations, monitoring indicators, data collection methods, etc.), but we believe that the impact on the quality of the demonstration remains modest.
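The dispersion index referred to above is commonly the variance-to-mean ratio of case counts; a rolling version over a short window gives a minimal sketch of the early-warning signal (the window length and exact estimator here are illustrative assumptions, not the paper's choices):

```python
from statistics import mean, variance

def dispersion_index(new_cases, window=7):
    """Rolling variance-to-mean ratio of daily new cases. Values rising
    well above 1 (over-dispersion) are read as an early-warning signal
    of an approaching epidemic peak; values near 1 are consistent with
    Poisson-like stationary incidence."""
    out = []
    for i in range(window, len(new_cases) + 1):
        w = new_cases[i - window:i]
        m = mean(w)
        out.append(variance(w) / m if m > 0 else 0.0)
    return out
```

On a flat series the index stays at zero, while exponential growth within the window drives it up, which is the qualitative behavior exploited as a peak predictor.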
- Research Article
- 10.3389/fams.2025.1644869
- Oct 28, 2025
- Frontiers in Applied Mathematics and Statistics
- Martha Takane + 4 more
Synchronous regulated biological networks are often represented as logical diagrams in which the precise interactions between elements remain obscured. Here, we introduce a novel type of excitation–inhibition graph based on Boolean logic, which we term a "logical directed graph" or, simply, a "logical digraph." The logical digraph facilitates the representation of every conceivable regulatory interaction among elements, grounded in Boolean interactions, and encodes information about the connectivity, dynamics, limit cycles, and attractors of the network. As a proof of application, we used the logical digraph to analyze the operation of the well-known neural network that produces oscillatory swimming in the mollusk Tritonia. Our method enables a seamless transition between a regulatory network and its corresponding logical digraph, and vice versa. Additionally, we demonstrate that the spectral properties of the so-called state matrix provide mathematical evidence for why the elements within attractors and limit cycles carry information about the dynamics of the biological system. More specifically, the non-zero entries of the Perron–Frobenius eigenvector of the state matrix indicate the attractors and limit cycles of the network. We show that each connected component of the regulatory network has exactly one attractor or limit cycle. Open software routines are available for calculating the components of the network, as well as its attractors and limit cycles. This approach opens new possibilities for visualizing and analyzing regulatory networks in biology.
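On small networks, the attractors and limit cycles of a synchronous Boolean network can be recovered by brute force, which is a useful cross-check on the state-matrix eigenvector result. A sketch using a hypothetical two-node mutual-inhibition network (not the Tritonia circuit):

```python
from itertools import product

def attractors(update, n):
    """Enumerate attractors (fixed points and limit cycles) of a
    synchronous Boolean network with n nodes: iterate the update map
    from every state until the trajectory revisits a state; the
    repeated segment is the attractor."""
    found = set()
    for state in product((0, 1), repeat=n):
        seen = []
        while state not in seen:
            seen.append(state)
            state = update(state)
        cycle = seen[seen.index(state):]
        # canonical rotation so the same cycle is counted once
        k = cycle.index(min(cycle))
        found.add(tuple(cycle[k:] + cycle[:k]))
    return found

# Hypothetical 2-node mutual-inhibition network (not the Tritonia circuit):
# each node is ON next step iff the other node is OFF now.
toggle = lambda s: (1 - s[1], 1 - s[0])
cycles = attractors(toggle, 2)
```

Every trajectory of this toy network ends in one of three attractors: the two fixed points (0, 1) and (1, 0), and the 2-cycle (0, 0) ↔ (1, 1).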
- Research Article
- 10.3389/fams.2025.1650059
- Oct 21, 2025
- Frontiers in Applied Mathematics and Statistics
- Yaqin Liao + 3 more
In longitudinal studies, treatments are often assigned as a sequence to achieve a certain outcome of interest. The blip effect of a treatment in the sequence is its net effect on the outcome. In this article, we introduce a method for estimating and testing blip effects via the standardized point effects of treatments in the sequence. First, we apply available methods to estimate the point effects of single-point treatments. Then we standardize the point effects over a small number of strata relevant to the blip effects of interest. Finally, we use the standardized point effects to estimate and test the blip effects. Our method addresses two issues in complex longitudinal studies: dimension reduction without strict treatment-assignment conditions, and a targeted analysis of the blip effects of interest across different times. A simulation study shows that our method achieves unbiased estimates of the blip effects, maintains nominal coverage probability, and demonstrates high power for hypothesis testing. A medical example illustrates the application of our method in observational studies.
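At its core, the standardization step is a weighted collapse of stratum-specific point effects into a single summary. A generic sketch of that step only (the paper's construction of the strata relevant to the blip effects is more involved):

```python
def standardize_point_effects(effects, weights):
    """Collapse stratum-specific point effects of a treatment into one
    standardized effect by weighting each stratum by its population
    share. A generic standardization step, not the paper's full
    procedure for targeting blip effects."""
    total = sum(weights)
    return sum(e * w for e, w in zip(effects, weights)) / total
```

For example, two strata with effects 2.0 and 0.0 and population shares 3:1 standardize to 1.5.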
- Research Article
- 10.3389/fams.2025.1632271
- Oct 20, 2025
- Frontiers in Applied Mathematics and Statistics
- Francesco Cavarretta
Recent advances in computational resources have enabled the development of large-scale, biophysically detailed brain models, which require numerous three-dimensional neuron morphologies exhibiting realistic cell-to-cell variability. However, the limited availability of experimental reconstructions restricts parameter estimation for many morphology synthesis algorithms, which typically rely on extensive datasets. Here, we propose enhancing our branching-and-annihilating random walk method by incorporating a set of mathematical equations that estimate branching and annihilation probabilities directly from Sholl plots and branch point counts. Because these morphological metrics are commonly reported in the literature, our approach facilitates the generation of realistic three-dimensional morphologies even in the absence of experimental reconstructions.
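One plausible reading of the probability-estimation idea is to attribute a net gain of Sholl intersections between consecutive shells to branching and a net loss to annihilation. A hedged sketch under that assumption, not the authors' actual equations:

```python
def branch_annihilation_probs(sholl_counts):
    """From a Sholl plot (number of intersections per concentric shell),
    estimate per-shell branching and annihilation probabilities for a
    branching-and-annihilating random walk: a net gain of intersections
    between shells is attributed to branching events, a net loss to
    annihilation. A simplified reading of the abstract."""
    probs = []
    for s0, s1 in zip(sholl_counts, sholl_counts[1:]):
        delta = s1 - s0
        p_branch = max(delta, 0) / s0 if s0 else 0.0
        p_annih = max(-delta, 0) / s0 if s0 else 0.0
        probs.append((p_branch, p_annih))
    return probs
```

A rising Sholl profile thus maps to branching-dominated shells and the falling tail of the profile to annihilation-dominated ones.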
- Research Article
- 10.3389/fams.2025.1640044
- Oct 17, 2025
- Frontiers in Applied Mathematics and Statistics
- Aram Kamal Faraj + 2 more
The fitness-dependent optimizer (FDO) has recently gained attention as an effective metaheuristic for solving diverse optimization problems. However, it faces limitations in exploitation and convergence speed. To overcome these challenges, this study introduces two enhanced variants: enhancing exploitation through stochastic boundary for FDO (EESB-FDO) and enhancing exploitation through boundary carving for FDO (EEBC-FDO). In addition, the ELFS strategy is proposed to constrain Lévy flight steps, ensuring more stable exploration. Experimental results show that these modifications significantly improve the performance of FDO over the original version. To evaluate EESB-FDO and EEBC-FDO, three categories of benchmark test functions were utilized: classical, CEC 2019, and CEC 2022. The assessment was further supported by statistical analysis methods to ensure a comprehensive and rigorous performance evaluation. The proposed algorithms were compared with several existing FDO modifications, as well as with other well-established metaheuristics, including the Arithmetic Optimization Algorithm (AOA), the Learner Performance-Based Behavior Algorithm (LPB), the Whale Optimization Algorithm (WOA), and the Fox-inspired Optimization Algorithm (FOX). The statistical analysis indicated that both EESB-FDO and EEBC-FDO outperform the aforementioned algorithms. Finally, EESB-FDO and EEBC-FDO were applied to four real-world optimization problems: the gear train design problem, the three-bar truss problem, the pathological IgG fraction in the nervous system, and the integrated cyber-physical attack on a manufacturing system.
The results demonstrate that both proposed variants significantly outperform both the FDO and the modified fitness-dependent optimizer (MFDO) in solving these complex problems.
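The two boundary-handling ideas named in the variants can be sketched generically. These are the two standard repair strategies the names suggest; the paper's exact EESB/EEBC update rules are not reproduced here:

```python
import random

def stochastic_boundary(x, lo, hi, rng=random):
    """When a candidate solution leaves the search box, resample each
    offending coordinate uniformly inside the bounds (one common
    'stochastic boundary' repair; the paper's EESB rule may differ)."""
    return [xi if lo <= xi <= hi else rng.uniform(lo, hi) for xi in x]

def boundary_carving(x, lo, hi):
    """Clamp out-of-range coordinates onto the nearest bound, 'carving'
    the candidate back onto the edge of the search box."""
    return [min(max(xi, lo), hi) for xi in x]
```

The trade-off is standard: clamping concentrates repaired candidates on the box edges (good for optima near bounds), while resampling preserves diversity inside the feasible region.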