Articles published on Monte Carlo Simulation
- Research Article
- 10.1515/bmt-2025-0491
- Jan 22, 2026
- Biomedizinische Technik. Biomedical engineering
- Ivan R Pavlović + 7 more
This paper presents an experimental-numerical method for modeling and analyzing stochastic systems. For this purpose, several machine learning prediction models are trained on data generated with the Monte Carlo simulation method. The method is demonstrated on experimental data from kidney transplantation under a tacrolimus-based immunosuppressive protocol. A multivariate regression model had been constructed by previous authors from a clinical study whose independent variables were key physiological parameters, such as serum creatinine and estimated glomerular filtration rate (eGFR) six months after transplantation, together with the pharmacokinetics of tacrolimus, including the dose-adjusted trough concentration (C0/D) and intrapatient variability (IPV); eGFR between 13 and 36 months after transplantation was the dependent variable. Using the Monte Carlo simulation method, this model is applied to generate the data needed to optimize the prediction models. To determine the optimal prediction model, the DecisionTreeClassifier, RandomForestClassifier, and XGBClassifier were trained and compared. The results indicate that XGBoost is the most accurate, reliable, and generalizable of the classifiers tested, and that Monte Carlo simulation represents a significant methodological advance in the field of kidney transplantation. Advanced numerical methods for kidney transplant therapy are a step forward in the optimization of current immunosuppressive protocols.
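As a rough illustration of this workflow, the sketch below Monte Carlo-samples a made-up regression model and compares the three named classifiers with scikit-learn and the xgboost package; every distribution and coefficient is an assumption for illustration, not a value from the paper.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

rng = np.random.default_rng(0)
n = 5000

# Monte Carlo sampling of the predictors (hypothetical distributions).
creatinine = rng.normal(1.3, 0.4, n)      # serum creatinine, mg/dL
egfr_6m = rng.normal(55.0, 15.0, n)       # eGFR at 6 months
c0_d = rng.lognormal(0.5, 0.4, n)         # dose-adjusted trough concentration
ipv = rng.normal(20.0, 8.0, n)            # intrapatient variability, %

# Stand-in multivariate regression for the late eGFR outcome (made-up coefficients).
egfr_late = (0.8 * egfr_6m - 12.0 * creatinine + 2.0 * c0_d
             - 0.3 * ipv + rng.normal(0.0, 5.0, n))

X = np.column_stack([creatinine, egfr_6m, c0_d, ipv])
y = (egfr_late >= np.median(egfr_late)).astype(int)   # binarized outcome

for model in (DecisionTreeClassifier(max_depth=5),
              RandomForestClassifier(n_estimators=200),
              XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss")):
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{type(model).__name__:>22}: CV accuracy = {acc:.3f}")
```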
- Research Article
- 10.1088/2057-1976/ae3764
- Jan 21, 2026
- Biomedical Physics & Engineering Express
- Héctor M Garnica-Garza
Objective. In photon beam radiotherapy, modern delivery techniques have made it possible to substantially reduce the beam energy needed for the safe and efficient irradiation of deep-seated targets, with even Co-60 beams now able to irradiate targets at any depth. The purpose of this work is to determine whether, for electron radiotherapy, advanced beam delivery techniques allow the use of beam energies currently available in the clinic to treat target sites usually reserved for photons or very high energy charged particles. Methods. Segmented computed tomography images from three sites (brain, lung, and prostate) were used to model radiotherapy treatments in two modalities: conformal 3D and converging small field. Monte Carlo simulation was used to calculate the absorbed dose distribution in each patient for conformal 3D very-high-energy plans and for converging small-field, low-energy (<50 MeV) electrons. For comparison, converging small-field plans for 6 MV x-ray beams were also calculated. Main results. It is shown that, for the three test cases simulated in this work, electrons with energies in the 20-25 MeV range delivered via the converging small-field modality can produce treatment plans that rival those obtained via conformal very high energy electrons in terms of target dose homogeneity and sparing of the organs at risk. Furthermore, such electron plans also compare well to those obtained with the photon beams. Significance. While the consensus has long been that reaching deeper tumors requires higher electron energies, on the order of 150-200 MeV, this work shows that this is not the case: when small, circular electron fields are delivered in a converging manner, energies below 30 MeV are enough to properly irradiate tumors located at relevant radiological depths for a variety of treatment sites.
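For readers unfamiliar with the method, a Monte Carlo dose calculation scores the energy that sampled particle histories deposit in voxels. The toy below illustrates only that tallying idea for a pencil beam in a 1D water slab; it ignores electron transport, scatter, and dose buildup, and every constant is invented rather than taken from this work.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1D Monte Carlo depth-dose tally (illustrative only; real treatment
# planning uses full particle transport). All constants are invented.
mu = 0.005          # interactions per mm of water (hypothetical)
frac_local = 0.4    # fraction of remaining energy deposited per interaction
depth_max = 300     # slab thickness, mm
dose = np.zeros(depth_max)

for _ in range(100_000):
    z, e = 0.0, 1.0                        # depth (mm), normalized energy
    while e > 0.01:
        z += rng.exponential(1.0 / mu)     # free path to next interaction
        if z >= depth_max:
            break                          # particle leaves the slab
        dep = frac_local * e
        dose[int(z)] += dep                # score energy in a 1 mm voxel
        e -= dep

print("relative dose at 0/100/200 mm:",
      np.round(dose[[0, 100, 200]] / dose.max(), 2))
```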
- Research Article
- 10.1080/09603123.2026.2616406
- Jan 21, 2026
- International journal of environmental health research
- Hamed Soleimani + 8 more
Fluoride is recognized for its dental health benefits; however, excessive intake during infancy may pose risks such as dental fluorosis. This review evaluated fluoride levels in breast milk (BrM), influencing factors, and potential health effects in infants. The methodology involved searching PubMed, Scopus, Web of Science, IranDoc, Science Direct, and Google Scholar for studies published from 1974 to 2025. The search strategy included Medical Subject Headings (MeSH) terms and free-text keywords - such as "fluoride," "breast milk," "breastfeeding," "human," and "level," along with other relevant terms - combined with Boolean operators (AND/OR) for a comprehensive literature search. Inclusion criteria: peer-reviewed studies with original data on fluoride in BrM. Exclusion criteria: informal reports, reviews, and studies without primary data. Of the 204 records, 9 studies were included in the final analysis. Because raw data were unavailable, values were re-simulated in Excel (2016) from each study's mean, standard deviation, and sample size. Results showed that the mean fluoride in BrM met the European Food Safety Authority (EFSA) guideline (100 µg/L) in 12 of 19 cases (63.2%), with 36.8% outside this range. The health risk assessment showed that, for one-month-old infants, the hazard quotient (HQ) exceeded the permissible limit (HQ = 1) in 7 of 19 cases (36.8%).
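A minimal sketch of the re-simulation and hazard-quotient step, assuming illustrative summary statistics and exposure parameters; the intake, body weight, and reference dose below are generic textbook-style values, not the review's.

```python
import numpy as np

rng = np.random.default_rng(42)

# Re-simulate breast-milk fluoride concentrations from one study's summary
# statistics, then propagate to an infant hazard quotient. All numbers are
# illustrative placeholders, not values from the review.
mean_f, sd_f = 0.10, 0.05        # reported mean and SD, mg F/L
draws = 100_000
conc = np.clip(rng.normal(mean_f, sd_f, draws), 0.0, None)

intake = 0.75                    # breast-milk intake for a 1-month-old, L/day
weight = 4.5                     # body weight, kg
rfd = 0.06                       # assumed oral reference dose, mg/kg/day

edi = conc * intake / weight     # estimated daily intake, mg/kg/day
hq = edi / rfd                   # hazard quotient; HQ > 1 flags potential risk
print(f"mean HQ = {hq.mean():.2f}, P(HQ > 1) = {np.mean(hq > 1):.3f}")
```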
- Research Article
- 10.1108/ec-10-2024-0953
- Jan 21, 2026
- Engineering Computations
- Ibidun Christiana Obagbuwa + 3 more
Purpose. To design and build a simulation model that mimics customer behaviour using discrete event simulation. The model incorporates a multi-channel queueing system in which multiple servers attend to customers in line, improving service quality and reducing queues. The simulation model also helps identify gaps in the restaurant's operations.
Design/methodology/approach. The research utilizes Arena simulation software to develop a model that simulates a typical South African restaurant. The model focuses on customer behaviour, operational flow and resource management, including staff and equipment. The simulation logic models customer flow through the restaurant: upon arrival, customers are either seated immediately or wait in a queue if no seats are available; once seated, customers place orders, which are passed to the kitchen for preparation. The time taken for each process (e.g. seating, ordering and food preparation) is recorded and analyzed for potential bottlenecks (a minimal sketch of this kind of queueing logic follows the abstract).
Findings. The use of discrete event simulation (DES) in conjunction with Arena simulation software offers a practical method for improving restaurant operations in South Africa. By modelling customer behaviour and operational processes, restaurant managers can identify critical bottlenecks and implement changes to improve efficiency, especially during peak periods. This study demonstrates that careful attention to staff scheduling, resource allocation and the layout of restaurant processes can greatly enhance customer satisfaction and operational effectiveness. The simulation results can serve as a decision-support tool for restaurant owners to test various strategies without disrupting actual operations.
Research limitations/implications. Further research could expand on this work by incorporating machine learning to predict customer behaviour trends, or by developing more detailed models that account for external factors such as load shedding, economic variability, or seasonal changes in customer patterns. Additional optimization algorithms could also be explored to improve service quality, and Monte Carlo simulation could be embedded within the discrete event simulation. More broadly, the future of DES in restaurant simulations lies in improving the realism of human behaviour models and integrating new technologies such as artificial intelligence, real-time simulation and decision support, and hybrid simulation models.
Practical implications. The application of DES in modelling human behaviour in restaurants offers significant advantages in operational management. As the restaurant industry continues to evolve, leveraging such simulation techniques will be crucial for maintaining competitiveness and improving customer experiences. The interaction of people at a restaurant, from the moment a customer arrives and places an order until they receive their food, was modelled using discrete event simulation. The simulation can be used to assess the restaurant's performance, better understand the current situation, and evaluate proposed enhancements, including the potential duration of an order and the impact of system changes.
Social implications. The baseline scenario revealed that customer wait times were acceptable during off-peak hours but increased significantly during peak times. In the peak-hour scenario, customer abandonment rates (customers leaving without service due to long wait times) rose, suggesting that restaurants should implement dynamic staffing strategies to manage busy periods. The staff-shortage scenario highlighted the critical role of adequate staffing in maintaining service quality, as both wait times and staff utilization were adversely impacted.
Originality/value. To the best of our knowledge, this is the first research on modelling human behaviour using discrete event simulation for South African restaurants.
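The sketch referenced above: a minimal event-driven queue with Poisson arrivals, exponential service times, and several servers. It is a generic M/M/c toy, not the Arena model, and all rates are invented.

```python
import heapq, random

random.seed(7)

# Minimal discrete event sketch of a multi-server restaurant queue
# (a generic M/M/c toy, not the Arena model; all rates are invented).
ARRIVAL_RATE = 1 / 2.0     # customers per minute (one every 2 min on average)
SERVICE_RATE = 1 / 5.0     # service completions per minute (5 min average)
SERVERS = 3
HORIZON = 8 * 60.0         # simulate an 8-hour day, in minutes

free_at = [0.0] * SERVERS  # time at which each server next becomes free
heapq.heapify(free_at)
waits, t = [], 0.0

while True:
    t += random.expovariate(ARRIVAL_RATE)   # next customer arrival
    if t >= HORIZON:
        break
    start = max(t, free_at[0])              # earliest-available server
    waits.append(start - t)                 # time this customer queues
    heapq.heapreplace(free_at, start + random.expovariate(SERVICE_RATE))

print(f"served {len(waits)} customers; mean wait "
      f"{sum(waits) / len(waits):.1f} min, max wait {max(waits):.1f} min")
```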
- Research Article
- 10.1515/phys-2025-0264
- Jan 21, 2026
- Open Physics
- Minggao Li
Abstract A lattice gas model with short-range interactions is considered in this paper. Through Monte Carlo simulations, it was found that the system shows glassy relaxation behaviors. Moreover, an exotic hexagonal tiling order emerges at low temperatures. Similar real-space configurations have been observed in an alloy before. The novel order also brings about novel percolations. The order and the percolations have profound effects on the dynamics of the system and are responsible for the dynamic heterogeneity.
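A generic Metropolis sketch of this kind of simulation, assuming a nearest-neighbour attractive Hamiltonian and particle-conserving exchange moves; the paper's actual short-range interactions and parameters may differ.

```python
import numpy as np

rng = np.random.default_rng(3)

# Metropolis Monte Carlo for a 2D lattice gas with nearest-neighbour
# attraction and particle-conserving (Kawasaki) moves; a generic sketch
# with an invented Hamiltonian and parameters.
L, T, eps = 32, 0.4, 1.0                       # lattice size, temperature, bond energy
occ = (rng.random((L, L)) < 0.3).astype(int)   # ~30% site occupancy

def local_energy(i, j):
    # attraction of site (i, j) to its four neighbours, periodic boundaries
    nn = (occ[(i + 1) % L, j] + occ[(i - 1) % L, j]
          + occ[i, (j + 1) % L] + occ[i, (j - 1) % L])
    return -eps * occ[i, j] * nn

moves = ((0, 1), (1, 0), (0, -1), (-1, 0))
for _ in range(200_000):
    i, j = rng.integers(0, L, 2)
    di, dj = moves[rng.integers(4)]
    i2, j2 = (i + di) % L, (j + dj) % L
    if occ[i, j] != occ[i2, j2]:               # exchange a particle and a hole
        e_old = local_energy(i, j) + local_energy(i2, j2)
        occ[i, j], occ[i2, j2] = occ[i2, j2], occ[i, j]
        e_new = local_energy(i, j) + local_energy(i2, j2)
        if rng.random() >= np.exp(min(0.0, (e_old - e_new) / T)):
            occ[i, j], occ[i2, j2] = occ[i2, j2], occ[i, j]   # reject: undo

print("energy per site:",
      sum(local_energy(i, j) for i in range(L) for j in range(L)) / (2 * L * L))
```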
- Research Article
- 10.1002/rar2.70152
- Jan 21, 2026
- Rare Metals
- Fan Xue + 5 more
ABSTRACT Precise control of chemical ordering in bimetallic nanocatalysts offers a route to decouple activity and selectivity in acetylene hydrogenation, but direct atomic-scale evidence linking surface order to catalytic performance is scarce. Here, we tune Pd-Cu nanoalloys from chemically disordered to highly ordered states by composition control and targeted H2 thermal treatment. Under near-industrial conditions, all catalysts reach full acetylene conversion between 100°C and 120°C, whereas ethylene selectivity increases as chemical ordering increases. By combining synchrotron X-ray absorption fine structure (XAFS) with X-ray total scattering and reverse Monte Carlo (RMC) simulation, three-dimensional atomic models are reconstructed to quantitatively map surface coordination. Compared with the chemically disordered Pd-Cu nanocatalyst, the highly ordered PdCu shows an increase in average surface Pd-Cu coordination and an expansion of mean surface Pd-Pd separations. These structural features weaken ethylene binding and suppress re-adsorption, rationalizing the enhanced selectivity. Our study provides direct atomic-scale evidence connecting chemical order to catalytic selectivity and opens the way for the precise design of bimetallic nanocatalysts.
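Reverse Monte Carlo inverts the usual simulation logic: atomic positions are perturbed at random and moves are accepted when they improve the fit to measured data. A toy 2D version, fitting a pair-distance histogram in place of real scattering data (all parameters invented):

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy reverse Monte Carlo: random single-particle moves are accepted when
# they improve the match to a target pair-distance histogram, standing in
# for a fit to X-ray total-scattering data. All parameters are invented.
N, box, temp = 64, 10.0, 2.0
bins = np.linspace(0.0, box / 2, 26)

def pair_histogram(p):
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    return np.histogram(d[np.triu_indices(N, 1)], bins=bins)[0]

target = pair_histogram(rng.random((N, 2)) * box)   # pretend "experimental" data
pos = rng.random((N, 2)) * box
chi2 = float(((pair_histogram(pos) - target) ** 2).sum())

for _ in range(20_000):
    k = rng.integers(N)
    old = pos[k].copy()
    pos[k] = (pos[k] + rng.normal(0.0, 0.3, 2)) % box
    new = float(((pair_histogram(pos) - target) ** 2).sum())
    # Metropolis-style acceptance on the chi-square misfit
    if new <= chi2 or rng.random() < np.exp((chi2 - new) / temp):
        chi2 = new
    else:
        pos[k] = old                                # reject: restore position

print(f"final chi^2 misfit: {chi2:.1f}")
```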
- Research Article
- 10.1038/s41598-026-36558-3
- Jan 21, 2026
- Scientific reports
- Thomas G Lucas + 4 more
Intrabeam scattering (IBS) is a fundamental effect that can limit the performance of high-brightness electron machines but has so far been neglected in standard modeling of RF photoinjectors. Recent measurements at SwissFEL show that the slice energy spread (SES) in the injector is significantly underestimated in standard beam dynamics simulations. In this paper, we employ a dedicated Monte Carlo simulation model that accurately predicts IBS-induced SES growth in the photoinjector of an X-ray free-electron laser. The simulations are benchmarked against SES measurements at SwissFEL and are supported by a new analytical model. The results show that IBS-induced SES growth occurs throughout the injector, most prominently in the electron source, and must be included in performance assessments. We further demonstrate that while 5D brightness is largely conserved, the 6D brightness degrades with propagation, highlighting the need to account for IBS in accurate photoinjector design and optimization.
- Research Article
- 10.3847/1538-4357/ae25ea
- Jan 20, 2026
- The Astrophysical Journal
- Ming-Xuan Lu + 3 more
Abstract Type IIn supernovae (SNe IIn) are a subclass of core-collapse SNe in which strong interactions occur between the ejecta and dense circumstellar material, creating ideal conditions for the production of high-energy neutrinos. This makes them promising candidate sources of neutrinos. In this work, we conduct an association study between 163 SNe IIn observed by the Zwicky Transient Facility and 138 neutrino alert events detected by the IceCube Neutrino Observatory. After excluding alerts with poor localization, we find two SNe that are spatiotemporally coincident with neutrino events. IC 231027A and IC 250421A coincide with the positions of SN 2023syz and SN 2025cbj, respectively, within their localization uncertainties, and the neutrino arrival times are delayed by 38 days and 61 days relative to the discovery times of the corresponding SNe. Using Monte Carlo simulations, we estimate that the probability of two such events occurring by chance in our sample is p ∼ 0.67%, suggesting that they may originate from genuine physical associations, though the result is not yet statistically significant. Our model calculations, however, indicate that the likelihood of a neutrino originating from IC 231027A is low, implying that the association between IC 231027A and SN 2023syz is likely coincidental. Nevertheless, under optimistic parameters, the probability of detecting a neutrino from the whole SNe IIn sample could reach ≳6%, indicating that detecting neutrino emission from the SNe population may be possible. Our study provides a systematic analysis, combining statistical analysis and model calculations, to assess whether interacting supernovae can serve as potential sources of neutrino emission.
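A schematic version of such a chance-coincidence test: scramble the neutrino directions, recount spatiotemporal matches, and take the fraction of scrambles with at least two matches. The radius, time window, and survey span below are placeholders, not the paper's selection criteria; the real analysis uses per-alert localization regions.

```python
import numpy as np

rng = np.random.default_rng(11)

# Scramble neutrino sky positions and count how often >= 2 alerts land
# within 1 degree and within 100 days after any supernova (toy numbers).
N_SN, N_NU = 163, 138
RADIUS, WINDOW, SPAN = np.radians(1.0), 100.0, 2000.0

sn_ra = rng.uniform(0, 2 * np.pi, N_SN)
sn_dec = np.arcsin(rng.uniform(-1, 1, N_SN))   # isotropic declinations
sn_t = rng.uniform(0, SPAN, N_SN)              # discovery times, days
nu_t = rng.uniform(0, SPAN, N_NU)              # alert times, days

def n_matches():
    ra = rng.uniform(0, 2 * np.pi, N_NU)       # scrambled alert positions
    dec = np.arcsin(rng.uniform(-1, 1, N_NU))
    cos_sep = (np.sin(sn_dec)[:, None] * np.sin(dec)
               + np.cos(sn_dec)[:, None] * np.cos(dec)
               * np.cos(sn_ra[:, None] - ra))
    close = np.arccos(np.clip(cos_sep, -1.0, 1.0)) < RADIUS
    dt = nu_t - sn_t[:, None]                  # neutrino delay after each SN
    return int((close & (dt > 0) & (dt < WINDOW)).sum())

trials = np.array([n_matches() for _ in range(2000)])
print("chance probability of >= 2 coincidences:", np.mean(trials >= 2))
```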
- Research Article
- 10.57041/vhsfkx67
- Jan 20, 2026
- International Journal of Emerging Engineering and Technology
- Syeda Iqra Shakeel + 6 more
Lung fibrosis is a chronic and progressive illness involving pathologic tissue scarring that alters lung architecture and impairs respiratory function. It is caused by several factors, including idiopathic pulmonary fibrosis (IPF), radiation-related injury, tumour-related fibrosis, and post-COVID-19 complications, all of which share a similar pathophysiology. This review investigates and summarises studies published from 2015 to 2025 on the biological processes, clinical symptoms, and technological innovations in the diagnosis and monitoring of lung fibrosis. It emphasises how artificial intelligence (AI) and deep learning (DL) approaches, such as convolutional neural networks (CNNs) applied to quantitative computed tomography (CT), have enhanced early detection, disease classification, and disease progression forecasting. Computational models used to study fibrotic dynamics, such as agent-based and Monte Carlo simulations, are also discussed. In general, the combination of molecular knowledge, imaging, and AI-based systems can be considered a major next step toward personalized diagnosis and better treatment outcomes in chronic fibrotic lung diseases.
- Research Article
- 10.3390/jmse14020211
- Jan 20, 2026
- Journal of Marine Science and Engineering
- Odai R Bani Hani + 4 more
Efficient control of wave energy converters (WECs) is crucial for maximizing energy capture and reducing the Levelized Cost of Energy (LCoE). In this study, we employ a deep reinforcement learning (DRL) framework based on the Soft Actor-Critic (SAC) and Deep Deterministic Policy Gradient (DDPG) algorithms for WEC control. Our approach leverages a novel decoupled co-simulation architecture, training agents episodically in MATLAB and exporting a robust policy to the WEC-Sim environment. Furthermore, we use a rigorous benchmarking protocol to compare the SAC and DDPG agents against a classical Bang-Singular-Bang (BSB) optimal control benchmark. Evaluation under realistic, irregular Pierson-Moskowitz sea states demonstrates that the performance of the RL agents is very close to that of the BSB optimal control baseline. Monte Carlo simulations show that both the DDPG and SAC agents can even outperform the BSB when the model assumed by the BSB differs from the simulation environment.
- Research Article
- 10.1088/1361-6560/ae3b02
- Jan 20, 2026
- Physics in medicine and biology
- Han Gyu Kang + 3 more
For rodent brain PET imaging, spatial resolution is the most important factor for identifying small brain structures. Previously, we developed a submillimeter resolution PET scanner with 1 mm crystal pitch using 3-layer depth-of-interaction (DOI) detectors. However, the spatial resolution was over 0.5 mm due to a relatively large crystal pitch and an unoptimized crystal layer design. Here we use GATE Monte Carlo simulations to design and optimize a sub-0.5 mm resolution PET scanner with 3-layer DOI detectors. 
Methods: The proposed PET scanner has 2 rings, each of which has 16 DOI detectors, resulting in a 23.4 mm axial coverage. Each DOI detector has 3-layer LYSO crystal arrays with a 0.8 mm crystal pitch. We employed GATE Monte Carlo simulations to optimize three crystal layer designs, A (4+4+7 mm), B (3+4+4 mm), and C (3+3+5 mm). Spatial resolution and imaging performance were evaluated with a point source and resolution phantom using analytical and iterative algorithms. 
Main Results: Among the three designs, design C provided the most uniform spatial resolution up to the radial offset of 15 mm. The 0.45 mm diameter rod structures were resolved clearly with design C using the iterative algorithm. The GATE simulation results agreed with the experimental data in terms of radial resolution except at the radial offset of 15 mm.
Significance: We optimized the crystal layer design of the mouse brain PET scanner with GATE simulations, thereby achieving sub-0.5 mm resolution in the resolution phantom study.
- Research Article
- 10.1111/bmsp.70028
- Jan 20, 2026
- The British journal of mathematical and statistical psychology
- Chen-Wei Liu
Hidden Markov diagnostic classification models capture how students' cognitive attributes evolve over time. This paper introduces a Bayesian Markov chain Monte Carlo algorithm for diagnostic classification models that jointly estimates time-varying Q matrices, latent attributes, item parameters, attribute class proportions and transition matrices across multiple occasions. Using the R package hmdcm developed for this study, Monte Carlo simulations demonstrate accurate parameter recovery, and an empirical probability-concept assessment confirmed the algorithm's ability to trace attribute trajectories, supporting its value for longitudinal diagnostic classification in both research and instructional practice.
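As a generic illustration of the MCMC machinery involved (not the hmdcm algorithm itself), the sketch below uses a random-walk Metropolis sampler to recover the transition probabilities of a simulated two-state attribute chain:

```python
import numpy as np

rng = np.random.default_rng(8)

# Generic Metropolis sketch: recover the transition probabilities of a
# 2-state latent-attribute Markov chain from a simulated state sequence.
true_p01, true_p10 = 0.3, 0.1              # P(0 -> 1), P(1 -> 0)
states = [0]
for _ in range(500):
    p_next1 = true_p01 if states[-1] == 0 else 1 - true_p10
    states.append(int(rng.random() < p_next1))
states = np.array(states)

# sufficient statistics: transition counts
n01 = np.sum((states[:-1] == 0) & (states[1:] == 1))
n00 = np.sum((states[:-1] == 0) & (states[1:] == 0))
n10 = np.sum((states[:-1] == 1) & (states[1:] == 0))
n11 = np.sum((states[:-1] == 1) & (states[1:] == 1))

def log_lik(p01, p10):
    return (n01 * np.log(p01) + n00 * np.log(1 - p01)
            + n10 * np.log(p10) + n11 * np.log(1 - p10))

draws, theta = [], np.array([0.5, 0.5])
for _ in range(20_000):
    prop = np.clip(theta + rng.normal(0, 0.05, 2), 1e-4, 1 - 1e-4)
    if np.log(rng.random()) < log_lik(*prop) - log_lik(*theta):
        theta = prop                       # accept the proposal
    draws.append(theta)

post = np.array(draws[5000:])              # discard burn-in
print("posterior means:", post.mean(axis=0), "true:", (true_p01, true_p10))
```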
- Research Article
- 10.1038/s42004-025-01837-z
- Jan 20, 2026
- Communications chemistry
- Quentin Peter + 7 more
Alzheimer's disease (AD) is marked by the abnormal aggregation of amyloid-beta peptides within the central nervous system. The formation of amyloid fibrils from amyloid-beta peptides is a hallmark of AD. Here, we demonstrate that the aggregation of amyloid-beta 42 spreads both spatially and temporally. By measuring the spatial propagation of amyloid-beta in macroscopic capillaries and performing Monte Carlo simulations, we show that this spreading occurs through a diffusion mechanism involving oligomers in solution. These species, catalytically produced through spontaneous secondary nucleation, significantly accelerate the propagation velocity of the reaction wavefront. Our findings suggest that, in addition to their potential role in toxicity, these oligomers in solution are key drivers of the spatial spreading of aggregation and can therefore be considered key targets for therapeutic intervention.
- Research Article
- 10.70609/g-tech.v10i1.9016
- Jan 20, 2026
- G-Tech: Jurnal Teknologi Terapan
- Kaka Davi Dharmawan + 5 more
Gambler's Ruin Problem (GRP) is a basic stochastic model used to analyze the probability of a player reaching a target wealth versus losing all capital through sequential random games. This study has two main objectives: to validate the Monte Carlo simulation model against established theoretical results and to conduct a comprehensive sensitivity analysis of the probability of bankruptcy and game duration with respect to the probability of winning, the initial capital, and the target capital. The simulation model was developed in Python as a one-dimensional random walk, with a large number of replications used for initial validation. The results show a high degree of conformity, with an empirical simulation probability of 0.5004 compared to a theoretical value of 0.5000, in accordance with the Law of Large Numbers. Sensitivity analysis shows that a small unfavourable deviation in the probability of winning drastically increases the probability of bankruptcy to over 88%. Furthermore, the average game duration peaks in the fair scenario at 2,502 steps and decreases significantly under biased conditions. This study confirms the effectiveness of the Monte Carlo method in measuring the impact of the "house advantage" and provides counterintuitive insights into the dynamics of stochastic games.
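A minimal version of such a validation run, using the classical closed-form ruin probability as the benchmark; the initial capital of 50 and target of 100 are assumptions chosen to be consistent with the fair-scenario figures quoted above.

```python
import random

random.seed(0)

# Monte Carlo estimate of the ruin probability, checked against the
# classical closed form. Initial capital 50 and target 100 are assumptions.
p, capital, target, reps = 0.5, 50, 100, 10_000

def ruined():
    x = capital
    while 0 < x < target:
        x += 1 if random.random() < p else -1   # one round of the game
    return x == 0

est = sum(ruined() for _ in range(reps)) / reps

if p == 0.5:
    theory = 1 - capital / target               # fair-game formula
else:
    r = (1 - p) / p
    theory = (r ** capital - r ** target) / (1 - r ** target)

print(f"simulated ruin probability {est:.4f} vs theoretical {theory:.4f}")
```

Re-running with p slightly below 0.5 reproduces the qualitative sensitivity the abstract describes: the ruin probability climbs steeply once the game is biased against the player.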
- Research Article
- 10.3390/axioms15010071
- Jan 20, 2026
- Axioms
- Konrad Kułakowski + 1 more
Group decision-making sometimes involves evaluating the decision-makers themselves, e.g., selecting the best expert or assigning rewards within a team. In such cases, all participants should be involved, but their influence should reflect their competence or contribution. This article proposes two new opinion aggregation models in which a person's assessment weight depends on their ranking, preventing low-performing members from exerting undue influence while promoting respected experts. The proposed aggregation methods uphold the principle of distributive justice by ensuring that individual contributions are proportional to the rewards they receive. In addition to formulating the new aggregation methods, we present several of their formal properties and indicate practical ways to compute the results. For one of the methods, which is more challenging to compute, we conducted a Monte Carlo experiment demonstrating the practical feasibility of computing the aggregated weight vector.
- Research Article
- 10.1080/02664763.2026.2616862
- Jan 20, 2026
- Journal of Applied Statistics
- Germán Ibacache-Pulgar + 3 more
In recent years, semi-parametric modeling has proven successful in describing phenomena that require modeling non-linear structures using both parametric and nonparametric components. Partially linear varying coefficient models have emerged as a valuable alternative for modeling nonlinear interactions between a response variable and a set of covariates across diverse phenomena. In this work, we propose a novel statistical model based on the reparameterized Birnbaum-Saunders distribution, in which the systematic component allows the regression coefficients to vary smoothly with respect to certain covariates. To obtain the maximum penalized likelihood estimates of the model parameters, we propose Fisher scoring and weighted backfitting algorithms based on linear spline smoothing. We conduct residual and local influence analyses to assess the potential impact of individual observations on the model fit. Finally, we present a simulation study based on Monte Carlo experiments to evaluate the maximum penalized likelihood estimators, and provide an application of the proposed model to a real-world air pollution dataset, demonstrating its effectiveness in practice. The model has been fully implemented in the R programming language.
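The general shape of such a Monte Carlo evaluation of an estimator, reduced to an ordinary linear model for brevity (the paper's Birnbaum-Saunders varying-coefficient setting is far richer):

```python
import numpy as np

rng = np.random.default_rng(21)

# Skeleton of a Monte Carlo evaluation of an estimator: simulate data from
# a known truth, re-estimate many times, and report bias and RMSE.
beta_true, n, reps = 2.0, 200, 1000
estimates = np.empty(reps)

for r in range(reps):
    x = rng.normal(size=n)
    y = beta_true * x + rng.normal(size=n)   # generate one synthetic dataset
    estimates[r] = (x @ y) / (x @ x)         # OLS slope, the MLE here

print(f"bias = {estimates.mean() - beta_true:+.4f}, "
      f"RMSE = {np.sqrt(np.mean((estimates - beta_true) ** 2)):.4f}")
```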
- Research Article
- 10.3847/1538-3881/ae2ea6
- Jan 19, 2026
- The Astronomical Journal
- Anton Pomazan + 5 more
Abstract 3I/ATLAS is the third interstellar object (ISO) identified to date. Unlike other known ISOs, 3I/ATLAS travels on a nearly ecliptic, retrograde orbit (i ≈ 175°). This trajectory, combined with a small perihelion distance (q ≈ 1.36 au), directs it through the densely populated regions of the main-belt asteroids (MBAs) and near-Earth asteroids (NEAs), moving counter to their general prograde motion. This work focused on an investigation of close encounters and an assessment of collision probabilities between 3I/ATLAS and the populations of MBAs/NEAs. An N-body numerical integration was performed to identify close approaches within ≤0.03 au over the time range from 2025 August 1 to 2026 April 1 (outside this range, 3I/ATLAS's heliocentric distance exceeds Jupiter's orbit). Monte Carlo (MC) simulations were used to estimate collision probabilities, accounting for the orbital uncertainties of the asteroids with the smallest approach distances. The search identified 736 MBAs and 31 NEAs that will have a close approach with 3I/ATLAS at a physical distance of ≤0.03 au. While no direct collisions are predicted based on nominal orbits, our analysis highlights the case of MBA 2020 BG107. Due to its high orbital uncertainty, its 3σ positional ellipsoid at the time of the encounter is larger than the nominal approach distance to 3I. The collision probability is estimated at 0.025% within our MC simulations. The results can be extrapolated to solar system comets from the Oort cloud on parabolic or hyperbolic orbits, as dynamically (except for the velocity) they are similar to interstellar objects.
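The MC step can be pictured as follows: sample the asteroid's position at encounter from its uncertainty ellipsoid and count the samples falling within the combined radius. All numbers below are invented for illustration; they are not the 2020 BG107 covariance.

```python
import numpy as np

rng = np.random.default_rng(31)

# Sample the asteroid's encounter position from an (assumed diagonal)
# uncertainty ellipsoid and count samples inside the combined radius.
# Every number is invented for illustration.
nominal_miss = np.array([300.0, 0.0, 0.0])   # nominal miss vector, km
sigma = np.array([200.0, 100.0, 100.0])      # positional 1-sigma, km
r_combined = 10.0                            # sum of the two body radii, km
n = 2_000_000

samples = nominal_miss + rng.normal(0.0, 1.0, (n, 3)) * sigma
p_hit = np.mean(np.linalg.norm(samples, axis=1) < r_combined)
print(f"Monte Carlo collision probability ~ {p_hit:.1e}")
```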
- Research Article
- 10.1007/s00234-025-03893-7
- Jan 19, 2026
- Neuroradiology
- Daniel Rosok + 13 more
In emergency diagnostics, head CT and CT angiography (CTA) of craniocervical vasculature are indispensable for children, despite their increased radiation sensitivity. This study assesses the radiation dose metrics of head CT and CTA in pediatric patients managed in the trauma resuscitation unit (TRU). All patients aged 0-<15 years who underwent head CT and CTA in the TRU between April 2020 and August 2023 were included. CT dose index volume (CTDIvol) and dose-length product (DLP) were extracted from the Radimetrics Enterprise Platform, which also provided organ doses estimated via Monte Carlo simulations and effective doses (ED) derived from these estimates. Dose metrics were compared with national diagnostic reference levels (DRLs), defined for three pediatric age groups: I (0-<5 years), II (5-<10 years), and III (10-<15 years). Of 212 pediatric TRU patients, 62.7% (133/212) underwent CT and 72.2% (96/133) received combined CT and CTA. Median CTDIvol and DLP increased with age, whereas ED decreased. For head CT, CTDIvol ranged from 18.9 mGy to 29.4 mGy, DLP from 282 to 460 mGy·cm, and ED from 1.6 to 1.3 mSv. For CTA, CTDIvol ranged from 1.4 to 2.2 mGy, DLP from 40 to 83 mGy·cm, and ED from 1.0 to 0.8 mSv. All doses remained below national DRLs. Head CT and CTA in pediatric trauma can be performed with radiation doses well below national DRLs. Careful dose management is important to reduce potential long-term cancer risks and deterministic effects such as lens cataract formation.
- Research Article
- 10.1080/15440478.2026.2615641
- Jan 19, 2026
- Journal of Natural Fibers
- Berfin Gül + 3 more
ABSTRACT As global climate commitments intensify, the textile industry faces growing pressure to quantify and reduce greenhouse gas (GHG) emissions. This study presents a carbon footprint assessment of yarn manufacturing at Ulusoy Textile, following ISO 14064 and an extended five-scope approach: Scope 1 – direct emissions, Scope 2 – indirect emissions from purchased energy, Scope 3 – indirect emissions from transportation, Scope 4 – emissions from products used by the organization, and Scope 5 – emissions and removals from product use. Using 2023 data on raw material procurement, energy consumption, logistics, and facility activities, total emissions were 30,146.80 tCO2e, equivalent to 40.19 tCO2e per employee and 0.003 tCO2e per ton of yarn. Energy use and raw material sourcing were the main contributors, while efficiency measures and renewable energy reduced emissions, and product use provided balancing effects. Uncertainty was assessed via Monte Carlo simulations, and materiality analysis identified key parameters for inventory robustness and verification. These findings establish a verification-ready baseline for Ulusoy Textile’s decarbonization strategy and propose a scalable ISO 14064-aligned framework for yarn producers seeking EU Green Deal and Carbon Border Adjustment Mechanism compliance, supporting innovations in circularity, energy efficiency, and net-zero pathways.
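A schematic of Monte Carlo uncertainty propagation for an emissions inventory: draw activity data and emission factors from assumed distributions and aggregate. Source names, values, and uncertainties below are placeholders, not Ulusoy Textile's data.

```python
import numpy as np

rng = np.random.default_rng(14)

# Monte Carlo uncertainty propagation for a GHG inventory (schematic):
# activity data x emission factor per source, each with assumed relative
# uncertainties; totals aggregated over 100,000 draws.
n = 100_000
sources = {                      # (mean activity, mean EF tCO2e/unit, rel. sd)
    "electricity_MWh": (40_000.0, 0.45, 0.05),
    "natural_gas_MWh": (15_000.0, 0.20, 0.08),
    "road_freight_tkm": (2_000_000.0, 0.0001, 0.15),
}

total = np.zeros(n)
for activity, ef, rel_sd in sources.values():
    a = rng.normal(activity, rel_sd * activity, n)   # sampled activity data
    f = rng.normal(ef, rel_sd * ef, n)               # sampled emission factor
    total += a * f

lo, hi = np.percentile(total, [2.5, 97.5])
print(f"total: {total.mean():,.0f} tCO2e (95% CI {lo:,.0f} - {hi:,.0f})")
```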
- Research Article
- 10.1111/2041-210x.70240
- Jan 19, 2026
- Methods in Ecology and Evolution
- Shu Xie + 2 more
Abstract State‐dependent speciation and extinction (SSE) models are a popular framework for quantifying whether species traits have an impact on evolutionary rates and how this shapes the variation in species richness among clades in a phylogeny. However, SSE models are becoming increasingly complex, limiting the application of likelihood‐based inference methods. Approximate Bayesian computation (ABC), a likelihood‐free approach, is a potentially powerful alternative for estimating parameters. Here, we develop an ABC framework to estimate state‐dependent speciation, extinction and transition rates from phylogenetic trees in BiSSE (binary state-dependent speciation and extinction), GeoSSE (geographic state-dependent speciation and extinction) and MuSSE (multiple state-dependent speciation and extinction) models. Using different sets of candidate summary statistics, we then compare the inference ability of ABC with that of likelihood‐based maximum likelihood (ML) and Markov chain Monte Carlo (MCMC) methods to identify the combinations that best capture the complex relationships between rates of diversification and species traits. Our results show the ABC algorithm can accurately estimate state‐dependent diversification rates for most of the model parameter sets we explored. The inference error of the parameters associated with the species‐poor state is larger with ABC than in the likelihood estimations only when the speciation rate (λ) is highly asymmetric between states in all three models. We suggest that the combination of normalized lineage‐through‐time (nLTT) statistics and phylogenetic signal constitutes efficient summary statistics for the ABC method. By providing an efficient algorithm and a set of suitable summary statistics, our work aims to contribute to the use of the ABC approach in the development of complex SSE models, for which a likelihood is not available.
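A rejection-ABC sketch of the idea, reduced to estimating a single birth rate of a Yule process from a lineage-count summary statistic (the paper's framework uses richer trees, nLTT statistics, and SSE models; all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(17)

# Rejection ABC: draw a rate from the prior, simulate, and keep draws whose
# summary statistic falls within a tolerance of the observed one.
t_max, lam_true = 5.0, 0.4

def simulate_n_lineages(lam):
    n, t = 1, 0.0
    while True:
        t += rng.exponential(1.0 / (lam * n))   # waiting time to next speciation
        if t > t_max:
            return n
        n += 1

obs = simulate_n_lineages(lam_true)             # "observed" data

accepted = []
for _ in range(50_000):
    lam = rng.uniform(0.05, 1.0)                # draw from the prior
    if abs(simulate_n_lineages(lam) - obs) <= 1:    # tolerance on the summary
        accepted.append(lam)

print(f"observed lineages: {obs}, posterior mean lambda: {np.mean(accepted):.3f}")
```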