Probabilistic coastal risk mapping under sea-level rise: A Monte Carlo framework for dynamic exposure hotspots.

  • Abstract
  • Literature Map
  • References
  • Similar Papers
Abstract

Probabilistic coastal risk mapping under sea-level rise: A Monte Carlo framework for dynamic exposure hotspots.

References (showing 10 of 116 papers)
  • Cited by 28
  • 10.1007/s12594-018-1072-x
Coastal Morphology and Long-term Shoreline Changes along the Southwest Coast of India
  • Nov 1, 2018
  • Journal of the Geological Society of India
  • L Sheela Nair + 3 more

  • Cited by 30
  • 10.1029/2023ef003713
Enabling Climate Change Adaptation in Coastal Systems: A Systematic Literature Review
  • Aug 1, 2023
  • Earth's Future
  • David Cabana + 3 more

  • Open Access
  • Cited by 45
  • 10.1038/s41598-022-15237-z
Integrated socio-environmental vulnerability assessment of coastal hazards using data-driven and multi-criteria analysis approaches
  • Jul 8, 2022
  • Scientific Reports
  • Ahad Hasan Tanim + 2 more

  • Cited by 1
  • 10.1007/s12237-024-01363-6
Enhancing Assessments of Coastal Wetland Migration Potential with Sea-level Rise: Accounting for Uncertainty in Elevation Data, Tidal Data, and Future Water Levels
  • Jun 3, 2024
  • Estuaries and Coasts
  • Nicholas M Enwright + 9 more

  • Cited by 8
  • 10.1016/j.ijdrr.2021.102130
Evaluating social vulnerability of people inhabiting a tropical coast in Kerala, south west coast of India
  • Feb 17, 2021
  • International Journal of Disaster Risk Reduction
  • J Shaji

  • Open Access
  • Cited by 14
  • 10.1016/j.gloenvcha.2024.102885
Defining and conceptualizing equity and justice in climate adaptation
  • Jul 1, 2024
  • Global Environmental Change
  • S.E Walker + 8 more

  • Cited by 60
  • 10.1016/j.ijdrr.2021.102183
Livelihood vulnerability and adaptability of coastal communities to extreme drought and salinity intrusion in the Vietnamese Mekong Delta
  • Mar 20, 2021
  • International Journal of Disaster Risk Reduction
  • Dung Duc Tran + 4 more

  • Open Access
  • Cited by 19
  • 10.1016/j.marpol.2019.02.028
Bridging climate science, law, and policy to advance coastal adaptation planning
  • Mar 13, 2019
  • Marine Policy
  • J Reiblich + 4 more

  • Cited by 6
  • 10.1080/01490419.2023.2285944
Coastal vulnerability assessment along the coast of Kerala, India, based on physical, geological, and socio-economic parameters
  • Dec 11, 2023
  • Marine Geodesy
  • Saikrishnan K + 2 more

  • Cited by 31
  • 10.1016/j.ocecoaman.2023.106487
Nature as a solution for shoreline protection against coastal risks associated with ongoing sea-level rise
  • Jan 19, 2023
  • Ocean & Coastal Management
  • Stella Manes + 7 more

Similar Papers
  • PDF available
  • Research Article
  • Cited by 74
  • 10.1007/s00477-016-1377-5
Dealing with hurricane surge flooding in a changing environment: part I. Risk assessment considering storm climatology change, sea level rise, and coastal development
  • Feb 21, 2017
  • Stochastic Environmental Research and Risk Assessment
  • Ning Lin + 1 more

Coastal flood risk will likely increase in the future due to urban development, sea-level rise, and potential change of storm surge climatology, but the latter has seldom been considered in flood risk analysis. We propose an integrated dynamic risk analysis for flooding task (iDraft) framework to assess coastal flood risk at regional scales, considering integrated dynamic effects of storm climatology change, sea-level rise, and coastal development. The framework is composed of two components: a modeling scheme to collect and combine necessary physical information and a formal, Poisson-based theoretical scheme to derive various risk measures of interest. Time-varying risk metrics such as the return period of various damage levels and the mean and variance of annual damage are derived analytically. The mean of the present value of future losses (PVL) is also obtained analytically in three ways. Monte Carlo (MC) methods are then developed to estimate these risk metrics and also the probability distribution of PVL. The analytical and MC methods are theoretically and numerically consistent. A case study is performed for New York City (NYC). It is found that the impact of population growth and coastal development on future flood risk is relatively small for NYC, sea-level rise will significantly increase the damage risk, and storm climatology change can also increase the risk and uncertainty. The joint effect of all three dynamic factors is possibly a dramatic increase of the risk over the twenty-first century and a significant shift of the probability distribution of the PVL towards high values. In a companion paper (Part II), we extend the iDraft to perform probabilistic benefit-cost analysis for various flood mitigation strategies proposed for NYC to avert the potential impact of climate change.
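
For intuition, the Poisson-arrival Monte Carlo structure described above can be sketched in a few lines of Python. The example below is purely illustrative (the storm rate, Gumbel surge distribution, depth-damage curve, flood threshold, and discount rate are assumptions, not the iDraft model or the NYC data): it draws Poisson storm counts each year, shifts surge heights by a linear sea-level-rise trend, and discounts annual damages to build up a distribution of the present value of losses (PVL).

```python
# Toy Monte Carlo sketch of Poisson storm arrivals + sea-level rise (illustrative only).
import numpy as np

rng = np.random.default_rng(42)

def damage(depth_m):
    """Assumed saturating depth-damage curve, in $M (not from the paper)."""
    return 500.0 * (1.0 - np.exp(-np.maximum(depth_m, 0.0) / 1.5))

def simulate_pvl(n_sims=5_000, years=80, storm_rate=0.3,
                 slr_per_year=0.01, flood_threshold=1.2, discount=0.03):
    """Monte Carlo estimate of the present value of future flood losses (PVL)."""
    pvl = np.zeros(n_sims)
    for t in range(years):
        slr = slr_per_year * t                       # deterministic sea-level rise trend
        n_storms = rng.poisson(storm_rate, n_sims)   # Poisson storm arrivals this year
        annual = np.zeros(n_sims)
        for i in np.nonzero(n_storms)[0]:
            surge = rng.gumbel(loc=1.0, scale=0.5, size=n_storms[i])
            annual[i] = damage(surge + slr - flood_threshold).sum()
        pvl += annual / (1.0 + discount) ** t        # discount to present value
    return pvl

pvl = simulate_pvl()
print(f"mean PVL ~ {pvl.mean():.0f} $M, 95th percentile ~ {np.percentile(pvl, 95):.0f} $M")
```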

  • Research Article
  • Cited by 7
  • 10.1108/jcp-02-2015-0008
An introduction to Monte Carlo simulations in criminal psychology: applications in evaluating biased estimators for recidivism
  • May 5, 2015
  • Journal of Criminal Psychology
  • Priscillia Hunt + 1 more

Purpose – Studies in criminal psychology are inevitably undertaken in a context of uncertainty. One class of methods addressing such uncertainties is Monte Carlo (MC) simulation. The purpose of this paper is to provide an introduction to MC simulation for representing uncertainty, focusing on likely uses in studies of criminology and psychology. In addition to describing the method and providing a step-by-step guide to implementing an MC simulation, this paper provides examples using the Fragile Families and Child Wellbeing Survey data. Results show MC simulations can be a useful technique to test biased estimators and to evaluate the effect of bias on power for statistical tests. Design/methodology/approach – After describing MC simulation methods in detail, this paper provides a step-by-step guide to conducting a simulation. Then, a series of examples are provided. First, the authors present a brief example of how to generate data using MC simulation and the implications of alternative probability distribution assumptions. The second example uses actual data to evaluate the impact that omitted variable bias can have on least squares estimators. A third example evaluates the impact this form of heteroskedasticity can have on the power of statistical tests. Findings – This study shows MC-simulated variable means are very similar to the actual data, but the standard deviations are considerably smaller in MC simulation-generated data. Using actual data on criminal convictions and income of fathers, the authors demonstrate the impact of omitted variable bias on the standard errors of the least squares estimator. Lastly, the authors show the p-values are systematically larger and the rejection frequencies correspondingly smaller in heteroskedastic error models compared to a model with homoskedastic errors. Originality/value – The aim of this paper is to provide a better understanding of what MC simulation methods are and what can be achieved with them. A key value of this paper is that the authors focus on understanding the concepts of MC simulation for researchers of statistics and psychology in particular. Furthermore, the authors provide a step-by-step description of the MC simulation approach and provide examples using real survey data on criminal convictions and economic characteristics of fathers in large US cities.
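
As a companion to the step-by-step guide described above, the following sketch (synthetic data only, not the Fragile Families and Child Wellbeing Survey) runs a small Monte Carlo experiment showing how omitting a correlated regressor biases the ordinary least squares slope estimate.

```python
# Small Monte Carlo study of omitted-variable bias in OLS (synthetic data, illustrative).
import numpy as np

rng = np.random.default_rng(0)
n, reps, beta1, beta2 = 500, 2_000, 1.0, 0.8

est_full, est_omitted = [], []
for _ in range(reps):
    x1 = rng.normal(size=n)
    x2 = 0.6 * x1 + rng.normal(scale=0.8, size=n)    # regressor correlated with x1
    y = beta1 * x1 + beta2 * x2 + rng.normal(size=n)
    X_full = np.column_stack([np.ones(n), x1, x2])
    X_omit = np.column_stack([np.ones(n), x1])        # x2 omitted from the model
    est_full.append(np.linalg.lstsq(X_full, y, rcond=None)[0][1])
    est_omitted.append(np.linalg.lstsq(X_omit, y, rcond=None)[0][1])

print(f"true beta1 = {beta1}")
print(f"mean estimate, full model:    {np.mean(est_full):.3f}")
print(f"mean estimate, omitted model: {np.mean(est_omitted):.3f}  (biased upward)")
```

With these assumed coefficients the bias is roughly beta2 times the regression coefficient of x2 on x1 (about +0.48), which the simulated mean of the misspecified estimator reproduces.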

  • PDF available
  • Research Article
  • Cited by 20
  • 10.1016/j.cie.2023.109261
Explainable machine learning for project management control
  • Apr 23, 2023
  • Computers & Industrial Engineering
  • José Ignacio Santos + 3 more

Project control is a crucial phase within project management aimed at ensuring —in an integrated manner— that the project objectives are met according to plan. Earned Value Management —along with its various refinements— is the most popular and widespread method for top-down project control. For project control under uncertainty, Monte Carlo simulation and statistical/machine learning models extend the earned value framework by allowing the analysis of deviations, expected times and costs during project progress. Recent advances in explainable machine learning, in particular attribution methods based on Shapley values, can be used to link project control to activity properties, facilitating the interpretation of interrelations between activity characteristics and control objectives. This work proposes a new methodology that adds an explainability layer based on SHAP —Shapley Additive exPlanations— to different machine learning models fitted to Monte Carlo simulations of the project network during tracking control points. Specifically, our method allows for both prospective and retrospective analyses, which have different utilities: forward analysis helps to identify key relationships between the different tasks and the desired outcomes, thus being useful to make execution/replanning decisions; and backward analysis serves to identify the causes of project status during project progress. Furthermore, this method is general, model-agnostic and provides quantifiable and easily interpretable information, hence constituting a valuable tool for project control in uncertain environments.
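
A minimal sketch of this kind of pipeline is shown below, assuming scikit-learn and the shap package are available; the three-activity network, triangular duration distributions, and random-forest surrogate are illustrative choices, not the authors' project data or fitted models.

```python
# Monte Carlo simulate a tiny project network, fit a surrogate model to the simulated
# completion times, and attribute predicted duration to activities with SHAP (illustrative).
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 5_000
# Three activities: A and B run in parallel, C follows the later of the two (assumed network).
dur = {
    "A": rng.triangular(2, 4, 9, n),
    "B": rng.triangular(3, 5, 7, n),
    "C": rng.triangular(1, 2, 4, n),
}
X = np.column_stack(list(dur.values()))
y = np.maximum(dur["A"], dur["B"]) + dur["C"]        # project completion time per scenario

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
shap_values = shap.TreeExplainer(model).shap_values(X[:500])

for name, mean_abs in zip(dur, np.abs(shap_values).mean(axis=0)):
    print(f"activity {name}: mean |SHAP| contribution = {mean_abs:.3f}")
```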

  • Research Article
  • Cited by 1
  • 10.7498/aps.65.077702
Numerical extraction of electric field distribution from thermal pulse method based on Monte Carlo simulation
  • Jan 1, 2016
  • Acta Physica Sinica
  • Liang Ming-Hui + 3 more

The thermal-pulse method is a powerful tool for measuring space charge distributions in polymer films. Its data analysis involves a Fredholm integral equation of the first kind, which requires an appropriate numerical procedure to obtain a solution. Various numerical techniques, including scale transformation and regularization methods, have been proposed. Of these, scale transformation (ST) is the simplest and most widely used, but it provides high spatial resolution only near the sample surface. The Monte Carlo (MC) method is one of the recently proposed ways to solve the equation numerically and has been successfully applied to the analysis of laser intensity modulation method data, which also involves a Fredholm integral equation of the first kind. In this paper we analyze thermal-pulse data in the frequency domain with the MC method and discuss its effectiveness based on numerical simulations. The simulation results indicate that electric field profiles can be effectively extracted by the MC method. The profiles computed by the MC method agree well with the assumed distributions over the entire thickness of the sample, whereas the profiles reconstructed by the ST method fit the assumed one very well near the target surface but distort sharply along the direction of thermal-pulse propagation into the sample bulk. On the other hand, oscillations in the MC results can degrade its accuracy. The influence of noise level on the MC-based analysis is also tested using simulated data: the computed profiles become more oscillatory as the noise level increases. This problem can be mitigated by selecting a larger tolerance during the singular value decomposition, so the tolerance is one of the key parameters of the algorithm, and it is hard to determine. Additionally, experimental data obtained from a polypropylene film under an applied electric field are analyzed to illustrate the feasibility of applying the MC method to thermal-pulse measurements. The results also show that the spatial accuracy of the MC method over the entire sample thickness is higher than that of the ST method, which confirms that the MC method is more suitable for detecting the electric field distribution deep in the bulk of the sample. Owing to noise and error, the accuracy of the MC calculation depends on the chosen tolerance value, which remains an obstacle to applying the method to practical thermal-pulse measurements.
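
As a generic illustration of why the SVD tolerance matters (this is not the authors' Monte Carlo algorithm), the sketch below discretizes a first-kind Fredholm equation g = K f with an assumed Gaussian kernel and shows how the truncation tolerance trades noise amplification against over-smoothing.

```python
# Truncated-SVD inversion of an ill-posed, discretized Fredholm equation (illustrative).
import numpy as np

rng = np.random.default_rng(2)
n = 200
x = np.linspace(0.0, 1.0, n)
# Assumed Gaussian smoothing kernel K(x, s) and a "true" profile f (both illustrative).
K = np.exp(-((x[:, None] - x[None, :]) ** 2) / (2 * 0.05**2)) / n
f_true = np.sin(2 * np.pi * x) + 0.5 * np.sin(6 * np.pi * x) + 0.3 * np.sin(14 * np.pi * x)
g = K @ f_true + rng.normal(scale=1e-6, size=n)       # noisy "measurement"

def tsvd_solve(K, g, tol):
    """Truncated-SVD pseudo-inverse: keep singular values above tol * s_max."""
    U, s, Vt = np.linalg.svd(K)
    keep = s > tol * s.max()
    return Vt[keep].T @ ((U[:, keep].T @ g) / s[keep])

for tol in (1e-1, 1e-3, 1e-6):
    err = np.linalg.norm(tsvd_solve(K, g, tol) - f_true) / np.linalg.norm(f_true)
    # large tol -> over-smoothed profile; tiny tol -> noise-dominated oscillations
    print(f"tolerance {tol:7.0e}: relative reconstruction error {err:.2f}")
```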

  • Book Chapter
  • Cited by 1
  • 10.5772/15933
Monte Carlo Simulation for Magnetic Domain Structure and Hysteresis Properties
  • Feb 28, 2011
  • Katsuhiko Yamaguchi + 2 more

Recently, many simulation studies of magnetization processes in micro-magnetic clusters have been performed using several calculation methods. Such studies are expected to help realize high-density magnetic memories and new micro-magnetic devices, and to support microscopic analysis for magnetic non-destructive evaluation. The Monte Carlo (MC) method is a useful and powerful way to simulate magnetization processes in magnetic clusters with complicated interactions, such as different exchange interactions between different elements, and to capture temperature-dependent magnetic properties. Applying the MC method to magnetization-process simulation raises some problems. One is that the MC method originally deals with stable states, so the time evolution of an MC simulation cannot usually be interpreted as real time, e.g. for hysteresis (M-H) curves obtained while increasing and decreasing the applied magnetic field. A pseudo-dynamic MC process is therefore introduced for such simulations in Section 2. Another problem is that MC calculations for large clusters demand enormous CPU time, because each Monte Carlo step (MCS) must be repeated N times for a cluster of N cells; in particular, the magnetic dipole interaction in the Hamiltonian must be evaluated among all spins in the cluster. A parallelized MC technique for dealing with larger clusters is therefore introduced in Section 3. Useful calculation results obtained with these MC methods are presented in the following sections. Section 4 describes the formation of magnetic domains and domain walls (DWs) in clusters whose spins are subject to exchange interaction, magnetic dipole interaction, and crystal anisotropy. Section 5 shows magnetic domain wall displacements (DWDs) for nanowires with local magnetic impurities. Section 6 presents M-H curves for magnetic clusters with a local magnetic distribution corresponding to the grain boundaries of a Ni-based alloy. For the elementary theory of the MC method, the reader is referred to the previous chapter.
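
A minimal sketch of the pseudo-dynamic idea is given below using a plain 2D Ising model (nearest-neighbour exchange plus applied field only; the chapter's cluster Hamiltonian with dipole interactions and anisotropy is not reproduced): because only a limited number of Metropolis attempts are run at each field value, the magnetization lags the swept field and traces an M-H hysteresis loop.

```python
# Pseudo-dynamic field sweep of a 2D Ising model with Metropolis updates (illustrative).
import numpy as np

rng = np.random.default_rng(3)
L, T, J = 24, 1.5, 1.0                      # lattice size, temperature, exchange constant
spins = rng.choice([-1, 1], size=(L, L))

def metropolis_attempts(spins, H, n_attempts):
    """Single-spin Metropolis updates for an Ising model in an applied field H."""
    for _ in range(n_attempts):
        i, j = rng.integers(L, size=2)
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2.0 * spins[i, j] * (J * nb + H)    # energy change if spin (i, j) flips
        if dE <= 0.0 or rng.random() < np.exp(-dE / T):
            spins[i, j] *= -1

# Limited relaxation at each field value, so the magnetization lags the field.
fields = np.concatenate([np.linspace(-2, 2, 41), np.linspace(2, -2, 41)])
loop = []
for H in fields:
    metropolis_attempts(spins, H, n_attempts=10 * L * L)
    loop.append((H, spins.mean()))
print(loop[:5])   # (field, magnetization) pairs along the ascending branch
```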

  • Research Article
  • 10.1158/1538-7445.sabcs21-p3-19-10
Abstract P3-19-10: Subcutaneous layer dosimetry of the breast and chest wall at clinical beam energies without bolus: A Monte Carlo and analytical anisotropic algorithm (AAA) calculation study
  • Feb 15, 2022
  • Cancer Research
  • Dylan Narinesingh + 3 more

PURPOSE: Breast skin thickness is 3 mm (1). The subcutaneous layer (SL) lies immediately beneath the skin and is at risk for local recurrence after breast cancer surgery and adjuvant radiotherapy. As the SL lies within the buildup region for megavoltage radiation treatment, placement of bolus for patients receiving chest wall (CW) radiotherapy (RT) is routine at many centers. The dermal toxicity of bolus is well known: in 12 studies of CW RT, the pooled risk of Grade 3 acute toxicity is 9.6% with bolus and 1.2% without bolus (2). Meanwhile, bolus is rarely used after breast-conserving surgery (BCS) RT for patients with similar cancers. This study examines variation in tangential RT dose coverage of the SL in the intact breast and bolus-free CW with clinically relevant photon beam energies using Monte Carlo (MC) calculations and a commercial treatment planning system (TPS). METHODS: Thirty CT datasets from patients without skin involvement were identified. There were two groups of patients: 15 treated with BCS RT and 15 treated with CW RT. In each group, 5-patient subgroups had tangent-beam RT planned without bolus with 6, 10, and 15 MV photons, respectively, using the Analytical Anisotropic Algorithm (AAA, v.13.6.23) in Eclipse v.15.6 (Varian Medical Systems Inc., Palo Alto, CA). On each CT, the SL was segmented as a high-resolution shell from 3 to 5 mm below the body contour in the Eclipse TPS v.15.6 (Varian Medical Systems Inc., Palo Alto, CA). A 1x1x1 mm MC dose simulation was performed using EGSnrc code (BEAMnrc/DOSXYZnrc). The MC dose distributions were imported back into the TPS for comparison with AAA calculations. The V95% and V90% for the SL were calculated for each case and the mean V95% and V90% were reported for each subgroup. A t-test was used with a two-sided alpha = 0.05 for statistical analysis. RESULTS: The mean separation increased with use of higher energies for both BCS and CW RT. The MC-calculated mean SL V90% and V95% were higher for CW RT than for BCS RT at each energy. The V90% coverage was 91.5% for CW and 74.4% for BCS at 6 MV (p<0.001), 89.3% for CW and 61.3% for BCS at 10 MV (p<0.001), and 87.1% for CW and 60.9% for BCS at 15 MV (p<0.001) (Table 1). For SL V95%, the CW coverage was higher than the BCS coverage at every energy. For SL V90% at 6 MV, the AAA and MC calculations agreed within 2.5%, with the MC being slightly higher. The agreement between AAA and MC decreased for higher energies, with MC reporting higher SL V90% coverage by up to 16.3%. The higher MC-calculated dose to the SL is consistent with the literature (3). CONCLUSION: MC and AAA SL dose calculations agreed well for 6 MV, but AAA underestimated the dose for 10 and 15 MV. For 6-15 MV photons, the MC-calculated dosimetric coverage of the SL is higher for CW RT than BCS RT. Since radiation oncologists are satisfied with the SL coverage by BCS RT, bolus is not needed for CW RT, because, without bolus, CW RT delivers a higher SL dose than BCS RT. REFERENCES: 1. Pope TL Jr, et al. J Can Assoc Radiol. 1984 Dec;35(4):365-8. PMID: 6526847. 2. Dahn HM, et al. Crit Rev Oncol Hematol. 2021 Jun 5;163:10339. PMID: 34102286. 3. Panettieri V, et al. Radiother Oncol 2009; 93: 94-101.

Table 1. Monte Carlo calculated mean V95% and V90% for breast and chest wall at each energy:

Energy (MV)   Chest wall V95% (mean)   Breast V95% (mean)   p-value   Chest wall V90% (mean)   Breast V90% (mean)   p-value
6             61.9%                    35.1%                <0.001    91.5%                    74.4%                <0.001
10            65.2%                    39.7%                <0.001    89.3%                    61.3%                <0.001
15            61.7%                    38.9%                0.012     87.1%                    60.9%                <0.001

Citation Format: Dylan Narinesingh, Alan Nichol, Alanah Bergman, Tony Popescu. Subcutaneous layer dosimetry of the breast and chest wall at clinical beam energies without bolus: A Monte Carlo and analytical anisotropic algorithm (AAA) calculation study [abstract]. In: Proceedings of the 2021 San Antonio Breast Cancer Symposium; 2021 Dec 7-10; San Antonio, TX. Philadelphia (PA): AACR; Cancer Res 2022;82(4 Suppl):Abstract nr P3-19-10.
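
For reference, the V95% and V90% metrics reported above are simply the percentage of the structure's voxels receiving at least 95% or 90% of the prescription dose; the snippet below illustrates the calculation on a synthetic dose sample (assumed 50 Gy prescription, not the study's EGSnrc/Eclipse data).

```python
# Compute Vxx% coverage metrics for a structure from a set of voxel doses (synthetic data).
import numpy as np

rng = np.random.default_rng(6)
prescription = 50.0                                         # Gy, assumed for illustration
sl_dose = rng.normal(loc=48.0, scale=3.0, size=10_000)      # synthetic subcutaneous-layer doses

for pct in (95, 90):
    coverage = 100.0 * np.mean(sl_dose >= pct / 100.0 * prescription)
    print(f"V{pct}% = {coverage:.1f}% of the structure volume")
```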

  • Research Article
  • 10.3389/conf.fnhum.2018.227.00139
Quantitative evaluation of functional Near Infrared Spectroscopy measurements with different source-detector separations using Monte Carlo simulation
  • Jan 1, 2018
  • Frontiers in Human Neuroscience
  • Lei Wang + 2 more

  • Book Chapter
  • Cited by 1
  • 10.5772/14942
Monte Carlo Simulations of Grain Growth in Metals
  • Feb 28, 2011
  • Sven K.

The application of the Monte Carlo (MC) method to simulate grain growth in metals originates from Potts' model for magnetic domain evolution (Potts, 1952), which generalized the two-state spin-up/spin-down ferromagnetic Ising model to systems with arbitrary spin degeneracy. Subsequently, the so-called n-fold method for expediting simulations of the time evolution of systems was developed (Bortz et al., 1975). Anderson and his co-workers were the first to introduce the Potts model into grain growth simulations, applying this method to model grain growth kinetics (Anderson et al., 1984), grain size distribution and topology (Srolovitz et al., 1984a), the influence of particle dispersions (Srolovitz et al., 1984b), anisotropic grain boundary energies (Grest et al., 1985) as well as abnormal grain growth (Srolovitz et al., 1985; Rollett et al., 1989; Rollett & Mullins, 1996). By incorporating specific elements corresponding to various microstructural processes into the basic algorithm, the MC method has been adapted to model, for instance, grain growth in two-phase materials (Holm et al., 1993) and composites (Miodownik et al., 2000), abnormal grain growth (Lee et al., 2000; Messina et al., 2001; Ivasishin et al., 2004), static recrystallization (Srolovitz et al., 1986; Srolovitz et al., 1988; Rollett et al., 1992a; Rollett & Raabe, 2001; Song & Rettenmayr, 2002), dynamic recrystallization (Peczak, 1995; Rollett et al., 1992b) and sintering (Hassold et al., 1990; Chen et al., 1990; Matsubara, 1999), and it has been demonstrated that such MC simulations are capable of reproducing the essential features of these microstructural phenomena. Nowadays, the MC method is often preferred to deterministic methods such as cellular automata (Geiger et al., 2001) and phase-field models (Tikare et al., 1998) at the mesoscopic level, mainly due to its inherent simplicity and flexibility. More recently, the MC method has also been employed to predict final microstructures in engineering applications (Yang et al., 2000; Yu & Esche, 2005). For quite some time, numerous efforts geared toward improving the accuracy and efficiency of the conventional MC method have been reported in the literature (Radhakrishnan & Zacharia, 1995; Song & Liu, 1998; Yu & Esche, 2003a), aiming to provide the foundation for the application of the MC method in engineering practice. Various modifications of the conventional Monte Carlo (CMC) algorithm have been reported. For instance, an increase in processing speed of up to two orders of magnitude compared with the CMC algorithm was achieved in grain growth simulations by employing a modified MC algorithm (Yu & Esche, 2003a). Furthermore, this modified algorithm also led to improved accuracy of the predicted grain growth exponent in the kinetic equations, particularly for small grain sizes.
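
A minimal sketch of the basic Q-state Potts grain-growth algorithm described above is shown below (conventional Metropolis updates on a square lattice; the accelerated n-fold and parallel variants discussed in the chapter are not implemented): sites adopt new orientations with a probability that favours reducing the number of unlike-neighbour bonds, so the grain-boundary density drops as the microstructure coarsens.

```python
# Q-state Potts model grain-growth sketch with conventional Metropolis updates (illustrative).
import numpy as np

rng = np.random.default_rng(4)
L, Q, kT = 48, 32, 0.5                      # lattice size, orientation states, temperature
grid = rng.integers(Q, size=(L, L))         # random initial grain orientations

def unlike_neighbors(grid, i, j, state):
    """Number of nearest neighbours whose orientation differs from `state`."""
    nb = (grid[(i + 1) % L, j], grid[(i - 1) % L, j],
          grid[i, (j + 1) % L], grid[i, (j - 1) % L])
    return sum(s != state for s in nb)

def attempt_flip(grid):
    """One Metropolis attempt: propose a new orientation for a randomly chosen site."""
    i, j = rng.integers(L, size=2)
    new_state = rng.integers(Q)
    dE = unlike_neighbors(grid, i, j, new_state) - unlike_neighbors(grid, i, j, grid[i, j])
    if dE <= 0 or rng.random() < np.exp(-dE / kT):
        grid[i, j] = new_state

def boundary_fraction(grid):
    """Fraction of nearest-neighbour bonds that cross a grain boundary."""
    return 0.5 * (np.mean(grid != np.roll(grid, 1, axis=0))
                  + np.mean(grid != np.roll(grid, 1, axis=1)))

print(f"boundary fraction before: {boundary_fraction(grid):.3f}")
for _ in range(100 * L * L):                # roughly 100 Monte Carlo steps per site
    attempt_flip(grid)
print(f"boundary fraction after:  {boundary_fraction(grid):.3f}")
```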

  • Research Article
  • Cited by 1
  • 10.1016/j.engstruct.2024.118787
Prediction and analysis of damage to RC columns under close-in blast loads based on machine learning and Monte Carlo method
  • Aug 15, 2024
  • Engineering Structures
  • Dingkun Yang + 2 more

  • Research Article
  • Cited by 3
  • 10.1016/j.ijrobp.2007.06.036
Evaluation of Uncertainty-Based Stopping Criteria for Monte Carlo Calculations of Intensity-Modulated Radiotherapy and Arc Therapy Patient Dose Distributions
  • Sep 14, 2007
  • International Journal of Radiation Oncology*Biology*Physics
  • Barbara Vanderstraeten + 5 more

  • Research Article
  • 10.1118/1.4814955
SU-E-T-525: Developing a GYN Cs-Selectron Brachytherapy Treatment Planning Software Accounting for Inter-Source, Applicator and Heterogeneity Effects
  • Jun 1, 2013
  • Medical Physics
  • H Safigholi + 9 more

Purpose: In this project, new treatment planning software based on TG-43U1 and Monte Carlo (MC) simulations was developed for the treatment of GYN cancers. The software accounts for the inter-source effect, applicator attenuation, and tissue heterogeneity for a Cs-137 Selectron machine. Methods: MC-derived linear TG-43U1 functions were obtained for combinations of eight active/inactive pellets with a 2 cm physical length filling the tandem and ovoid. The full lengths of the tandem and ovoid were filled with three and one linear sources, respectively. With this approach, the inter-source effect and applicator attenuation are embedded in the TG-43U1 functions. To account for heterogeneity in the software design, a library of dose inhomogeneity correction factors (DICFs) was simulated for air and aluminum (as bone) at various thicknesses and positions with respect to one active Cs pellet. A non-homogeneous GYN polyethylene phantom was designed, containing a cylinder of air as the rectum and two symmetric aluminum spheres as femoral heads. GYN reference dosimetry points for TLD detectors were machined into the phantom according to the ICRU-38 report. Finally, reference-point dose rates were compared using the new software, STPS software, TLD measurements, and full MC simulations. Results: The new software agrees well with TLD and MC simulations (within 5%). However, STPS results at most reference points are lower than the TLD and MC values (by up to 3%) because applicator and inter-source attenuation are not considered. At the tips of the tandem and ovoid, STPS values exceed TLD and MC by about 26% and 17%, respectively. Moreover, the DICF data achieve an accuracy of 2% relative to TLD and MC measurements. Conclusion: This project developed, for the first time, 3D GYN Cs-Selectron treatment planning software (TPS) based on linear TG-43U1 functions and libraries of MC simulations that account for the inter-source effect, applicator attenuation, and heterogeneity effects.
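
For context, the TG-43U1 dose-rate formalism that the linear functions above are built on is (standard AAPM notation):

```latex
% AAPM TG-43U1 2D line-source dose-rate formalism
\dot{D}(r,\theta) \;=\; S_K \,\Lambda\,
  \frac{G_L(r,\theta)}{G_L(r_0,\theta_0)}\, g_L(r)\, F(r,\theta),
\qquad r_0 = 1~\mathrm{cm},\quad \theta_0 = \pi/2
```

Here S_K is the air-kerma strength, Λ the dose-rate constant, G_L the line-source geometry function, g_L(r) the radial dose function, and F(r,θ) the 2D anisotropy function; per the abstract, the inter-source and applicator attenuation effects are folded into these tabulated functions.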

  • Research Article
  • Cited by 6
  • 10.13182/nse12-25
Uncertainty Quantification of Few-Group Diffusion Theory Constants Generated by the B1 Theory-Augmented Monte Carlo Method
  • Sep 1, 2013
  • Nuclear Science and Engineering
  • Ho Jin Park + 3 more

The B1 theory-augmented Monte Carlo (MC) method has been recently presented as a new MC method to generate homogenized few-group diffusion theory constants (FGCs) of nuclear systems such as a fuel pin cell or a fuel assembly (FA). It is demonstrated that it can produce FGCs that are well qualified for highly accurate two-step core neutronics analyses. However, it is unavoidable for FGCs from it to carry uncertainties that are ascribed to statistical uncertainties, as well as nuclear cross-section and nuclide number density input data uncertainties, of MC calculations pivotal in the new MC method. In order to evaluate the impact of these uncertainties of FGCs on the core neutronics design applications, therefore, it becomes essential to present their uncertainties quantitatively in addition to FGCs themselves. The purpose of this paper is to develop a mathematical formulation for separately quantifying contributions of statistical and input data uncertainties to uncertainties of FGCs from the new MC method and to illustrate its applications for computing uncertainties of the burnup-dependent FGCs. To do so, the basic mathematical equations linking input uncertainties to output uncertainties are established in terms of an arbitrary single-step computational problem that requires either a MC or a deterministic method calculation. It is shown that repeated applications of the basic equations stepwise from steps 1 through 5 of the new MC method at the very beginning of the preset burnup intervals lead to a desired formulation that can not only quantify uncertainties of the burnup-dependent FGCs but also separately identify individual contributions of uncertain sources to them. The formulation is incorporated into the Seoul National University MC code McCARD. It is then used to compute the uncertainties of the burnup-dependent homogenized two-group constants of a low-enriched UO2 fuel pin cell and a pressurized water reactor FA on the assumption that nuclear cross-section input data of 235U and 238U have uncertainties as reflected in covariance files of the JENDL 3.3 library. The effects of the cross-section input data uncertainties of the two U isotopes on the uncertainties of two-group constants and on those of neutron multiplication factors of the UO2 pin cell and the FA are quantified. The utilities of uncertainty quantifications are then discussed from the standpoint of evaluation of feasibility of nuclear design results of new reactor systems and improvement of the nuclear data including covariance files of the evaluated nuclear data libraries.
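
As a generic illustration of the kind of first-order propagation the paper formalizes (not its exact McCARD formulation), the variance of a few-group constant y computed from uncertain inputs x_i can be split into an input-data term and a Monte Carlo statistical term:

```latex
% Generic first-order uncertainty propagation (illustrative, not the paper's exact equations)
\sigma^2[y] \;\approx\;
  \sum_{i,j}\frac{\partial y}{\partial x_i}\,
  \operatorname{cov}(x_i, x_j)\,
  \frac{\partial y}{\partial x_j}
  \;+\; \sigma^2_{\mathrm{stat}}[y]
```

where the x_i are uncertain inputs such as cross sections and nuclide number densities, and σ²_stat[y] is the statistical variance of the MC estimate itself.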

  • Research Article
  • 10.1149/ma2023-02401962mtgabs
Molecular Dynamics Analysis of the Scattering Phenomena of Oxygen Molecules on an Ionomer Surface in Catalyst Layer of Fuel Cell
  • Dec 22, 2023
  • Electrochemical Society Meeting Abstracts
  • Keisuke Mizuki + 3 more

As energy demand increases and global warming progresses, expectations for fuel cells, which generate electricity through chemical reactions between hydrogen and oxygen, are rising. There are various types of fuel cells, which are classified according to the electrolyte. Polymer electrolyte fuel cells (PEFCs) are expected to be used in stationary power sources for home use and fuel cell vehicles because of their low operating temperature, short start-up time, and ease of miniaturization. The cost has been an issue in the widespread use of fuel cells. The amount of platinum used in the catalyst layer (CL) must be reduced to reduce costs. However, it increases the current density per unit area of platinum. There are three factors that contribute to voltage drop: ohmic polarization, activation polarization, and diffusion polarization. At high current densities, the voltage drop due to diffusion polarization is dominant. The main cause of diffusion polarization is the transport resistance of oxygen molecules in the CL, and improving the oxygen transport properties in the CL will lead to lower cost PEFCs. In this study, we analyzed the transport properties of oxygen molecules in the CLs of PEFCs using the Monte Carlo (MC) method. The effect of oxygen scattering on the overall oxygen transport properties was investigated by considering the behavior on the surface of the ionomer film (surface diffusion) into the MC method, which simulates the overall transport in the PEFC CL. The objective of this study is to reproduce accurate oxygen transport by comparing the results of MC and molecular dynamics (MD) methods. First, we used the MC method to analyze oxygen scattering phenomena. In this study, we used the 3D structure data of CL in the MC simulations. The three-dimensional structure of the sample was reconstructed from sequential slice images. Molecules were assumed to reflect specularly on the boundaries of the simulation domain. Oxygen molecules were randomly placed in the simulation system. The time step was set at 5 ps and the simulation time was set at 4 μs. In order to examine the effect of surface diffusion on the transport properties of oxygen molecules, the simulation was performed using various conditions of surface diffusion. We calculated the effective diffusion coefficient of oxygen in the CL as the transport property of oxygen molecules in the MC method. As the surface diffusion coefficient increases, the effective diffusion coefficient of oxygen increases. When the surface diffusion coefficient is small, the effective diffusion coefficient of oxygen increases due to a decrease in the time constant, which determines the probability distribution of the surface residence time. However, when the surface diffusion coefficient is large, a different trend emerges. As the time constant increases, the effective diffusion coefficient has a peak value and then decreases. Next, to verify the model of surface diffusion in the MC method, we analyzed the scattering phenomenon of oxygen molecules in a simulation system that simulates the pores in the CL using the MD method. We modeled the nano pores in the CL as a slit between the ionomer walls in the MD method. Nafion with the equivalent weight (EW) of 1146 was used as polymer model. The number of Nafion chains is set at 4, and the water content λ is set at 7. Three-dimensional periodic boundary conditions were applied to the simulation domain. After the system was equilibrated, 1 ns of NVT simulation was performed for production. 
The temperature was set at 297 K. The behavior of oxygen molecules colliding with the ionomer wall was analyzed using the MD method. The average residence time of surface diffusion on the ionomer thin film was on the order of 10⁻¹⁰ s. When the calculation results were compared with the MC results, it was found that the average residence time obtained by the MD method lay in the range where the surface diffusion coefficient strongly affects the effective diffusion coefficient. Comparing the results of the MD method with the surface diffusion model used in the MC method, there are some discrepancies between the results and the model. For example, the MC method used a model in which oxygen molecules reflect diffusively on the ionomer surface; however, in the MD method it was found that oxygen molecules tend to reflect in the direction of travel (Fig. 1). Therefore, a more accurate model for the MC method needs to be developed in the future.

  • Research Article
  • 10.1118/1.3612898
MO-B-224-01: Clinical Implementation and Application of Monte Carlo Methods in Photon and Electron Dose Calculation - New Issues to Consider in Clinical Practice
  • Jun 1, 2011
  • Medical Physics
  • N Tyagi + 1 more

Current advances in image guided radiation therapy have enabled very accurate delivery of radiation for the treatment of cancer. It then becomes more important that the radiation dose is calculated with utmost accuracy in patient geometry. For the past two decades Monte Carlo (MC) methods have been known to be the most accurate methods available for dose calculation. However, it has been only recently that MC based commercial treatment planning systems are becoming available for routine use in the clinic. Even now the radiation therapy community is struggling in adopting MC treatment planning systems (TPS) as the only TPS in the clinic. This is either because of the additional computational burden associated with MC dose calculation or due to the lack of information available on clinical benchmarked data where the effect of MC is most significant. Although the use of parallel processing and, more recently, GPU implementations of MC algorithms have overcome the computational burden, the need for clinical use and the significance of MC methods are still topics of debate. The main goal of this educational session is to familiarize clinical medical physicists with the application and implementation of Monte Carlo in routine radiotherapy treatment planning, with emphasis on the clinical significance. The continuing education session will be divided into two sections in order to specifically address implementation issues related to Monte Carlo-based photon and electron treatment planning techniques. The following broad general topics will be addressed: 1. Introduction to Monte Carlo methods: their application to radiation treatment planning and the commercial availability of MC based TPS. 2. Commissioning and clinical implementation of MC based systems: beam data required for source modeling and commissioning of an MC based TPS; use of MC in IMRT optimization and VMAT for photon treatment planning. 3. Implementation, operational and physics related issues: issues related to beam modeling, inherent statistical uncertainty in MC dose calculations, importance of CT-to-material conversion, and dose reporting in terms of dose-to-medium versus dose-to-water. 4. Clinical significance of MC based treatment plans: dosimetric differences of MC dose calculations compared to kernel based methods for treatment sites such as head and neck, spine, breast and lung, and how they affect our current clinical practice in terms of new prescriptions and dose escalation. 5. MC as a QA tool? Use of the MC method as an independent and additional QA tool; effect of QA by calculating dose in the actual patient geometry rather than a water phantom; accounting for machine delivery uncertainty by reconstructing doses from machine log files, etc. Learning Objectives: 1. A review of MC methods and their application to radiation treatment planning for photon and electron beams. 2. To understand the clinical implementation and commissioning of MC based TPS and issues related to their routine clinical operation. 3. To understand the clinical significance of MC methods for different treatment sites and how they change our current clinical practice. 4. To understand the use of MC TPS as an additional and independent QA tool in routine clinical practice.

  • Research Article
  • Cited by 1
  • 10.1118/1.4735105
SU-E-T-49: Verification of the Monte Carlo Model in the BrainLAB IPlan System for Clinical Applications
  • Jun 1, 2012
  • Medical Physics
  • Z Wang + 4 more

Purpose: To verify the Monte Carlo (MC) model in the BrainLab iPlan treatment planning system (TPS) in homogeneous and inhomogeneous media for clinical applications and to evaluate the BL Imaging Couch Top (BICT) in the TPS. Methods: One 30×30×12 cm³ solid water phantom and one 30×30×18 cm³ inhomogeneous phantom made of solid water and cork board were CT scanned and the images were transferred to the iPlan TPS. Single-field plans (from 2×2 to 15×15 cm²) were calculated on the two phantoms for AP, PA and oblique gantry angles. An IMRT plan and a dynamic arc plan were also calculated. Each plan was calculated using both the Pencil Beam Convolution (PBC) and the MC models, and with insertion of the BICT. All plans were delivered at the Novalis TX and the isocenter doses were measured using the A-14 ion chamber. Results: For the homogeneous phantom, planned doses from the PBC and the MC models agree within 2% and all the planned doses agree with the measured doses within 2%. For the inhomogeneous phantom, the planned doses from the PBC and the MC models differ by 1.6% to 4.8% depending on the field size. The measurements agree with the MC plans within 1.2%. The differences between the measurements and the PBC plans vary from −1.3% to −4.0%. With the BICT included in the plans, the measured doses for the single PA fields differ from the MC planned doses by 0.5% to 3.7%. However, for the six-beam IMRT and the six dynamic arc plans, the measured total doses agree with the planned doses within 2%. Conclusions: This work verified that the MC model in the BrainLab iPlan TPS is ready for clinical applications in both homogeneous and inhomogeneous media. The couch model in the TPS was evaluated and is acceptable.

More from: The Science of the total environment
  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180845
Reduced-form air quality dispersion modeling for urban scale traffic-related pollutants.
  • Nov 7, 2025
  • The Science of the total environment
  • Sang-Jin Lee + 1 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180890
Impacts of extreme climate change on terrestrial ecosystem carbon storage in China.
  • Nov 7, 2025
  • The Science of the total environment
  • Junjie Jin + 7 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180899
Current status and future emission reduction pathways of NOx in China.
  • Nov 7, 2025
  • The Science of the total environment
  • Qian Wu + 9 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180866
Pesticide spray drift and risk assessment using unmanned aerial vehicle (UAV) sprayer and traditional electric knapsack sprayer (EKS).
  • Nov 7, 2025
  • The Science of the total environment
  • Xue Chen + 9 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180878
Cellular insights into reactive oxidative species (ROS) and bacterial stress responses induced by antimicrobial blue light (aBL) for inactivating antibiotic resistant bacteria (ARB) in wastewater.
  • Nov 7, 2025
  • The Science of the total environment
  • Xiaoyu Cong + 3 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180828
Integrated assessment of dissolved oxygen dynamics using optimized M-K trend detection and ridge regression in the Middle and Lower Yellow River Basin.
  • Nov 7, 2025
  • The Science of the total environment
  • Jiafang Wei + 3 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180902
Expanding protected areas can benefit mammalian and ecosystemic biodiversity.
  • Nov 7, 2025
  • The Science of the total environment
  • Emmanuel Paradis

  • New
  • Addendum
  • 10.1016/j.scitotenv.2025.180888
Corrigendum to "Earthworms mediate the influence of polyethylene (PE) and polylactic acid (PLA) microplastics on soil bacterial communities" [Sci. Total Environ., Volume 905, 20 December 2023, 166959]
  • Nov 7, 2025
  • The Science of the total environment
  • Siyuan Lu + 9 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180857
Spring awakening? Seasonal controls on halomethoxybenzenes in arctic waters.
  • Nov 7, 2025
  • The Science of the total environment
  • Danielle Haas Freeman + 4 more

  • New
  • Research Article
  • 10.1016/j.scitotenv.2025.180847
Discovery and characterization of novel polyacrylic urethane-degrading bacteria from intestine of the red-veined darter (Sympetrum fonscolombii).
  • Nov 7, 2025
  • The Science of the total environment
  • So-Hye Lee + 4 more
