An improved machine learning approach for predicting granular flows


Similar Papers
  • Research Article
  • Cited by 19
  • 10.1016/j.conbuildmat.2023.130321
Optimized machine learning approaches for identifying vertical temperature gradient on ballastless track in natural environments
  • Jan 16, 2023
  • Construction and Building Materials
  • Tao Shi + 1 more

  • Research Article
  • Cited by 6
  • 10.1016/j.oceaneng.2023.115222
Machine learning simulation of one-dimensional deterministic water wave propagation
  • Jul 6, 2023
  • Ocean Engineering
  • Mathies Wedler + 3 more

  • Research Article
  • Cited by 48
  • 10.1007/s40295-019-00158-3
Machine Learning Approach to Improve Satellite Orbit Prediction Accuracy Using Publicly Available Data
  • May 14, 2019
  • The Journal of the Astronautical Sciences
  • Hao Peng + 1 more

Efficient and high-precision orbit prediction is increasingly crucial for improved Space Situational Awareness. Due to the lack of required information such as space environment conditions and characteristics of Resident Space Objects (RSOs), satellite collisions have happened, partly because solely physics-based approaches can fail to achieve the accuracy required for collision avoidance. With the hypothesis that a Machine Learning (ML) approach can learn the underlying pattern of orbit prediction errors from historical data, this paper explores the Support Vector Machine (SVM) for improving orbit prediction accuracy. Two publicly available catalogs, the Two-Line Element (TLE) catalog and the International Laser Ranging Service (ILRS) catalog, are used to validate the proposed ML approach. The position and velocity components of 11 RSOs maintained in both catalogs are studied. Results of the study demonstrate that the designed dataset structure and SVM model can improve orbit prediction accuracy, with good performance in most cases. The performance on RSOs belonging to different orbit types is analyzed using different sizes of training and testing data. The results demonstrate the potential of the proposed ML approach to improve the accuracy of the TLE catalog.
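
The error-learning idea in this abstract can be sketched roughly as follows; the data, the error model, and the hyperparameters here are hypothetical stand-ins (assuming scikit-learn), not the paper's catalogs or configuration:

```python
# Illustrative sketch only: learn the pattern of orbit prediction errors
# from hypothetical historical data with an SVM regressor, then use it to
# correct a new physics-based prediction.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical history: propagation time (days) vs. along-track error (km)
t_hist = rng.uniform(0, 7, size=(200, 1))
err_hist = 0.5 * t_hist.ravel() ** 1.5 + rng.normal(0, 0.1, 200)

svm = SVR(kernel="rbf", C=10.0, epsilon=0.05).fit(t_hist, err_hist)

# Correct a hypothetical physics-based position prediction at t = 5 days
predicted_error = svm.predict([[5.0]])[0]        # km
physics_position = 7000.0                        # km, hypothetical
corrected_position = physics_position - predicted_error
print(f"learned error at t = 5 d: {predicted_error:.2f} km")
```

In the same spirit as the paper, the regressor only post-processes the physics-based output; the dynamics model itself is untouched.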

  • Research Article
  • Cited by 9
  • 10.3390/molecules27227853
Quantitative Analysis of Solar Photovoltaic Panel Performance with Size-Varied Dust Pollutants Deposition Using Different Machine Learning Approaches.
  • Nov 14, 2022
  • Molecules
  • Abhishek Kumar Tripathi + 6 more

In this paper, the impact of dust deposition on solar photovoltaic (PV) panels was examined using experimental and machine learning (ML) approaches for different sizes of dust pollutants. The experimental investigation was performed using five different sizes of dust pollutants with a deposition density of 33.48 g/m2 on the panel surface. It was noted that the zero-resistance current of the PV panel is reduced by up to 49.01% by small-size particles and by 15.68% by large-size particles (ranging from 600 µm to 850 µm). In addition, a significant reduction of nearly 40% in sunlight penetration into the PV panel surface was observed for smaller dust pollutants compared to larger ones. Subsequently, different ML regression models, namely support vector machine regression (SVMR), multiple linear regression (MLR) and Gaussian regression (GR), were considered and compared to predict the output power of solar PV panels under varied sizes of dust deposition. The outcomes of the ML approach showed that the SVMR algorithm provides optimal performance, with MAE, MSE and R2 values of 0.1589, 0.0328 and 0.9919, respectively, while GR had the worst performance. The predicted output power values are in good agreement with the experimental values, showing that the proposed ML approaches are suitable for predicting output power in harsh and dusty environments.
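
The model comparison described above can be mimicked with the following sketch; the features, the response surface, and the untuned model settings are hypothetical (assuming scikit-learn), chosen only to show the SVMR/MLR/GR comparison scored with MAE, MSE and R2:

```python
# Illustrative sketch only: compare three regressor families on
# hypothetical dust-deposition data, reporting MAE, MSE and R^2.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

rng = np.random.default_rng(1)

# Hypothetical features: [particle size (um), deposition density (g/m2)]
X = rng.uniform([100.0, 0.0], [850.0, 40.0], size=(120, 2))
# Hypothetical response: output power drops with density, more for fine dust
y = 50.0 - X[:, 1] * (900.0 - X[:, 0]) / 2000.0 + rng.normal(0, 0.5, 120)

models = {
    "SVMR": SVR(C=100.0),
    "MLR": LinearRegression(),
    "GR": GaussianProcessRegressor(normalize_y=True, alpha=1e-6, random_state=0),
}
results = {}
for name, model in models.items():
    pred = model.fit(X, y).predict(X)
    results[name] = (mean_absolute_error(y, pred),
                     mean_squared_error(y, pred), r2_score(y, pred))
    print(name, results[name])
```

On real data the ranking depends on tuning and on a held-out test split; the paper's reported ranking (SVMR best, GR worst) comes from its own experiments, not from defaults like these.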

  • Research Article
  • Cited by 4
  • 10.1029/2023ms004138
A Machine Learning Bias Correction on Large‐Scale Environment of High‐Impact Weather Systems in E3SM Atmosphere Model
  • Aug 1, 2024
  • Journal of Advances in Modeling Earth Systems
  • Shixuan Zhang + 6 more

Large‐scale dynamical and thermodynamical processes are common environmental drivers of high‐impact weather systems causing extreme weather events. However, such large‐scale environmental conditions often display systematic biases in climate simulations, posing challenges to evaluating high‐impact weather systems and extreme weather events. In this paper, a machine learning (ML) approach was employed to bias correct the large‐scale wind, temperature, and humidity simulated by the atmospheric component of the Energy Exascale Earth System Model (E3SM) at ∼1° resolution. The usefulness of the ML approach for extreme weather analysis was demonstrated with a focus on three high‐impact weather systems, including tropical cyclones (TCs), extratropical cyclones (ETCs), and atmospheric rivers (ARs). We show that the ML model can effectively reduce climate bias in large‐scale wind, temperature, and humidity while preserving their responses to imposed climate change perturbations. The bias correction is found to directly improve water vapor transport associated with ARs, and representations of thermodynamical flows associated with ETCs. When the bias‐corrected large‐scale winds are used to drive a synthetic TC track forecast model over the Atlantic basin, the resulting TC track density agrees better with that of the TC track model driven by observed winds. In addition, the ML model does not significantly interfere with the mean climate change signals of large‐scale storm environments, or with the occurrence and intensity of the three weather systems. This study suggests that the proposed ML approach can be used to improve the downscaling of extreme weather events by providing more realistic large‐scale storm environments simulated by low‐resolution climate models.
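
The general bias-correction pattern described above can be sketched in miniature; everything here is hypothetical (assuming scikit-learn) and far simpler than the E3SM setup, but it shows the core move of regressing the model-minus-reference error on the model state:

```python
# Illustrative sketch only: learn a state-dependent bias correction for a
# simulated large-scale wind field from hypothetical reference data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(7)

# Hypothetical reference state per grid column: [u-wind, temperature, humidity]
truth = np.column_stack([rng.normal(10, 5, 500),
                         rng.normal(280, 10, 500),
                         rng.uniform(0, 0.02, 500)])
# Hypothetical systematic model bias: winds too strong, more so where warm
model_state = truth.copy()
model_state[:, 0] += 1.0 + 0.1 * (truth[:, 1] - 280) + rng.normal(0, 0.2, 500)

# Regress the model-minus-reference wind error on the model state,
# then subtract the predicted error from the simulated wind
reg = RandomForestRegressor(random_state=0)
reg.fit(model_state, model_state[:, 0] - truth[:, 0])
corrected_u = model_state[:, 0] - reg.predict(model_state)

bias_before = np.mean(model_state[:, 0] - truth[:, 0])
bias_after = np.mean(corrected_u - truth[:, 0])
print(f"mean wind bias before: {bias_before:.3f}, after: {bias_after:.3f}")
```

The paper's key additional requirement, preserving the climate-change response while removing the bias, is a constraint this toy example does not address.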

  • Research Article
  • Cited by 21
  • 10.1016/j.engfracmech.2018.04.041
A machine learning approach for the identification of the Lattice Discrete Particle Model parameters
  • Apr 30, 2018
  • Engineering Fracture Mechanics
  • Mohammed Alnaggar + 1 more

  • Research Article
  • Cited by 57
  • 10.1109/ted.2019.2937786
Prediction of Process Variation Effect for Ultrascaled GAA Vertical FET Devices Using a Machine Learning Approach
  • Oct 1, 2019
  • IEEE Transactions on Electron Devices
  • Kyul Ko + 4 more

In this brief, we present an accurate and efficient machine learning (ML) approach that predicts variations in key electrical parameters from process variations (PVs) in ultrascaled gate-all-around (GAA) vertical FET (VFET) devices. 3-D stochastic TCAD simulation is the most powerful tool for analyzing PVs, but for ultrascaled devices its computation cost is too high because it requires simultaneous analysis of various factors. The proposed ML approach is a new method that predicts the effects of the variability sources of ultrascaled devices. It shows the same degree of accuracy as, and improved efficiency over, 3-D stochastic TCAD simulation. An artificial neural network (ANN)-based ML algorithm can make multi-input-multi-output (MIMO) predictions very effectively, and uses an internal algorithm structure improved relative to existing techniques to capture the effects of PVs accurately. The algorithm incurs approximately 16% of the computation cost of a 3-D stochastic TCAD simulation while predicting the effects of process variability sources with less than 1% error.

  • Research Article
  • Cited by 42
  • 10.1109/tvt.2020.3001340
Machine Learning Enabling Analog Beam Selection for Concurrent Transmissions in Millimeter-Wave V2V Communications
  • Aug 1, 2020
  • IEEE Transactions on Vehicular Technology
  • Yang Yang + 4 more

With the development of millimeter-wave (mmWave) technology and vehicle-to-vehicle (V2V) communications, mmWave vehicular ad hoc networks (VANETs) are envisioned to support a rapidly growing number of vehicles. Against this background, each V2V user (VUE) is expected to employ a large-scale array to form directional analog beams for improving spatial spectrum reuse, making it capable of receiving concurrent transmissions from multiple other VUEs. However, due to the high dynamics of V2V links, it can be challenging for each VUE to quickly select an effective analog beam. In this paper, we propose a machine learning (ML) approach to achieve efficient and fast analog beam selection for mmWave V2V communications. Specifically, we first derive the probabilities that multiple V2V transmitters (TXs) serve one VUE, to obtain the average sum rate (ASR) for mmWave V2V communications. On that basis, we develop an ML approach to maximize the ASR, whereby a support vector machine (SVM) classifier is utilized to optimize the analog beam selection. Besides, we further propose an iterative sequential minimal optimization training algorithm to train data samples of all V2V links, and the convergence of the proposed solution is also discussed. Finally, extensive sample training and simulations are carried out with Google TensorFlow. The results verify that our proposed ML approach is capable of achieving a higher ASR with substantially lower computational complexity than traditional solutions based on explicitly estimated channels.
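
Casting beam selection as classification, as this abstract does, can be sketched with a toy geometry; the features, sector labels, and classifier settings here are hypothetical (assuming scikit-learn), not the paper's SVM formulation or its custom training algorithm:

```python
# Illustrative sketch only: train an SVM classifier to map the relative
# position of a transmitting VUE to the best analog beam index, replacing
# an exhaustive beam sweep with a single classification.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(6)
n_beams = 8

# Hypothetical geometry: angle and distance to the transmitting VUE
angles = rng.uniform(-np.pi, np.pi, 1000)
dists = rng.uniform(5.0, 100.0, 1000)
X = np.column_stack([angles, dists])
# Label = index of the angular sector covering the TX (what a beam
# sweep would find); the classifier learns to skip the sweep
y = ((angles + np.pi) / (2 * np.pi) * n_beams).astype(int) % n_beams

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)).fit(X, y)
print("training accuracy:", clf.score(X, y))
```

The speed advantage claimed in the paper has the same shape: once trained, selecting a beam is one classifier evaluation rather than a sweep over all candidate beams under fast-changing V2V channels.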

  • Research Article
  • Cited by 1
  • 10.3390/fluids10020039
A Machine Learning Approach to Volume Tracking in Multiphase Flow Simulations
  • Feb 2, 2025
  • Fluids
  • Aaron Mak + 1 more

This work presents a machine learning (ML) approach to volume-tracking for computational simulations of multiphase flow. It is an alternative to a commonly used procedure in the volume-of-fluid (VOF) method for volume tracking, in which the phase interfaces are reconstructed for flux calculation followed by volume advection. Bypassing the computationally expensive steps of interface reconstruction and flux calculation, the proposed ML approach performs volume advection in a single step, directly predicting the volume fractions at the next time step. The proposed ML function is two-dimensional and has eleven inputs. It was trained using MATLAB’s (R2021a) Deep Learning Toolbox with a grid search method to find an optimal neural network configuration. The performance of the ML function is assessed using canonical test cases, including translation, rotation, and vortex tests. The errors in the volume fraction fields obtained by the ML function are compared with those of the VOF method. In ideal conditions, the ML function speeds up the computations four times compared to the VOF method. However, in terms of overall robustness and accuracy, the VOF method remains superior. This study demonstrates the potential of applying ML methods to multiphase flow simulations while highlighting areas for further improvement.
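
The eleven-input, single-step advection idea can be illustrated with a toy learning problem; the stencil layout, the uniform-flow training data, and the network configuration below are assumptions (assuming scikit-learn), not the paper's MATLAB setup:

```python
# Illustrative sketch only: a small network maps a 3x3 stencil of volume
# fractions plus the two local velocity components (11 inputs) to the
# cell's volume fraction at the next step, bypassing interface
# reconstruction and flux calculation.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Toy training data for pure x-translation at CFL number c = 0.25, where
# the exact update is: next = (1 - c) * center + c * left neighbour
c = 0.25
stencils = rng.uniform(0.0, 1.0, size=(500, 9))   # 3x3 stencil, row-major
velocities = np.tile([1.0, 0.0], (500, 1))        # (u, v), uniform flow
X = np.hstack([stencils, velocities])             # 11 inputs per cell
y = (1.0 - c) * stencils[:, 4] + c * stencils[:, 3]

net = MLPRegressor(hidden_layer_sizes=(32, 32), learning_rate_init=0.01,
                   max_iter=3000, tol=1e-7, random_state=0).fit(X, y)
print("training R^2:", net.score(X, y))
```

A real volume-tracking surrogate is trained on far richer flow fields (the paper uses translation, rotation, and vortex tests), but the input/output contract per cell is the same.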

  • Conference Article
  • Cited by 2
  • 10.1115/fedsm2021-65229
Machine Learning Approach to Predict Sand Transport in Horizontal and Inclined Flow
  • Aug 10, 2021
  • R E Vieira + 3 more

Model predictions are routinely used to help in the decision-making process. For instance, in the oil and gas industry, the accumulation of solid particles, such as sand, and the formation of a bed of solids at the bottom of the pipe can be consequential. Such accumulation may decrease the efficiency of the pipeline due to the increase in frictional pressure loss, increase the risk of pipeline damage due to erosion, or increase the possibility of corrosion damage under the bed of solids. In order to transport the solid particles in the pipe, the fluid velocity must exceed the critical velocity required for solid particle transport. Mechanistic models provide a reasonable estimate of the critical velocity needed to transport the particles. However, those models are commonly applicable only within their respective ranges of data fitting, and are limited by the applicability of the empirically based closure relations they contain. On the other hand, the accumulation of experimental data makes possible the application of data-driven methods for characterizing multiphase flow over a broader range of flow conditions. This paper presents a framework to predict the fluid velocity needed to transport solid particles in a pipeline via a machine learning (ML) approach. To prepare a dataset for training the ML models, critical velocity data are collected from available sources in the literature. To decrease the number of input parameters for the ML algorithms and to make the model applicable to different types of carrying fluids, a set of dimensionless variables is used. Three ML algorithms are applied to create the predictive models: Random Forest, Support Vector Machine, and Gradient Boosting. The fine-tuned models are compared using statistical analysis to identify the ones that provide the most accurate velocity predictions for different operating conditions. Moreover, the predictive abilities of the models are further validated by comparing their performance with different mechanistic models. The proposed ML approach demonstrates high accuracy in predicting critical velocity across a wide range of flow conditions and inclination angles.
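
The dimensionless-input, three-algorithm comparison can be sketched as follows; the dimensionless groups and the velocity correlation below are hypothetical placeholders (assuming scikit-learn), not the paper's dataset, and the models are left at untuned defaults:

```python
# Illustrative sketch only: fit the three algorithm families named above to
# hypothetical dimensionless critical-velocity data and compare them by
# cross-validated R^2.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n = 300

# Hypothetical dimensionless inputs: particle Reynolds number, solid-to-
# fluid density ratio, particle-to-pipe diameter ratio, inclination (rad)
X = np.column_stack([rng.uniform(1, 100, n), rng.uniform(1.5, 3.0, n),
                     rng.uniform(1e-3, 1e-2, n), rng.uniform(0.0, 0.5, n)])
# Hypothetical dimensionless critical velocity (Froude-number-like)
y = 0.5 * X[:, 1] * np.sqrt(X[:, 2]) * (1 + X[:, 3]) + 0.01 * np.log(X[:, 0])

scores = {}
for name, model in [("Random Forest", RandomForestRegressor(random_state=0)),
                    ("SVM", SVR()),
                    ("Gradient Boosting", GradientBoostingRegressor(random_state=0))]:
    scores[name] = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV R^2 = {scores[name]:.3f}")
```

Working in dimensionless groups, as the paper does, both shrinks the feature space and lets one model cover different carrying fluids; with untuned defaults the SVR score here mainly shows why the paper fine-tunes each model before comparing them.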

  • Preprint Article
  • Cited by 1
  • 10.5194/egusphere-egu2020-690
Are Machine Learning methods robust enough for hydrological modeling under changing conditions?
  • Jul 17, 2020
  • Carolina Natel De Moura + 3 more

The advancement of big data and increased computational power have contributed to an increased use of Machine Learning (ML) approaches in hydrological modelling. These approaches are powerful tools for modelling non-linear systems. However, the applicability of ML under non-stationary conditions needs to be studied further. As climate change will alter hydrological patterns, testing ML approaches under non-stationary conditions is essential. Here, we used the Differential Split-Sample Test (DSST) to test the climate transposability of ML approaches (e.g., calibrating in a wet period and validating in a dry one, and vice versa). We applied five ML approaches using daily precipitation and temperature as input for the prediction of daily discharge in six snow-dominated Swiss catchments. Lower and upper benchmarks were used to evaluate performance through a relative performance measure: the lower benchmark is the average of bucket-type HBV model runs from 1000 random parameter sets, and the upper benchmark is the automatically calibrated HBV model. Compared with the stationary condition, the models performed slightly worse under the non-stationary condition. The performance of simple ML approaches was poor under non-stationary conditions, with an underestimation of peak flows and a poor representation of the snow-melting period. On the other hand, a more complex (deep learning) ML approach, the Long Short-Term Memory (LSTM) network, performed well compared with the lower and upper benchmarks. This might be explained by the fact that the so-called memory cell allowed the network to simulate storage effects.
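
The memory cell credited above with capturing storage effects can be shown in a minimal numpy forward pass; the weights here are random and untrained, so this only illustrates the mechanism, not the study's model:

```python
# Minimal sketch of an LSTM step: the cell state c_t carries information
# across time steps, which is how an LSTM can mimic catchment storage
# (e.g. snow accumulating, then melting into discharge).
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates stacked as [input, forget, cell, output]."""
    z = W @ x + U @ h + b
    n = h.size
    i, f, g, o = z[:n], z[n:2*n], z[2*n:3*n], z[3*n:]
    i, f, o = 1/(1+np.exp(-i)), 1/(1+np.exp(-f)), 1/(1+np.exp(-o))
    g = np.tanh(g)
    c = f * c + i * g            # memory cell: forget old, add new
    h = o * np.tanh(c)
    return h, c

rng = np.random.default_rng(4)
n_in, n_hid = 2, 8               # inputs: [precipitation, temperature]
W = rng.normal(0, 0.5, (4 * n_hid, n_in))
U = rng.normal(0, 0.5, (4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)

h, c = np.zeros(n_hid), np.zeros(n_hid)
for x in rng.normal(size=(30, n_in)):    # 30 days of (precip, temp)
    h, c = lstm_step(x, h, c, W, U, b)
print("final hidden state norm:", np.linalg.norm(h))
```

Because the forget gate can hold `c` nearly unchanged for many steps, the state can integrate winter precipitation and release it later, which bucket-free feed-forward models cannot do.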

  • Research Article
  • Cited by 10
  • 10.1016/j.compstruct.2022.115335
Machine learning approach for delamination detection with feature missing and noise polluted vibration characteristics
  • Feb 12, 2022
  • Composite Structures
  • Yushu Li + 5 more

  • Research Article
  • Cited by 6
  • 10.1016/j.jval.2024.12.010
Do Machine Learning Approaches Perform Better Than Regression Models in Mapping Studies? A Systematic Review.
  • May 1, 2025
  • Value in Health: The Journal of the International Society for Pharmacoeconomics and Outcomes Research
  • Tianqi Hong + 4 more

  • Research Article
  • Cited by 40
  • 10.1016/j.engappai.2022.105439
Machine learning approach for truck-drones based last-mile delivery in the era of industry 4.0
  • Sep 20, 2022
  • Engineering Applications of Artificial Intelligence
  • Ali Arishi + 2 more

  • Research Article
  • Cited by 2
  • 10.3389/frai.2024.1412865
Predicting overall survival from tumor dynamics metrics using parametric statistical and machine learning models: application to patients with RET-altered solid tumors.
  • Jun 11, 2024
  • Frontiers in Artificial Intelligence
  • Erick Velasquez + 7 more

In oncology drug development, tumor dynamics modeling is widely applied to predict patients' overall survival (OS) via parametric models. However, the current modeling paradigm, which assumes a disease-specific link between tumor dynamics and survival, has its limitations. This is particularly evident in drug development scenarios where the clinical trial under consideration contains patients with tumor types for which there is little to no prior institutional data. In this work, we propose the use of a pan-indication solid tumor machine learning (ML) approach whereby all three tumor metrics (tumor shrinkage rate, tumor regrowth rate and time to tumor growth) are simultaneously used to predict patients' OS in a tumor type independent manner. We demonstrate the utility of this approach in a clinical trial of cancer patients treated with the tyrosine kinase inhibitor, pralsetinib. We compared the parametric and ML models and the results showed that the proposed ML approach is able to adequately predict patient OS across RET-altered solid tumors, including non-small cell lung cancer, medullary thyroid cancer as well as other solid tumors. While the findings of this study are promising, further research is needed for evaluating the generalizability of the ML model to other solid tumor types.
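
Using the three tumor metrics jointly as ML inputs can be sketched as below; the per-patient values, the survival relationship, and the choice of regressor are hypothetical (assuming scikit-learn), and censoring, which real survival modeling must handle, is ignored here:

```python
# Illustrative sketch only: predict overall survival from the three tumor
# dynamics metrics named above, in a tumor-type-independent way.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(8)
n = 200

# Hypothetical tumor dynamics metrics per patient
ks = rng.uniform(0.01, 0.10, n)    # tumor shrinkage rate (1/week)
kg = rng.uniform(0.001, 0.02, n)   # tumor regrowth rate (1/week)
ttg = rng.uniform(5.0, 60.0, n)    # time to tumor growth (weeks)
X = np.column_stack([ks, kg, ttg])
# Hypothetical survival: longer with fast shrinkage, slow regrowth, late TTG
os_months = 6.0 + 100.0 * ks - 300.0 * kg + 0.3 * ttg + rng.normal(0, 1, n)

model = GradientBoostingRegressor(random_state=0).fit(X, os_months)
print("training R^2:", model.score(X, os_months))
```

The contrast with the parametric paradigm is that no disease-specific link function is specified; the model learns the metrics-to-survival mapping directly, pooled across tumor types.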
