
Related Topics

  • Probability Of Estimation
  • Probability Estimates
  • Probability Estimator

Articles published on Probability estimation

5141 search results, sorted by recency
  • New
  • Research Article
  • 10.1016/j.cognition.2025.106338
Low-certainty modals not future tenses cause increased psychological discounting in English relative to Dutch.
  • Feb 1, 2026
  • Cognition
  • Cole Robertson + 5 more


  • New
  • Research Article
  • 10.1016/j.jag.2025.105010
An improved gap probability estimation method accounting for radiometric effects in airborne LiDAR intensity
  • Feb 1, 2026
  • International Journal of Applied Earth Observation and Geoinformation
  • Lijie Guo + 1 more


  • New
  • Research Article
  • 10.1038/s41598-026-36162-5
Machine learning prediction of food addiction in university students using demographic, anthropometric and personality traits.
  • Jan 30, 2026
  • Scientific reports
  • Ali Rahimnezhad + 5 more

People's eating habits are influenced by psychological, social, cultural, and behavioral factors. Research shows that certain personality types expose people to risky eating behaviors. Given the complexity of nutrition-related factors and the limitations of traditional statistical methods, new approaches such as artificial intelligence and machine learning can play an effective role in analyzing multidimensional data and identifying complex patterns. This cross-sectional pilot study aimed to predict food addiction among university students by integrating demographic, anthropometric and personality data with machine learning methods. The data consisted of 210 samples, which were first preprocessed to ensure data quality and integrity. Tomek Links and SMOTE techniques were used to correct class imbalance. Feature selection was performed using twelve different algorithms to identify the most important features related to food addiction prediction. Then, ten different machine learning models were implemented, including Logistic Regression (LR), K-Nearest Neighbors (KNN), Gaussian Naive Bayes (GNB), Support Vector Classifier (SVC) with probability estimation, Decision Tree (DT), Random Forest (RF), AdaBoost, Gradient Boosting Classifier (GBC), CatBoost and LightGBM. The models were trained on the training dataset and their performance was evaluated using the accuracy, precision, recall, F1-Score and AUC metrics on the test dataset. In addition, the SHAP (SHapley Additive exPlanations) method was used to analyze feature importance and interpret the models, determining the impact of each psychological and behavioral feature on the prediction of food addiction. The results showed that the more advanced models, especially ensemble methods such as Random Forest and CatBoost, have high power in identifying complex patterns and accurately predicting food addiction behaviors. SHAP analysis also showed that psychological characteristics such as feelings of worthlessness, impulsivity, anger, psychological distress, and rigid cognitive styles, together with weight, height, and body mass index (BMI), were among the most important factors affecting prediction. Although limitations such as the small sample size, the focus on a specific student population, and the use of self-report instruments reduce the generalizability of the results, the study is notable for combining psychological profiling with artificial-intelligence approaches for early identification of high-risk individuals. Overall, the integration of personality profiles with advanced computational models can form the basis for the development of artificial intelligence-based screening tools and targeted interventions to improve nutritional behaviors in young populations.
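
The class-rebalancing step mentioned in the abstract can be illustrated with a minimal, numpy-only sketch of SMOTE-style interpolation; the four toy samples and the neighbour count k are invented here, and this is not the authors' actual pipeline:

```python
import numpy as np

def smote_like(minority, n_new, k=3, rng=None):
    """Generate synthetic minority-class samples by interpolating
    between a random minority point and one of its k nearest
    minority neighbours (the core idea behind SMOTE)."""
    rng = np.random.default_rng(rng)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        x = minority[i]
        d = np.linalg.norm(minority - x, axis=1)   # distances to all points
        nn = np.argsort(d)[1:k + 1]                # k nearest, skipping self
        j = rng.choice(nn)
        lam = rng.random()                         # interpolation weight in [0, 1)
        out.append(x + lam * (minority[j] - x))
    return np.array(out)

# four toy minority samples in the unit square
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synth = smote_like(minority, n_new=6, rng=0)
print(synth.shape)  # (6, 2)
```

Because each synthetic point lies on a segment between two real minority points, the new samples stay inside the convex hull of the minority class.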

  • New
  • Research Article
  • 10.1186/s12913-026-14075-3
Early viability assessment of a Business-to-Consumer (B2C) model for digital diabetes screening in Switzerland.
  • Jan 28, 2026
  • BMC health services research
  • Wasu Mekniran + 1 more

Type 2 diabetes (T2D) represents a rapidly growing public health and economic burden. Although early intervention can reverse the progression of prediabetes, traditional risk screening remains underutilized. Digital biomarkers derived from smartphones and wearables offer scalable real-time detection, yet financial barriers constrain their integration into health systems. This study assesses the viability of a Business-to-Consumer (B2C) digital diabetes screening venture within the Swiss healthcare system. The Innovating in Healthcare Framework was applied to evaluate system alignment across six factors: structure, financing, public policy, technology, consumers, and accountability. Financial viability was modeled using Monte Carlo simulation for probabilistic breakeven estimation, together with one- and two-way sensitivity analyses for key funnel, price, and customer acquisition cost (CAC) variables. A discounted cash flow model assessed value creation. Sustainability was evaluated across four dimensions: revenue potential, cost efficiency, managerial scalability, and technological adaptability. The venture showed strong alignment with consumer readiness, technology, and accountability, but weak fit in financing, public policy, and system structure. Financial modeling indicated positive cash flow in Year 4, with a 57% probability of breakeven within seven years. At 2917 users in Year 7, cumulative cash flow was slightly below zero. Profitability becomes feasible only when price ≥ CHF 40, CAC ≤ CHF 200, and screening participation ≥ 10%. Against a priori thresholds, breakeven probability and NPV remain insufficient, while IRR often exceeds 20%. Long-term viability requires these conditions or a transition to reimbursed B2B pathways. A B2C model can reach financial viability under favorable price and acquisition thresholds, but its long-term sustainability ultimately depends on regulatory validation that enables reimbursement.
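
As a rough illustration of the kind of probabilistic breakeven estimation described above, the following toy Monte Carlo sketch computes the probability that cumulative cash flow turns positive within a horizon; every distribution and constant (price range, CAC range, growth, upfront cost) is an invented assumption, not the paper's calibrated model:

```python
import random

def breakeven_probability(n_sims=20_000, horizon=7, seed=1):
    """Toy Monte Carlo estimate of P(cumulative cash flow > 0 within
    `horizon` years).  All distributions and constants below are
    illustrative assumptions only."""
    random.seed(seed)
    hits = 0
    for _ in range(n_sims):
        price = random.uniform(30, 60)      # annual revenue per user, CHF
        cac = random.uniform(120, 300)      # cost to acquire one user, CHF
        growth = random.uniform(1.2, 1.8)   # yearly user growth factor
        users, cash = 100.0, -20_000.0      # seed users, upfront build cost
        for _ in range(horizon):
            new_users = users * (growth - 1.0)
            cash += users * price - new_users * cac
            users += new_users
            if cash > 0:
                hits += 1
                break
    return hits / n_sims

p = breakeven_probability()
print(round(p, 3))
```

Sensitivity analysis then amounts to rerunning the simulation while sweeping one or two of the input ranges and watching how the breakeven probability responds.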

  • New
  • Research Article
  • 10.71097/ijsat.v17.i1.10199
Risk Quantification Models for Enterprise Hardware Launches
  • Jan 22, 2026
  • International Journal on Science and Technology
  • Amit Jha

Enterprise hardware launches involve tightly coupled risks across design readiness, supplier performance, manufacturing yield, logistics, regulatory compliance, and market timing. These risks evolve dynamically across launch phases and often propagate across functional boundaries, limiting the effectiveness of traditional qualitative risk registers. This paper presents a quantitative risk modeling framework specifically designed for enterprise hardware launches. The framework combines probabilistic risk estimation, Bayesian dependency modeling, and expected loss analysis to quantify launch readiness across pre-launch, ramp, and general availability phases. Risk likelihood and impact are derived from empirical program data, supplier metrics, validation coverage, and schedule buffers. A composite launch risk index is calculated to support objective go or no-go decisions and mitigation prioritization. A representative enterprise hardware launch case study demonstrates improved early risk visibility, stronger executive decision support, and reduced late-stage disruptions compared to qualitative methods. The results show that quantitative risk aggregation enables more accurate forecasting and proactive intervention in complex hardware programs.

  • New
  • Research Article
  • 10.1093/gji/ggaf450
Robust probabilistic estimation of statistical variations in earthquake records: application to induced seismicity in western Canada
  • Jan 22, 2026
  • Geophysical Journal International
  • Jeremy M Gosselin + 3 more

Summary Accurate characterization of the magnitude-frequency distribution of seismicity, and its associated uncertainties, is essential for seismic hazard assessment. This distribution is commonly described by the Gutenberg–Richter (GR) relation, parameterized by the b-value, which has been identified as a potential proxy for investigating many spatiotemporally varying Earth phenomena. Estimating the spatiotemporal variability of b-values often requires windowing, forcing a trade-off between resolution and statistical reliability. New probabilistic methods circumvent this by inferring both the number and locations of change points directly from earthquake catalogs. Nevertheless, accurately determining the b-value remains difficult because the GR relation only holds over a limited range of magnitudes. This research develops a general statistical model to address several methodological challenges in estimating the magnitude-frequency distribution of observed seismicity, including variations in space or time. The approach simultaneously solves for the b-value and magnitude-range limits. This avoids potential bias due to inaccurate manual truncation of earthquake catalogs. The model considers the entire observed catalog and parameterizes the decay of the distribution at both low and high magnitudes. Consequently, robust uncertainties in estimated b-values reflect uncertainty in the range of magnitudes over which the GR relation is observed to be valid. Importantly, spatiotemporal variations in the parameters that define the magnitude range are considered to be independent from the b-value, as we assume the physical factors that influence the GR relation are independent of the factors that limit the observed earthquake catalog. We demonstrate this methodology through application to simulated and observed earthquake catalogs. 
In particular, the value of our approach is highlighted through application to observed records of induced seismicity associated with fluid-injection operations in western Canada. Our results demonstrate accurate b-value estimates and associated uncertainties. Furthermore, the additional parameters that define the magnitude range serve as proxies for other factors including seismic network performance, recording duration, potential geometric limitations on earthquake size, and potential injection characteristics (in induced seismicity cases). Our approach also allows for the investigation of how these other factors may vary in space/time. Results from this work contribute to rigorous propagation of accurate b-value estimates, including uncertainties, into subsequent analyses such as seismic hazard models and regulatory protocols that are applied to industrial activity.
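
For readers unfamiliar with b-value estimation, the standard maximum-likelihood estimator of Aki (1965), which the probabilistic methods above generalize, can be sketched as follows; the synthetic catalogue is invented for illustration:

```python
import math
import random

def aki_b_value(mags, m_c, dm=0.0):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965), with
    the usual half-bin correction when magnitudes are binned at width dm."""
    m = [x for x in mags if x >= m_c]              # truncate at completeness
    mean_m = sum(m) / len(m)
    b = math.log10(math.e) / (mean_m - (m_c - dm / 2.0))
    return b, b / math.sqrt(len(m))                # estimate, Aki std. error

# synthetic GR catalogue with true b = 1:  M = Mc - log10(U) / b
random.seed(0)
mags = [1.0 - math.log10(random.random()) for _ in range(5000)]
b, se = aki_b_value(mags, m_c=1.0)
print(round(b, 2))  # ≈ 1.0
```

The estimator's sensitivity to the chosen completeness magnitude m_c is exactly the manual-truncation bias that the paper's joint model of b-value and magnitude-range limits is designed to avoid.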

  • New
  • Research Article
  • 10.1111/2041-210x.70232
A dynamic multi‐scale occupancy model to estimate trends in habitat use in spatially and temporally complex systems
  • Jan 20, 2026
  • Methods in Ecology and Evolution
  • Erin Shepta + 4 more

Abstract Species occurrence is often influenced by changes in environmental conditions at multiple spatial and temporal scales working simultaneously in a hierarchical fashion. While previous dynamic multi‐scale occupancy modelling frameworks address dynamics at multiple spatial scales, they assume both large‐scale and small‐scale units are closed during the same period, potentially leading to bias in small‐scale estimates. We present an extended robust design that allows for the estimation of colonization and persistence probabilities across three spatiotemporal scales (large‐scale annual, small‐scale annual and small‐scale intra‐annual) simultaneously, while also accounting for imperfect detection in a Bayesian framework. We specified latent variables in a hierarchical manner, where occurrence at each scale was dependent on higher‐level latent variables. To test model performance under a variety of sampling scenarios, we simulated data across 22 data‐generating designs. As an example, we also fit the novel occupancy model to invasive Silver Carp (Hypophthalmichthys molitrix) detection/non‐detection data collected in the Ohio River Basin, USA. We found that the model generally performed well, even under limited replication across different scales. Model performance declined in more ‘extreme’ simulations where colonization or persistence was rare. We found that invasive carp turnover probability varied across a gradient of invasion as well as across all spatiotemporal scales, exemplifying the benefits of utilizing our multi‐scale approach. This modelling framework offers a powerful tool for disentangling the multi‐scale processes that drive species distributions across time. The model was able to identify trends in carp occurrence even with limited data and uneven sampling efforts that would have otherwise been masked by more traditional occupancy modelling approaches. Annual large‐scale and small‐scale turnover varied substantially by river section, but intra‐annual turnover was consistently low throughout the entire study area. By explicitly modelling dynamic parameters at multiple scales, this approach fills a critical gap in ecological modelling, providing the resolution to detect fine‐scale processes and the scope to inform broad‐scale management.

  • New
  • Research Article
  • 10.5194/nhess-26-103-2026
FLEMO flash – Flood Loss Estimation MOdels for companies and households affected by flash floods
  • Jan 13, 2026
  • Natural Hazards and Earth System Sciences
  • Apoorva Singh + 7 more

Abstract. In light of the increasing losses from flash floods intensified by climate change, there is a critical need for improved loss models. Loss assessments predominantly focus on fluvial flood processes, leaving a significant gap in understanding the key drivers of flash floods and the effect of emergency response on losses. To address these gaps, we introduce FLEMOflash – a novel multivariate probabilistic Flood Loss Estimation Model compilation for flash floods. The models are developed for companies and households based on survey data collected after flash flood events in 2002, 2016, and 2021 in Germany. FLEMOflash employs a data-driven feature selection approach, combining machine learning techniques (Elastic Net, Random Forest, XGBoost) to select key drivers influencing flash flood losses and Bayesian networks to model probabilistic loss estimates, including uncertainty. Model-based findings show that in extreme hazard scenarios, successful implementation of emergency measures can reduce building losses by up to 47 % for large companies. Households who knew exactly what to do during high water depth were able to reduce their building losses by 77 % and contents losses by 55 %. Thus, FLEMOflash can support risk communication and management by providing reliable estimation of flash flood losses.

  • New
  • Research Article
  • 10.3390/s26020529
Fast 3D-HEVC Depth Map Coding Method Based on Spatio-Temporal Correlation and a Two-Stage Mode Decision Framework
  • Jan 13, 2026
  • Sensors (Basel, Switzerland)
  • Erlin Tian + 2 more

Efficient intra-mode decision for depth maps plays a pivotal role in the overall performance of 3D-HEVC. Existing research predominantly relies on fast mode-screening strategies grounded in texture characteristics or machine learning techniques. These strategies mitigate the complexity of mode search to a certain extent, but often fall short of fully leveraging the intrinsic spatio-temporal correlations within depth maps. Moreover, strategies relying on deterministic classifiers exhibit insufficient discrimination reliability in regions featuring edge mutations or intricate structures. To tackle these challenges, this paper presents a two-stage fast intra-mode decision algorithm for depth maps, integrating naive Bayes probability estimation and a fuzzy support vector machine (FSVM). Initially, it confines the candidate mode space through spatio-temporal prior modeling. Subsequently, FSVM is employed to enhance decision accuracy in regions with low confidence. This methodology constructs a joint mode-decision framework spanning from probability screening to refined classification, significantly reducing the computational burden while preserving rate-distortion performance, and thereby attaining an effective equilibrium between encoding complexity and performance. Experimental findings demonstrate that the proposed algorithm reduces average encoding time by 52.30% with merely a 0.68% increment in BDBR. Additionally, it shows stable universality across test sequences of diverse resolutions and scenes.
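
The first-stage probability screening described above can be illustrated with a generic Gaussian naive-Bayes sketch; the single block-variance feature, the per-class statistics, and the confidence thresholds below are all invented, and this is not the authors' actual 3D-HEVC implementation:

```python
import math

def gaussian_logpdf(x, mu, var):
    """Log density of a 1-D Gaussian N(mu, var)."""
    return -0.5 * (math.log(2 * math.pi * var) + (x - mu) ** 2 / var)

def posterior_planar(feature, stats, prior=0.5):
    """P(mode = 'planar' | feature) under a 1-D Gaussian naive-Bayes
    model; `stats` holds hypothetical per-class (mean, variance)."""
    lp = math.log(prior) + gaussian_logpdf(feature, *stats["planar"])
    le = math.log(1 - prior) + gaussian_logpdf(feature, *stats["edge"])
    m = max(lp, le)  # log-sum-exp shift for numerical stability
    return math.exp(lp - m) / (math.exp(lp - m) + math.exp(le - m))

# hypothetical per-class statistics of a block-variance feature
stats = {"planar": (0.1, 0.05), "edge": (1.5, 0.4)}
p = posterior_planar(0.12, stats)
# first stage: accept the cheap planar decision only when confident;
# low-confidence blocks are deferred to the second-stage classifier
decision = "planar" if p > 0.9 else ("edge" if p < 0.1 else "defer")
print(round(p, 3), decision)
```

The two-stage structure comes from routing only the "defer" cases to a costlier classifier, which is where the paper's FSVM refinement would sit.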

  • Research Article
  • 10.1073/pnas.2518982123
The anticipation of imminent events is time-scale invariant
  • Jan 7, 2026
  • Proceedings of the National Academy of Sciences
  • Matthias Grabenhorst + 2 more

Humans predict the timing of imminent events to generate fast and precise actions, decisions, and other behaviors. Such temporal anticipation is critical over wide timescales, and especially salient over the range from hundreds of milliseconds to a few seconds. Despite advances in our understanding of basic timing behavior and its underlying neural mechanisms, it remains an open question whether anticipation is stable across these short time scales. Recent work shows that the brain models the probability density function (PDF) of events across time, suggesting a canonical mechanism for temporal anticipation. Here, we investigate whether this computation holds when the event distribution covers different time spans. We show that, irrespective of the time span, anticipation, measured as reaction time, scales with the event distribution. This demonstrates that the key computation, the estimation of event probability density, is invariant across temporal scales. We further show that the precision of anticipation is also scale invariant, which contradicts Weber's law. The results are established in vision and audition, suggesting that the core computations in temporal anticipation are independent of sensory modality. Perceptual systems exploit probability estimation over time independently of temporal scale to anticipate imminent events.
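
The scale-invariance claim can be made concrete with the hazard rate h(t) = f(t)/(1 - F(t)), a standard formalization of anticipating an imminent event from its probability density. This numpy sketch (toy uniform distribution, invented grid sizes) shows that rescaling the time span leaves the normalized hazard unchanged:

```python
import numpy as np

def hazard(pdf_vals, dt):
    """Discrete hazard rate h(t) = f(t) / (1 - F(t))."""
    cdf = np.cumsum(pdf_vals) * dt
    return pdf_vals / np.clip(1.0 - cdf, 1e-12, None)

# uniform event distribution on [0, T] for two different time spans
hs = []
for T in (1.0, 4.0):
    n = 400
    dt = T / n
    f = np.full(n, 1.0 / T)
    # hazard expressed in units of 1/T, sampled at matching t/T points
    hs.append((hazard(f, dt) * T)[: n - 20])  # drop the diverging tail
print(bool(np.allclose(hs[0], hs[1])))  # True: the curves coincide
```

For the uniform case this is just h(t) = 1/(T - t), so T·h(T·u) = 1/(1 - u) depends only on the normalized time u = t/T, mirroring the paper's finding that anticipation tracks the event distribution regardless of span.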

  • Research Article
  • 10.1093/esj/23969873251372773
Multiparametric assessment of atrial cardiopathy in cryptogenic stroke patients: Implications for personalized clinical management.
  • Jan 1, 2026
  • European stroke journal
  • Iria López-Dequidt + 8 more

Cryptogenic stroke (CS) represents a heterogeneous group in terms of etiology. Atrial cardiopathy (AC) has emerged as a relevant underlying substrate for both stroke and atrial fibrillation (AF) in these patients. However, no reliable tools are currently available for the early and accurate identification of AC. We conducted a prospective study including consecutive patients with cardioembolic stroke due to AF (CES-AF), non-cardioembolic stroke (NCES) and cryptogenic stroke (CS). Left atrial strain (LAS) assessed by speckle-tracking echocardiography, and serum markers of AC were evaluated in CES-AF versus NCES patients using ROC curve analysis. Based on these results, we developed a logistic regression model to calculate the probability of AC in CS patients, aiming to discriminate between cardioembolic and non-cardioembolic etiology. Clinical characteristics were compared between CS patients with high (>0.5) and low (<0.5) predicted probability of AC. A total of 136 patients were included: 44 with CES-AF, 52 with NCES, and 40 with CS. The combination of N-terminal pro-brain natriuretic peptide (NT-proBNP) levels ⩾ 469 pg/mL and biplanar LAS during the contraction phase (LASct) ⩾ -10.2% demonstrated the best-performing AC biomarker combination among those evaluated for identifying cardioembolic etiology (AUC = 0.995). Based on this combination, 30% of CS patients had a predicted probability > 0.5 for AC. These patients were older (77.3 ± 8 vs 68.8 ± 10 years; p = 0.011), had more severe strokes (NIHSS score 10.1 ± 7.5 vs 4.6 ± 5.2; p = 0.024) and showed a higher incidence of AF during follow-up (6 vs 0 cases; p = 0.029). The combination of NT-proBNP levels and biplanar LASct provides highly sensitive and specific biomarkers of AC. 
This multiparametric model allows for individualized estimation of AC probability in CS patients, supporting its potential utility in discriminating cardioembolic from non-cardioembolic etiologies and guiding personalized clinical management.

  • Research Article
  • 10.1016/j.mechrescom.2026.104628
Probabilistic strength estimation analysis of composites considering cross-correlated random fields of local strength and apparent elastic properties
  • Jan 1, 2026
  • Mechanics Research Communications
  • S Sakata + 3 more


  • Research Article
  • 10.1016/j.trgeo.2026.101917
Active stability and probabilistic estimation of tunnel face in spatially variable and anisotropic soils
  • Jan 1, 2026
  • Transportation Geotechnics
  • Chen Guang-Hui + 4 more


  • Research Article
  • 10.1016/j.ress.2025.111493
Adaptive parallel design criterion for failure probability estimation with Student-t likelihood
  • Jan 1, 2026
  • Reliability Engineering & System Safety
  • Hongdan Zheng + 4 more


  • Research Article
  • 10.1016/j.cmpb.2025.109131
MDCcure: An R package for martingale difference correlation and hypothesis testing in mixture cure models.
  • Jan 1, 2026
  • Computer methods and programs in biomedicine
  • Blanca E Monroy-Castillo + 2 more


  • Research Article
  • 10.20535/2786-8729.7.2025.338564
Approach to hybrid load management in Fat-Tree web clusters
  • Dec 27, 2025
  • Information, Computing and Intelligent systems
  • Kostiantyn Radchenko + 1 more

The paper presents an approach to hybrid load management in a web cluster that is capable of providing adaptive request balancing based on load prediction and resilience to random web server failures. The proposed architecture is built upon the Fat-Tree topology, which ensures high scalability, structural redundancy, and efficient routing within the cluster network. The developed system performs load forecasting using moving average methods and Erlang-based queueing models, enabling the estimation of overload probabilities and proactive redistribution of computational resources. Four representative simulation scenarios were analyzed: baseline load, peak load, dynamic traffic variations, and random server failures. The obtained results demonstrate enhanced system reliability, reduced average response time, and more balanced utilization of cluster resources. In the context of rapidly growing web services and user traffic volumes, the issue of maintaining high reliability and efficiency of clustered infrastructures becomes increasingly significant. Even with robust topologies such as Fat-Tree, irregular traffic patterns and sudden surges in client requests can cause local overloads and performance degradation. Random node failures further complicate cluster management, necessitating the use of adaptive and predictive control mechanisms. The proposed model integrates Fat-Tree network simulation with statistical forecasting algorithms, forming the basis for proactive load management. This integration allows for minimizing service degradation risks, dynamically responding to workload changes, and maintaining stable operation of web infrastructures under partial node failures. The architecture shows strong potential for real-time implementation in large-scale distributed web systems. It can be further enhanced by incorporating machine learning or wavelet-based forecasting methods to improve the accuracy of load estimation and system adaptability.
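
In its simplest form, the Erlang-based overload estimation mentioned above reduces to the classic Erlang-B recursion for the probability that an arriving request finds all servers busy; this generic sketch is illustrative and not the authors' cluster model:

```python
def erlang_b(servers, traffic):
    """Erlang-B blocking probability via the standard stable recursion:
    B(0, a) = 1;  B(n, a) = a*B(n-1, a) / (n + a*B(n-1, a)),
    where `traffic` a is the offered load in Erlangs."""
    b = 1.0
    for n in range(1, servers + 1):
        b = traffic * b / (n + traffic * b)
    return b

# probability an arriving request is rejected when a 10-server pool
# faces 7 Erlangs of offered load
print(round(erlang_b(10, 7.0), 4))
```

In a proactive balancer, the forecast load (e.g. from a moving average) is fed into such a formula, and servers are reassigned whenever the predicted blocking probability crosses a threshold.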

  • Research Article
  • 10.1142/s0219455427502117
High efficiency estimation of failure probability: a fusion method of Cross-Entropy optimization and Back propagation Neural Network
  • Dec 24, 2025
  • International Journal of Structural Stability and Dynamics
  • Guizhong Xie + 6 more

The estimation of failure probability in complex engineering structures suffers from low computational efficiency and challenges in handling high-dimensional nonlinearity. Traditional methods are inefficient and require large sample sizes, making it difficult for them to meet the demands of practical engineering applications. This paper proposes a failure probability estimation framework that integrates cross-entropy optimization with a back propagation neural network (BPNN). The cross entropy (CE) method is employed to adaptively optimize the sampling distribution, thereby enhancing the sampling efficiency for rare failure domains. By leveraging BPNN’s powerful nonlinear approximation capability, a surrogate model for the limit state function (LSF) is constructed, significantly reducing the number of LSF evaluations. Validation through numerical engineering examples demonstrates that, compared to cross-entropy-based importance sampling (CE-IS) and cross-entropy-based Gaussian mixture sampling (CE-GM) methods, the proposed CE-BPNN approach stably approximates the reference failure probability across varying sample sizes, with lower coefficient of variation (CoV) and mean absolute percentage error (MAPE). The predictive model achieves R-squared (R²) values consistently exceeding 0.97. Under identical sample sizes, CE-BPNN exhibits significant accuracy improvement and high stability. The results indicate that the CE-BPNN method offers superior accuracy and efficiency for structural reliability analysis involving high-dimensional nonlinearity and small failure probabilities, providing a promising new approach for reliability assessment of complex engineering structures.
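
The cross-entropy importance-sampling baseline (CE-IS) that the paper compares against can be sketched generically as follows; the linear limit state, sample sizes, and elite fraction are invented for illustration, and the BPNN surrogate step is not reproduced:

```python
import numpy as np

def ce_is_failure_prob(g, dim, thresh, n=2000, rho=0.1, iters=8, seed=0):
    """Cross-entropy importance sampling for p_f = P(g(X) >= thresh)
    with X ~ N(0, I): adapt a Gaussian proposal N(mu, diag(sigma^2))
    toward the failure region, then estimate p_f with likelihood ratios."""
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.ones(dim)
    for _ in range(iters):
        x = rng.normal(mu, sigma, size=(n, dim))
        y = g(x)
        gamma = min(np.quantile(y, 1.0 - rho), thresh)  # elite level
        elite = x[y >= gamma]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-12
        if gamma >= thresh:                             # reached failure region
            break
    x = rng.normal(mu, sigma, size=(n, dim))
    y = g(x)
    # log importance weight:  log phi(x; 0, I) - log phi(x; mu, sigma)
    log_w = (-0.5 * (x ** 2).sum(axis=1)
             + 0.5 * (((x - mu) / sigma) ** 2).sum(axis=1)
             + np.log(sigma).sum())
    return float(np.mean((y >= thresh) * np.exp(log_w)))

# toy limit state: failure when x1 + x2 >= 6 (true p_f ≈ 1.1e-5)
p = ce_is_failure_prob(lambda x: x.sum(axis=1), dim=2, thresh=6.0)
print(f"{p:.1e}")
```

The paper's contribution would slot in at the `g(x)` calls: a trained surrogate replaces the expensive limit state function, so only the surrogate's training set requires true LSF evaluations.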

  • Research Article
  • 10.1002/qre.70135
Failure Analysis and Maintenance Planning for Repairable Deteriorating Structural Systems Subject to Imperfect Maintenance
  • Dec 24, 2025
  • Quality and Reliability Engineering International
  • Reza Ahmadi + 1 more

ABSTRACT The analysis of a system that deteriorates under imperfect maintenance is essential in reliability engineering. Imperfect maintenance does not fully restore a system to its original condition, leading to complex degradation patterns. Statistical inference techniques allow for precise estimation of degradation parameters, maintenance effects, and failure probabilities. This study develops a framework to analyze repairable deteriorating systems using advanced probabilistic models and optimization techniques to enhance maintenance planning and failure mitigation.

  • Research Article
  • 10.55041/ijsrem55445
Generative AI for Drug Discovery and Medical Imaging A Simulation-Based Framework for Personalized Treatment Prediction
  • Dec 23, 2025
  • International Journal of Scientific Research in Engineering and Management
  • K Thriveni + 4 more

Abstract This paper presents a reproducible, simulation-driven framework that combines generative artificial intelligence with embedding-based drug representations and lightweight predictive models to demonstrate personalized treatment response estimation and synthetic medical-image generation. The design emphasizes modularity, interpretability, and reproducibility. The pipeline integrates synthetic 32-dimensional drug embeddings, a retrain-on-request feedforward classifier for treatment outcome probability estimation, and a compact convolutional generator that synthesizes grayscale medical-like images to accompany numeric predictions. The work is intended as an educational prototype rather than a clinically validated system. Content and architecture draw from the user's provided project document and are expanded and formalized here for academic presentation. Keywords: Generative AI, Drug Discovery, Medical Imaging, Personalized Medicine, Synthetic Data, Women and Child Healthcare (WCH)

  • Research Article
  • 10.3390/buildings16010041
Low-Cost Gas Sensing and Machine Learning for Intelligent Refrigeration in the Built Environment
  • Dec 22, 2025
  • Buildings
  • Mooyoung Yoo

Accurate, real-time monitoring of meat freshness is essential for reducing food waste and safeguarding consumer health, yet conventional methods rely on costly, laboratory-grade spectroscopy or destructive analyses. This work presents a low-cost electronic-nose platform that integrates a compact array of metal-oxide gas sensors (Figaro TGS2602, TGS2603, and Sensirion SGP30) with a Gaussian Process Regression (GPR) model to estimate a continuous freshness index under refrigerated storage. The pipeline includes headspace sensing, baseline normalization and smoothing, history-window feature construction, and probabilistic prediction with uncertainty. Using factorial analysis and response-surface optimization, we identify history length and sampling interval as key design variables; longer temporal windows and faster sampling consistently improve accuracy and stability. The optimized configuration (≈143-min history, ≈3-min sampling) reduces mean absolute error from ~0.51 to ~0.05 on the normalized freshness scale and shifts the error distribution within specification limits, with marked gains in process capability and yield. Although it does not match the analytical precision or long-term robustness of spectrometric approaches, the proposed system offers an interpretable and energy-efficient option for short-term, laboratory-scale monitoring under controlled refrigeration conditions. By enabling probabilistic freshness estimation from low-cost sensors, this GPR-driven e-nose demonstrates a proof-of-concept pathway that could, after further validation under realistic cyclic loads and operational disturbances, support more sustainable meat management in future smart refrigeration and cold-chain applications. This study should be regarded as a methodological, laboratory-scale proof-of-concept that does not demonstrate real-world performance or operational deployment. 
The technical implications described herein are hypothetical and require extensive validation under realistic refrigeration conditions.

