Articles published on Historical Data
60,150 search results
- New
- Research Article
- 10.1108/dta-07-2025-0545
- Jan 21, 2026
- Data Technologies and Applications
- Hoang Thi Minh Chau + 3 more
Purpose: Forecasting reservoir water levels plays a critical role in effective water resource management, contributing to the safety of hydraulic infrastructure and mitigating the impacts of droughts and floods. Current forecasting models rely solely on satellite imagery and struggle to handle noise, particularly in the peak values of the streamflow time series produced by reservoir operation. A multi-modal approach that integrates both satellite imagery and reservoir operation data is therefore needed to improve forecasting performance. Design/methodology/approach: This research presents a novel multi-modal forecasting model that integrates satellite imagery with historical water level data to improve prediction accuracy, particularly in forecasting abrupt changes in water levels. Image features are extracted using the histogram of oriented gradients (HOG) algorithm and normalized with the L2 norm to enhance training stability and reduce noise. A customized fusion function combines spatial features from satellite imagery with temporal features from the water level time series, producing a unified composite feature vector. This vector, along with the historical water level sequence, is fed into a gated recurrent unit (GRU) model for forecasting. The fusion mechanism plays a crucial role in capturing sudden and abnormal variations in the data. Findings: The model is assessed using satellite images and on-site water level measurements collected at the An Khe and Ka Nak Reservoirs, Gia Lai, Vietnam, spanning January 2019 to December 2022. Experimental results demonstrate that the HOG-GRU variant significantly outperforms conventional deep learning models: for the An Khe Reservoir, mean squared error (MSE) of 0.08060, root mean squared error (RMSE) of 0.28390, mean absolute error (MAE) of 0.20446 and |Tracking Signal| of 0.00032; for the Ka Nak Reservoir, MSE of 0.20795, RMSE of 0.45601, MAE of 0.37937 and |Tracking Signal| of 0.03985. These findings confirm the model's robustness and its practical applicability to real-world hydrological forecasting tasks. Originality/value: This paper presents an original research contribution, offering novel insights into the academic domain of information technology, with all references comprehensively and accurately cited.
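The pipeline described above lends itself to a compact illustration. The sketch below is not the authors' code: it extracts an L2-normalised HOG descriptor from a satellite image with scikit-image and fuses it with a water-level window in a small PyTorch GRU forecaster. The window length, image size, and hidden dimensions are assumptions for illustration only.

```python
# Minimal sketch (not the authors' code): HOG features from a satellite image,
# L2-normalised, concatenated with a water-level window, fed to a GRU forecaster.
# Assumes scikit-image and PyTorch; shapes and hyperparameters are illustrative.
import numpy as np
import torch
import torch.nn as nn
from skimage.feature import hog

def image_features(img: np.ndarray) -> np.ndarray:
    """HOG descriptor normalised to unit L2 norm."""
    f = hog(img, orientations=9, pixels_per_cell=(16, 16), cells_per_block=(2, 2))
    return f / (np.linalg.norm(f) + 1e-8)

class FusionGRU(nn.Module):
    def __init__(self, img_dim: int, hidden: int = 64):
        super().__init__()
        self.proj = nn.Linear(img_dim, hidden)           # spatial branch
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(2 * hidden, 1)             # fused -> next water level

    def forward(self, levels: torch.Tensor, img_feat: torch.Tensor) -> torch.Tensor:
        # levels: (batch, window, 1), img_feat: (batch, img_dim)
        _, h = self.gru(levels)                           # temporal summary
        fused = torch.cat([h[-1], torch.relu(self.proj(img_feat))], dim=-1)
        return self.head(fused)

# toy usage: one 128x128 greyscale image and a 30-step water-level window
img = np.random.rand(128, 128)
feat = torch.tensor(image_features(img), dtype=torch.float32).unsqueeze(0)
window = torch.randn(1, 30, 1)
model = FusionGRU(img_dim=feat.shape[1])
print(model(window, feat).shape)   # torch.Size([1, 1])
```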
- New
- Research Article
- 10.1159/000550280
- Jan 20, 2026
- Transfusion Medicine and Hemotherapy
- Changhong Kong + 6 more
Background: Blood donors may face deferral due to low hemoglobin (Hb) levels, posing a key challenge for blood management. Most prediction studies are based on European data, and their accuracy can be improved, while research on Chinese donors remains limited. Objective: To establish and evaluate the applicability of existing low-Hb deferral prediction methods to Chinese data, and to introduce new machine learning models and hyperparameter tuning methods to optimize prediction schemes. Methods: This study used 26,796 whole blood donation records from Hangzhou (Jan 2023–Oct 2025) to build a machine learning classification model. Seven algorithms, including logistic regression and LightGBM, were evaluated, using SMOTE for imbalance correction and Hyperband for tuning. Logistic regression and SHAP analysis were then applied to identify the key donor characteristics influencing prediction performance. Results: At a fixed specificity of 90%, the LightGBM model had the highest prediction accuracy for donation deferral due to low Hb. The donors' historical average Hb test value plays the most important role in the model's prediction performance, followed by gender, occupation, and donor type (whether the donation was made in a group). In addition, ethnic group also had a significant impact on the prediction of low-Hb deferral in the Hangzhou area of China. Conclusions: Using historical pre-donation test data and donor personal information enables reliable prediction of low-Hb deferral. The feature importance results based on Hangzhou data were consistent with previous studies, suggesting shared mechanisms and cross-regional model transferability. Moreover, hyperparameter optimization further enhances model performance.
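The modelling recipe in this abstract maps onto a short, hedged sketch: SMOTE rebalancing, a LightGBM classifier, and a probability threshold chosen to hold specificity near 90% on held-out data. It is an illustration of the approach described, not the study's pipeline; Hyperband tuning is omitted and the feature content is left abstract.

```python
# Illustrative sketch only (not the study's pipeline): SMOTE oversampling,
# a LightGBM classifier, and a decision threshold chosen for ~90% specificity.
# Assumes imbalanced-learn, lightgbm and scikit-learn; features are unspecified.
import numpy as np
from imblearn.over_sampling import SMOTE
from lightgbm import LGBMClassifier
from sklearn.metrics import roc_curve
from sklearn.model_selection import train_test_split

def fit_deferral_model(X: np.ndarray, y: np.ndarray):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
    X_bal, y_bal = SMOTE(random_state=0).fit_resample(X_tr, y_tr)   # rebalance deferrals
    clf = LGBMClassifier(n_estimators=300, learning_rate=0.05).fit(X_bal, y_bal)

    # pick the probability cut-off that keeps specificity >= 0.90 on held-out data
    scores = clf.predict_proba(X_te)[:, 1]
    fpr, tpr, thr = roc_curve(y_te, scores)
    ok = fpr <= 0.10
    threshold = thr[ok][np.argmax(tpr[ok])]
    return clf, threshold
```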
- New
- Research Article
- 10.1007/s10113-026-02526-w
- Jan 20, 2026
- Regional Environmental Change
- Bruno Serranito + 5 more
Abstract Like other abalone species, the ormer (Haliotis tuberculata), considered a delicacy, has faced multiple anthropogenic pressures, including overfishing from legal and illegal harvesting. While the ormer is extensively studied in aquaculture, limited information has been available on the status of the wild H. tuberculata population in France since the Vibrio harveyi pandemic that occurred in the late 1990s. Such a lack of data may contribute to a gradual shift in the perception of stock changes, known as the Shifting Baseline Syndrome (SBS). To address this gap, we combined historical monitoring data collected in the 1980s on the Emerald Coast (North Brittany, France) with recent participatory science diving surveys conducted by a local NGO in the same area. Using quantile regression and mixed models, we investigated changes in density and size structure over three decades, and further explored the ecology of H. tuberculata. Results showed a size-depth relationship, suggesting an age-related vertical distribution. Models indicated no change in density between the two periods, highlighting population recovery from the pandemic that caused high mortality in the region. However, mean individual size declined by approximately 2 cm compared to the 1980s, mainly driven by a decline in the number of individuals larger than the legal catch size (9 cm). Such changes could potentially result from anthropogenic pressures including overfishing, ocean warming or acidification. These preliminary findings highlight the value of combining participatory science initiatives with historical records to tackle shifting baseline syndrome, and to inform conservation and management strategies for overlooked and exploited marine species.
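For readers who want to see what the quantile-regression step might look like, the snippet below is a minimal, assumed sketch with statsmodels: shell size regressed on depth at several quantiles. The column names `size_cm` and `depth_m` are hypothetical, and the authors' actual analysis (including the mixed models) is not reproduced here.

```python
# Assumed sketch of a size-depth quantile regression, not the authors' script.
# Column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def size_depth_quantiles(df: pd.DataFrame, quantiles=(0.25, 0.5, 0.75)):
    model = smf.quantreg("size_cm ~ depth_m", df)
    # one fitted slope per quantile: does depth shift the whole size distribution?
    return {q: model.fit(q=q).params["depth_m"] for q in quantiles}
```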
- New
- Research Article
- 10.3390/app16021039
- Jan 20, 2026
- Applied Sciences
- Ikhalas Fandi + 1 more
In the modern era, demand forecasting supports industrial decision-making by guiding production planning and reducing inventory costs. However, the dynamic nature of the fashion and apparel retail industry necessitates precise demand forecasting to optimize supply chain operations and meet customer expectations. Consequently, this research proposes the Formicary Zebra Optimization-Based Distributed Attention-Guided Convolutional Recurrent Neural Network (FZ-DACR) model for improving demand forecasting. In the proposed approach, the combination of Formicary Zebra Optimization and a distributed attention mechanism enables the deep learning architecture to capture the complex patterns of retail sales data. Specifically, the neural networks, including convolutional neural networks (CNNs) and recurrent neural networks (RNNs), extract local features and temporal dependencies to analyze volatile demand patterns. Furthermore, the proposed model integrates visual and textual data to enhance forecasting accuracy. By leveraging the adaptive optimization capabilities of the Formicary Zebra Algorithm, the proposed model effectively extracts features from product images and historical sales data while addressing the complexities of volatile demand patterns. In extensive experimental analysis on diverse datasets, the FZ-DACR model achieves superior performance, with minimum error values including an MAE of 1.34, MSE of 4.7, RMS of 2.17, and R2 of 93.3% on the DRESS dataset. Moreover, the findings highlight the ability of the proposed model to manage fluctuating trends and to support inventory and pricing strategies effectively. This innovative approach has significant implications for retailers, enabling more agile supply chains and improved decision making in a highly competitive market.
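As a rough illustration of the attention-guided recurrent idea this abstract describes (and only that; the full FZ-DACR architecture, its optimizer, and its multimodal branches are not reproduced), the PyTorch sketch below scores each time step of a sales series with a learned attention weight before forecasting the next period. All sizes are illustrative assumptions.

```python
# Hedged sketch (not the FZ-DACR implementation): attention-weighted pooling
# over GRU outputs for a sales series. PyTorch; dimensions are illustrative.
import torch
import torch.nn as nn

class AttentiveGRU(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.gru = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.attn = nn.Linear(hidden, 1)   # scores each time step
        self.out = nn.Linear(hidden, 1)    # next-period demand

    def forward(self, sales: torch.Tensor) -> torch.Tensor:
        h, _ = self.gru(sales)                              # (batch, T, hidden)
        w = torch.softmax(self.attn(h), dim=1)              # attention over time
        context = (w * h).sum(dim=1)                        # weighted summary
        return self.out(context)

demand = AttentiveGRU()(torch.randn(4, 52, 1))              # 4 items, 52 weeks
print(demand.shape)                                         # torch.Size([4, 1])
```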
- New
- Research Article
- 10.3389/fevo.2025.1695457
- Jan 20, 2026
- Frontiers in Ecology and Evolution
- Alex B Shupinski + 7 more
Introduction: Many species are shifting their geographic ranges in response to changing climate, and identifying climate impacts on future species distributions will be critical for conservation success. North American bison (Bison bison) provide an exceptional study system for exploring the use of an interdisciplinary record of paleontological, archaeological, and historical data for conservation, due to the plethora of past occurrences across a large geographic and temporal scale, in combination with their “near-threatened” designation by the IUCN Red List because of current small, fragmented populations following a near-extinction event in the 1880s. Moreover, the multiple identities of bison as free-roaming wildlife, as wildlife with limitations, and as captive semi-domesticated livestock introduce unique conservation concerns across the four sectors of the Bison Management System (BMS; Tribal, private, public, nonprofit-NGO). Methods: To model bison climate suitability using “Bioclim”, we associated 1,774 bison occurrences over the last 21,000 years with three PastClim variables (warmest temperature of the warmest month, temperature seasonality, and precipitation of the coldest quarter) identified as the strongest predictors of past bison distributions using a variance inflation factor. The model was projected onto the WorldClim RCP4.5 and RCP8.5 future climate scenarios for the four remaining 20-year periods to 2100 CE, and onto the WorldClim 2.1 version of current climate, to determine expected changes in climate suitability. Results: The distribution of suitability scores changes rapidly, shifting significantly between each 20-year interval until the end of the century. By 2100, the centroid of suitable climate, using the standard 50% threshold, is expected to shift from its current location near the 49th parallel to the northwest and toward the northern border of Canada, by 1,182 km under the RCP4.5 climate scenario and 2,254 km under the RCP8.5 climate scenario. Suitability ranges above the optimal minimal threshold identified by the receiver operating characteristic (8.5%) are also predicted to shift to the northwest, by 793 km under RCP4.5 and 1,267 km under RCP8.5. Discussion: With an anticipated geographic shift in the most suitable bison climate, it is necessary to prepare future management strategies for BMS sectors to maintain a sustainable relationship with bison.
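The variance-inflation-factor screening mentioned in the Methods can be sketched briefly. The following is an assumed, illustrative workflow (not the authors' script) that iteratively drops the most collinear bioclimatic predictor using statsmodels; the column names and the VIF cut-off are placeholders.

```python
# Illustrative VIF-based predictor pruning before fitting a Bioclim-style envelope.
# Assumed workflow, not the authors' code; threshold and columns are placeholders.
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor

def prune_by_vif(X: pd.DataFrame, threshold: float = 5.0) -> pd.DataFrame:
    """Iteratively remove the predictor with the highest VIF above `threshold`."""
    X = X.copy()
    while True:
        vifs = pd.Series(
            [variance_inflation_factor(X.values, i) for i in range(X.shape[1])],
            index=X.columns,
        )
        if vifs.max() <= threshold or X.shape[1] == 1:
            return X
        X = X.drop(columns=vifs.idxmax())   # drop the most collinear variable
```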
- New
- Research Article
- 10.1145/3748819
- Jan 20, 2026
- ACM Transactions on Cyber-Physical Systems
- Shiqing Li + 4 more
Cross-domain communication systems are important in multiple areas such as smart factories and automatic driving. Derived from real-life scenarios, each domain has a local network and accesses the public network via a central server, which makes it easy to manage the local network and protect entities in the domain. Based on this, a cross-domain communication system follows the “device-server-server-device” pattern. Each communication session has two steps: authentication and communication. The authentication phase helps the two parties build a secure communication channel. If the server is compromised, the authentication request can be sent to the attacker, who can then impersonate the original receiver. It is therefore necessary to handle this server-compromise case effectively. Further, anomaly detection is required to monitor the behaviors of entities in domains. Current deep learning solutions deploy autoencoder-based models to reconstruct input data and then detect anomalies. However, they feed raw input data into the models, which makes it hard to reconstruct evolving data streams. In this work, we focus on the detection of compromised entities in cross-domain communication systems. Specifically, we split the problem into two parts: the server-compromise case in the authentication phase and anomaly detection for the whole system. For the server-compromise case, we propose a blockchain-based “double verification” scheme to prevent the server from making decisions on its own: nodes in the blockchain network evaluate submitted records using the public key infrastructure, and the distributed nature of blockchain allows us to deploy more machines as verifiers to monitor the behavior of servers. For anomaly detection, we propose to feed the degree of change of items, instead of raw item values, into the deep learning models, which is more adaptive to evolving data streams; the degree of change is computed from a linear combination of historical data and recent data. Finally, we analyze the security properties of the proposed system and evaluate the proposed anomaly detection method on real datasets. We build a simple cross-domain communication system using the Fabric framework to simulate the “double verification” scheme, and the proposed anomaly detection method achieves an accuracy improvement of around 0.11 on average.
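The degree-of-change preprocessing is the most code-like idea here, so a minimal NumPy sketch follows. It assumes one concrete reading of the abstract: a baseline formed as a linear blend of a long historical mean and a recent mean, with each new reading scored by its relative deviation from that baseline before being handed to the reconstruction model. The window sizes and the blending weight alpha are illustrative, not the paper's values.

```python
# Assumed sketch of a "degree of change" stream: score each reading against a
# linear blend of historical and recent means; these scores (not raw values)
# would then be fed into the autoencoder. Windows and alpha are illustrative.
import numpy as np

def degree_of_change(stream: np.ndarray, hist_win: int = 200,
                     recent_win: int = 10, alpha: float = 0.7) -> np.ndarray:
    scores = np.zeros_like(stream, dtype=float)
    for t in range(hist_win, len(stream)):
        baseline = (alpha * stream[t - hist_win:t].mean()
                    + (1 - alpha) * stream[t - recent_win:t].mean())
        scores[t] = abs(stream[t] - baseline) / (abs(baseline) + 1e-8)
    return scores
```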
- New
- Research Article
- 10.1007/s41060-026-01020-0
- Jan 19, 2026
- International Journal of Data Science and Analytics
- Wei-Cheng Wang + 5 more
Abstract In recent years, the rapid growth in scholarly publications and the widespread adoption of digital libraries have intensified the demand for effective paper recommendation systems. Traditional approaches typically rely on extensive historical interaction data or rich contextual information from paper abstracts, but often overlook critical temporal dependencies inherent in user preferences. A particular challenge is the recommendation of newly published papers, which, despite their significance in conveying cutting-edge research findings, suffer from sparse historical data. To address these issues, we propose the Meta-path Attention with Semantic Transformer for Academic Recommendation (MAPSTAR) framework, a novel recommendation model that integrates heterogeneous graph attention with transformer-based meta-path attention mechanisms. MAPSTAR simultaneously models both the temporal sequences of user interactions and the complex correlations among papers and their attributes. Specifically, our approach introduces a Transformer Encoder within the meta-path attention layer, allowing each meta-path embedding to capture global dependencies and dynamically adjust its representation based on contextual interactions with other meta-paths.
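A tiny, assumed PyTorch sketch of the mechanism described in the last sentence is shown below: a batch of meta-path embeddings is passed through a Transformer encoder so that each meta-path representation attends to the others before being pooled. The dimensions, layer counts, and mean pooling are illustrative choices, not MAPSTAR's actual configuration.

```python
# Hedged sketch (not the MAPSTAR code): meta-path embeddings contextualised by a
# Transformer encoder, then pooled into one summary vector. Sizes are assumed.
import torch
import torch.nn as nn

d_model, n_metapaths = 64, 4
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True),
    num_layers=2,
)
metapath_emb = torch.randn(8, n_metapaths, d_model)   # batch of 8 users
contextualised = encoder(metapath_emb)                # each path attends to the others
fused = contextualised.mean(dim=1)                    # (8, d_model) summary vector
print(fused.shape)
```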
- New
- Research Article
- 10.1007/s10661-026-14981-3
- Jan 19, 2026
- Environmental monitoring and assessment
- Mohamed T Elnabwy + 3 more
Assessing changes in mean sea level (MSL) has become increasingly critical due to the significance of climate change. Soft computing techniques are now widely used to reduce the time and cost associated with traditional MSL estimation methods. Historical MSL data are frequently used to predict future values, yet the application of soft computing models to analyze climate change's impact on MSL remains relatively unexplored. This study aims to develop and compare various soft computing techniques for modeling MSL fluctuations using meteorological data. Random forest (RF), support vector regression (SVR), K-nearest neighbors (KNN) regression, deep neural network (DNN), Gaussian process regression (GPR), and stacked ensemble methods are employed in this study. The newly developed models are statistically assessed for their effectiveness in modeling MSL at Damietta station, Egypt. Various environmental variables, such as surface water temperature, pressure, air temperature (average air temperature, dewpoint, wet-bulb, and heat index), and humidity and wind attributes (speed and direction), are used and evaluated in modeling MSL. The results indicate that RF, KNN, and GPR outperformed the other proposed models in modeling MSL during both the training and testing phases. The developed weighted stacked ensemble model, integrating RF, KNN, and GPR, outperformed the base models with a correlation coefficient (R) of 0.88 and a normalized root mean square error (RMSE) of 0.056 m. MSL modeling at the study station was particularly sensitive to variations in water temperature, wind speed and direction, and atmospheric pressure. This methodology serves as a valuable framework for climate-driven MSL forecasting in developing coastal regions lacking long-term tide records, directly contributing to UNESCO's Ocean Decade Challenge 5 on coastal resilience.
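A compact scikit-learn sketch of the kind of stacked ensemble the abstract describes is given below; it is an assumed illustration rather than the study's code (the paper reports a weighted stack, which a linear final estimator approximates). Feature matrices for the meteorological predictors are left as placeholders.

```python
# Assumed sketch of an RF + KNN + GPR stack for MSL regression with scikit-learn.
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.linear_model import Ridge
from sklearn.neighbors import KNeighborsRegressor

ensemble = StackingRegressor(
    estimators=[
        ("rf", RandomForestRegressor(n_estimators=300, random_state=0)),
        ("knn", KNeighborsRegressor(n_neighbors=7)),
        ("gpr", GaussianProcessRegressor()),
    ],
    final_estimator=Ridge(),   # learns the combination weights of the base models
)
# usage (placeholders): ensemble.fit(X_train, y_train); msl_pred = ensemble.predict(X_test)
```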
- New
- Research Article
- 10.3390/math14020331
- Jan 19, 2026
- Mathematics
- Hua Deng + 1 more
Uncertainty in optimization models often causes undesirable properties in their deterministic equivalent formulations (DEFs), even for simple linear models. Chance-constrained programming is a reasonable tool for handling optimization problems with random parameters in objective functions and constraints, but it assumes that the distribution of these random parameters is known, and its DEF often involves the complicated computation of multiple integrals, which impedes its wide application. In this paper, for optimization models with chance constraints, the historical data of the random model parameters are first exploited to construct an adaptive approximate density function by incorporating piecewise linear interpolation into the well-known histogram method, so as to remove the assumption of a known distribution. Then, based on this estimate, a novel confidence set involving only finitely many variables is constructed to describe all the potential distributions of the random parameters, and a computable reformulation of data-driven distributionally robust chance constraints is proposed. By virtue of such a confidence set, it is proven that the deterministic equivalent constraints can be reformulated as several ordinary constraints in line with the principles of the distributionally robust optimization approach, without the need to solve complicated semi-definite programming problems, compute multiple integrals, or solve additional auxiliary optimization problems, as is done in existing works. The proposed method is further validated on the stochastic multiperiod capacitated lot-sizing problem, and the numerical results demonstrate that: (1) the proposed method can significantly reduce the computational time needed to find a robust optimal production strategy compared with similar methods in the literature; and (2) the optimal production strategy provided by our method maintains moderate conservatism, i.e., it achieves a better trade-off between cost-effectiveness and robustness than existing methods.
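The density-estimation step (histogram plus piecewise linear interpolation) is concrete enough for a short sketch. The NumPy snippet below is an assumed, minimal reading of that step only: the sample data, bin count, and interpolation over bin centres are illustrative, and the confidence-set construction and robust reformulation are not shown.

```python
# Assumed sketch: an approximate density built from a histogram with piecewise
# linear interpolation between bin centres, avoiding any parametric assumption.
import numpy as np

def piecewise_linear_density(samples: np.ndarray, bins: int = 20):
    heights, edges = np.histogram(samples, bins=bins, density=True)
    centres = 0.5 * (edges[:-1] + edges[1:])
    def pdf(x):
        return np.interp(x, centres, heights, left=0.0, right=0.0)
    return pdf

demand = np.random.gamma(shape=3.0, scale=2.0, size=500)   # illustrative random data
f = piecewise_linear_density(demand)
print(f(np.array([2.0, 6.0, 12.0])))                       # density at three points
```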
- New
- Research Article
- 10.1145/3788286
- Jan 19, 2026
- ACM Transactions on Software Engineering and Methodology
- Bing Zhang + 4 more
SQL injection-based adversarial attacks can directly evaluate WAFs by observing block/allow actions, yet existing methods have four key issues: low quality and diversity of payloads, inadequate mutation strategies, semantic inequivalence of mutated payloads, and inefficient search processes for generating such payloads. We hypothesize that a method simultaneously improving these aspects would yield more effective attacks. Thus, we propose BWAFSQLi, a general and extensible framework for adversarial SQLi-based WAF bypass. It first designs a convergence-factor-guided context-free grammar to generate high-quality, diverse payloads (covering 18 attack scenarios and targeting 58 rules). For detected payload tokens, BWAFSQLi applies 26 rules with 15 mutation strategies, including two novel techniques (Quotation Mark Encoding and Comment Extension), to ensure semantically equivalent mutations. A mutation strategy selection mechanism, integrating a decay factor and a historical data table, enables adaptive multi-position mutations for efficient exploration while reducing requests. Mutated payloads are finally evaluated via HTTP requests against target WAFs. Experiments with one self-built dataset (SQLiCFG) and two public datasets (HPD, SIK) on 11 WAFs (3 gray-box, 8 black-box) show that BWAFSQLi increases WAFs' false negative rates (FNRs) by up to 93.39% (gray-box) and 58.49% (black-box) with minimal requests, surpassing three SOTA methods. Applying the seven proposed preprocessing defenses fully suppresses these FNR increases, highlighting the framework's practical significance.
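The adaptive strategy-selection mechanism can be sketched in a hedged form: a historical score table that decays every round and is rewarded when a mutated payload bypasses the WAF, with strategies sampled in proportion to their scores. The strategy names, decay value, and reward scheme below are illustrative assumptions, not BWAFSQLi's actual bookkeeping.

```python
# Assumed sketch of decay-weighted mutation strategy selection (not BWAFSQLi itself).
import random

class StrategySelector:
    def __init__(self, strategies, decay: float = 0.9):
        self.decay = decay
        self.score = {s: 1.0 for s in strategies}   # optimistic initial scores

    def pick(self) -> str:
        total = sum(self.score.values())
        weights = [v / total for v in self.score.values()]
        return random.choices(list(self.score), weights=weights)[0]

    def update(self, strategy: str, bypassed: bool) -> None:
        # decay all scores, then reward the strategy that slipped past the WAF
        for s in self.score:
            self.score[s] *= self.decay
        if bypassed:
            self.score[strategy] += 1.0

selector = StrategySelector(["quote_encoding", "comment_extension", "case_swap"])
choice = selector.pick()
selector.update(choice, bypassed=False)
```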
- New
- Research Article
- 10.55506/icdess.v3i1.141
- Jan 18, 2026
- Proceeding International Conference on Digital Education and Social Science
- Febriyantina Istiara + 2 more
This study explores the transformative potential of deep learning in enhancing curriculum design and learning processes within modern educational environments. Drawing on a qualitative research design, the study employs documentary and thematic analysis to synthesize insights from scholarly literature, policy documents, and empirical studies on artificial intelligence (AI) in education. The findings reveal that deep learning enables the development of adaptive, data-driven curriculum frameworks capable of responding to diverse learner needs and rapidly evolving knowledge landscapes. Deep learning technologies also support personalized learning pathways by analyzing real-time and historical student data to identify learning gaps, predict performance, and recommend targeted instructional interventions. Moreover, the analysis highlights the role of deep learning in improving instructional practices through intelligent feedback, automated assessment, and enhanced learner engagement. However, several challenges remain, including issues related to data privacy, algorithmic bias, teacher readiness, and infrastructural limitations. Addressing these concerns is essential for ensuring equitable and responsible integration of deep learning technologies. Overall, the study concludes that deep learning offers significant promise for reimagining curriculum and instruction, provided that implementation is carefully planned and supported by appropriate pedagogical, ethical, and institutional frameworks.
- New
- Research Article
- 10.3390/su18020977
- Jan 18, 2026
- Sustainability
- Weikai Yan + 3 more
As the supply chain of the electric vehicle (EV) industry becomes increasingly complex and vulnerable, traditional supplier evaluation methods reveal inherent limitations. These approaches primarily emphasize static performance while neglecting dynamic future risks. To address this issue, this study proposes a comprehensive supplier evaluation model that integrates a hybrid Analytic Hierarchy Process (AHP) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) framework with the Extreme Gradient Boosting (XGBoost) algorithm, contextualized for the EV sector. The hybrid AHP-TOPSIS framework is first applied to rank suppliers based on multidimensional performance criteria, including quality, delivery capability, supply stability and scale. Subsequently, the XGBoost algorithm uses historical monthly data to capture nonlinear relationships and predict future supplier risk probabilities. Finally, a risk-adjusted framework combines these two components to construct a dynamic dual-dimensional performance–risk evaluation system. A case study using real data from an automobile manufacturer demonstrates that the hybrid AHP–TOPSIS model effectively distinguishes suppliers’ historical performance, while the XGBoost model achieves high predictive accuracy under five-fold cross-validation, with an AUC of 0.851 and an F1 score of 0.928. After risk adjustment, several suppliers exhibiting high performance but elevated risk experienced significant declines in their overall rankings, thereby validating the robustness and practicality of the integrated model. This study provides a feasible theoretical framework and empirical evidence for EV enterprises to develop supplier decision-making systems that balance performance and risk, offering valuable insights for enhancing supply chain resilience and intelligence.
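To make the ranking step concrete, the snippet below is a minimal TOPSIS implementation over an illustrative decision matrix: vector normalization, AHP-style weighting, distances to the ideal and anti-ideal solutions, and a closeness score per supplier. The weights, criteria, and data are made up, and the paper's XGBoost risk adjustment is not shown.

```python
# Compact TOPSIS sketch (assumed data and weights; benefit-type criteria only).
import numpy as np

def topsis(matrix: np.ndarray, weights: np.ndarray) -> np.ndarray:
    norm = matrix / np.linalg.norm(matrix, axis=0)        # vector normalisation
    weighted = norm * weights                             # e.g. AHP-derived weights
    ideal, anti = weighted.max(axis=0), weighted.min(axis=0)
    d_pos = np.linalg.norm(weighted - ideal, axis=1)
    d_neg = np.linalg.norm(weighted - anti, axis=1)
    return d_neg / (d_pos + d_neg)                        # closeness in [0, 1]

# 3 suppliers x 4 criteria (quality, delivery, stability, scale), illustrative data
scores = topsis(np.array([[0.9, 0.8, 0.7, 0.6],
                          [0.7, 0.9, 0.8, 0.9],
                          [0.6, 0.6, 0.9, 0.7]]),
                np.array([0.4, 0.3, 0.2, 0.1]))
print(scores.argsort()[::-1])   # supplier ranking, best first
```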
- New
- Research Article
- 10.1177/20515707251406496
- Jan 18, 2026
- Recherche et Applications en Marketing (English Edition)
- Damien Chaney + 1 more
The use of historical data and approaches in marketing research is becoming increasingly common. However, the lack of methodological unity still limits their development and systematic adoption within the field. To address this issue, we begin by defining historical methods as the set of techniques that rely on past sources to collect, verify, and interpret traces or practices, with the aim of providing a contextualized and critical explanation of the phenomena under study. These past data may be primary, secondary, or reconstructed. Building on a theory-to-history perspective, we further structure and clarify the contributions of historical data and approaches to marketing literature. Two dimensions guide our framework – the temporal perspective (diachronic vs synchronic) and the level of analysis (micro vs macro) – which together generate four distinct conceptions: Narrating, Observing, Tracing, and Mapping. Collectively, these conceptions open new avenues for thinking about marketing research over time.
- New
- Research Article
- 10.1002/sd.70593
- Jan 18, 2026
- Sustainable Development
- Afzal Ahmed Dar + 6 more
ABSTRACT Amid the quest for sustainable agriculture, this study explores key ecological and technological factors influencing crop production under climate change. We conduct a comprehensive assessment of temperature, biomass, farmer education, renewable energy devices, greenhouse gas emissions and their effects on rice yields in Granma, Cuba, from 1989 to 2022. The novelty of this study lies in developing precise forecasting models via machine learning techniques, including Bayesian neural network, support vector machines, and pattern recognition neural network that integrate multi‐variable historical data for highly accurate predictions. Furthermore, farmer education plays a vital role in adopting these technologies, enhancing overall productivity. Quantile autoregressive distributed lag analysis reveals that biomass boosts long‐run rice production by 1.54%, while a 1% increase in greenhouse gases reduces it by 0.2%. Rising temperature significantly lowers short‐term yields by 0.48% per 1% increase, which is positively offset by renewable energy devices. This approach not only reduces environmental impact but also ensures long‐term food security in the region. The study's contributions include policy recommendations that align sustainable development goals with strategies to cut emissions through enhanced water management, low‐emission techniques, and low‐energy sources. The adoption of machine learning and smart practices like heat‐tolerant cultivars and controlled fertilization is recommended to mitigate ecological risks. Additionally, investing in education and extension services can empower farmers to implement these practices effectively. For policymakers, key strategies encompass improved water management, low‐emission practices, renewable energy adoption, education investments, and climate‐smart farming practices to foster sustainable rice production. The study highlights the importance of integrating technology with ecological considerations for sustainable agriculture. Overall, our findings provide a roadmap for other regions facing similar challenges.
- New
- Research Article
- 10.1186/s12936-026-05787-2
- Jan 17, 2026
- Malaria journal
- Bilal Ahmad Rahimi + 5 more
According to the World Health Organization (WHO), Afghanistan has the world's seventh-highest reported malaria burden outside Africa. The objectives of this study were to determine the historical trend of malaria in Afghanistan over the past three decades, conduct a forecast analysis using best-fitted predictive models based on historical data, and conduct district-level stratification of malaria endemicity to inform customised control and elimination intervention strategies for ending malaria in Afghanistan. District-level monthly malaria incidence data from 2018 to 2023 were obtained from the Afghanistan Ministry of Public Health. The annual malaria cases from 1990 to 2023 were obtained from the WHO. From 2018 to 2023, the test positivity rate (TPR) was 15.2%, with 96.5% of cases being Plasmodium vivax, 3.2% P. falciparum, and 0.3% mixed infections of P. vivax and P. falciparum. The mean annual number of malaria cases from 2018 to 2023 was 153,295. Nearly one-fourth of total malaria cases occurred in districts along the Pakistan border, which also comprised most of the country's highly endemic areas. In contrast, districts bordering Iran, Turkmenistan, Uzbekistan, Tajikistan, and China together contributed less than 1% of cases. The annual time series of malaria cases in Afghanistan from 1990 to 2023 exhibited a non-linear cyclic trend, with the highest number of cases reported in 2002. Cases then steadily declined until 2013, after which they increased again over the following four years, 2014-2017. Forecasting analysis suggests that the country is unlikely to achieve malaria elimination by 2035 under the current intervention policy. About 43% of the districts with zero or low malaria endemicity are ready for sub-national malaria elimination. Context-specific strategies for vector control and case management are needed to eliminate malaria in moderate- and high-endemic districts. The districts along the Afghan-Pakistan border host the majority of infections and require effective cross-border collaboration between the two countries to meet their malaria elimination goals.
- New
- Research Article
- 10.3390/jrfm19010072
- Jan 16, 2026
- Journal of Risk and Financial Management
- Khalid Jeaab + 3 more
Financial crises increasingly exhibit complex, interconnected patterns that traditional risk models fail to capture. The 2008 global financial crisis, 2020 pandemic shock, and recent banking sector stress events demonstrate how systemic risks propagate through multiple channels simultaneously—e.g., network contagion, extreme co-movements, and information cascades—creating a multidimensional phenomenon that exceeds the capabilities of conventional actuarial or econometric approaches alone. This paper addresses the fundamental challenge of modeling this multidimensional systemic risk phenomenon by proposing a mathematically formalized three-tier integration framework that achieves 19.2% accuracy improvement over traditional models through the following: (1) dynamic network-copula coupling that captures 35% more tail dependencies than static approaches, (2) semantic-temporal alignment of textual signals with network evolution, and (3) economically optimized threshold calibration reducing false positives by 35% while maintaining 85% crisis detection sensitivity. Empirical validation on historical data (2000–2023) demonstrates significant improvements over traditional models: 19.2% increase in predictive accuracy (R2 from 0.68 to 0.87), 2.7 months earlier crisis detection compared to Basel III credit-to-GDP indicators, and 35% reduction in false positive rates while maintaining 85% crisis detection sensitivity. Case studies of the 2008 crisis and 2020 market turbulence illustrate the model’s ability to identify subtle precursor signals through integrated analysis of network structure evolution and semantic changes in regulatory communications. These advances provide financial regulators and institutions with enhanced tools for macroprudential supervision and countercyclical capital buffer calibration, strengthening financial system resilience against multifaceted systemic risks.
- New
- Research Article
- 10.3390/en19020448
- Jan 16, 2026
- Energies
- Xiao Liao + 4 more
With the rapid advancement of artificial intelligence (AI) technology, training deep neural networks has become a core computational task that consumes significant energy in data centers. Researchers often employ various methods to estimate the energy usage of data center clusters or servers to enhance energy management and conservation efforts. However, accurately predicting the energy consumption and carbon footprint of a specific AI task throughout its entire lifecycle before execution remains challenging. In this paper, we explore the energy consumption characteristics of AI model training tasks and propose a simple yet effective method for predicting neural network training energy consumption. This approach leverages training task metadata and applies genetic programming-based symbolic regression to forecast energy consumption prior to executing training tasks, distinguishing it from time series forecasting of data center energy consumption. We have developed an AI training energy consumption environment using the A800 GPU and models from the ResNet{18, 34, 50, 101}, VGG16, MobileNet, ViT, and BERT families to collect data for experimentation and analysis. The experimental analysis of energy consumption reveals that the consumption curve exhibits waveform characteristics resembling square waves, with distinct peaks and valleys. The prediction experiments demonstrate that the proposed method performs well, achieving mean relative errors (MRE) of 2.67% for valley energy, 8.42% for valley duration, 5.16% for peak power, and 3.64% for peak duration. Our findings indicate that, within a specific data center, the energy consumption of AI training tasks follows a predictable pattern. Furthermore, our proposed method enables accurate prediction and calculation of power load before model training begins, without requiring extensive historical energy consumption data. This capability facilitates optimized energy-saving scheduling in data centers in advance, thereby advancing the vision of green AI.
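The prediction step (symbolic regression over task metadata by genetic programming) can be illustrated with a small, assumed example using the gplearn library; the metadata columns, synthetic target, and hyperparameters below are placeholders, not the paper's setup.

```python
# Assumed illustration of GP-based symbolic regression over training-task metadata
# using gplearn; data and columns are synthetic placeholders.
import numpy as np
from gplearn.genetic import SymbolicRegressor

rng = np.random.default_rng(0)
meta = rng.uniform(size=(200, 3))            # e.g. [scaled param count, batch size, epochs]
energy = 3.0 * meta[:, 0] * meta[:, 2] + 0.5 * meta[:, 1] + rng.normal(0, 0.05, 200)

est = SymbolicRegressor(population_size=500, generations=10,
                        function_set=("add", "sub", "mul", "div"), random_state=0)
est.fit(meta, energy)
print(est._program)                          # human-readable formula for energy use
```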
- New
- Research Article
- 10.1175/jamc-d-25-0128.1
- Jan 16, 2026
- Journal of Applied Meteorology and Climatology
- Weixuan Rosa Xu + 3 more
Abstract Bias correction (BC) is commonly used to refine dynamical climate model results for use in applied research. However, uncertainties linked to the BC of multivariate meteorological indices, crucial for many applied research applications, are often underappreciated. Focusing on daily maximum values of the multivariate Heat Index (HI, a function of temperature and relative humidity), this study investigates uncertainties across four BC workflow configurations available to users with varying expertise levels and data availability. Univariate Quantile Delta Mapping (QDM) and Multivariate Bias Correction (MBC) were used to bias-correct nine global climate models trained on 30 years of data from 13 Northeast U.S. weather stations. Two workflows use QDM to bias-correct HI variables directly. Two workflows adopt different component-wise approaches that bias-correct temperature and humidity and compute HI from those bias-corrected outputs. Workflow accuracy is assessed by comparing extreme HI day occurrences against reference station data. The best results are produced by the QDM-direct workflow when hourly historical station data provide accurate daily maximum HI for training. Conversely, the common component-wise QDM (QDMc) workflow, which adjusts temperature and humidity independently, yields errors of up to 147% due to lost inter-variable dependence. The MBCn component-wise approach, which jointly adjusts temperature and humidity, better preserves correlations than QDMc, yielding more reliable results. These findings underscore that BC methodological choices, including the order of operations, training data accuracy, and whether variable dependence is preserved, markedly impact accuracy. This offers practical guidance for researchers and practitioners quantifying climate risks using bias-corrected climate model data.
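For orientation, an additive quantile delta mapping step can be written in a few lines of NumPy, as below. This is an assumed, simplified form of QDM (the study relies on established QDM and MBCn implementations): each projected value keeps its quantile within the model's future distribution and is shifted by the observed-minus-modeled difference at that quantile over the training period.

```python
# Simplified additive QDM sketch (assumed form, not the paper's implementation).
import numpy as np

def qdm_additive(obs_hist, mod_hist, mod_fut):
    # quantile of each projected value within the model's future distribution
    taus = np.array([np.mean(mod_fut <= x) for x in mod_fut])
    # observed-minus-modeled delta at those quantiles from the training period
    delta = np.quantile(obs_hist, taus) - np.quantile(mod_hist, taus)
    return mod_fut + delta   # bias-corrected projection
```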
- New
- Research Article
- 10.1108/jm2-08-2025-0415
- Jan 16, 2026
- Journal of Modelling in Management
- Rashid Khalil
Purpose: This study aims to explore the predictive role of artificial intelligence (AI)-driven sentiment analysis in financial markets by developing a hybrid long short-term memory–Random Forest framework. It investigates whether integrating generative sentiment signals with historical market data can enhance the accuracy and robustness of stock price forecasting and financial predictions across various industry sectors. Design/methodology/approach: This research uses a multisource data set from 2019 to 2024, including stock price data from Yahoo Finance, macroeconomic indicators from Federal Reserve Economic Data and textual sentiment from Reddit, Twitter, Bloomberg and Reuters. Transformer-based natural language processing models, such as FinBERT, are used to quantify sentiment, which is then used as a predictive feature in machine learning models. Granger causality analysis and accuracy metrics are applied to evaluate sectoral variations in sentiment impact. Findings: Empirical analysis reveals that social media sentiment Granger-causes short-term stock movements in the technology and finance sectors, with the hybrid model achieving 68.5% directional accuracy and a 22% reduction in prediction error compared to ARIMA benchmark models. In contrast, sectors such as healthcare and energy show minimal sensitivity to sentiment, underscoring the need for domain-specific strategies. This study also identifies ethical concerns related to sentiment manipulation, transparency and AI governance in financial contexts. Originality/value: This research introduces a reproducible, cross-sectoral forecasting framework that bridges AI, sentiment analysis and finance. The proposed architecture offers practical forecasting enhancements and contributes to the ethical discourse on AI use in high-stakes financial environments, with implications for regulators, analysts and portfolio managers.
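The sentiment-feature step can be sketched with the Hugging Face transformers pipeline and a public FinBERT checkpoint, as below; the model name (ProsusAI/finbert), column names, and the signed daily averaging are illustrative assumptions rather than the study's exact feature engineering.

```python
# Assumed sketch: headlines scored with a FinBERT checkpoint and aggregated into
# a daily sentiment feature for a downstream forecasting model.
import pandas as pd
from transformers import pipeline

finbert = pipeline("text-classification", model="ProsusAI/finbert")

def daily_sentiment(news: pd.DataFrame) -> pd.Series:
    """`news` has columns ['date', 'headline']; returns mean signed score per day."""
    out = finbert(news["headline"].tolist())
    sign = {"positive": 1, "neutral": 0, "negative": -1}
    news = news.assign(score=[sign[r["label"]] * r["score"] for r in out])
    return news.groupby("date")["score"].mean()   # feature for the hybrid LSTM-RF model
```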
- New
- Research Article
- 10.1007/s11356-025-37371-7
- Jan 16, 2026
- Environmental science and pollution research international
- Micaela A Mujica + 9 more
The Conchitas River, located in the southeastern Buenos Aires Metropolitan Area, has experienced severe degradation due to two decades of unregulated urbanization, industrial effluents, and deficient wastewater management. This study aims to assess long-term changes in water quality in the middle basin of the river from 2002 to 2023. We integrated data from four historical studies with new fieldwork conducted in 2022-2023, harmonizing sampling sites, units, and parameters, evaluating physicochemical, bacteriological, and biological parameters, and complementing them with a GIS-based multitemporal land-use analysis. Despite the limited standardization of the historical data, the results revealed an urban expansion since 1998 consistent with downstream declines in water quality, with conductivity rising from 629 to 1765 µS cm-1, dissolved oxygen decreasing from 10.24 to 0.99 mg L-1, and BOD₅ peaking at 123 mg O₂ L-1. Fecal coliforms exceeded legal limits across all sites, and diatom assemblages were dominated by highly tolerant taxa. The Water Quality Index (WQI) declined from 67.7 upstream to 37.3 near the mouth. These trends highlight chronic pollution and the urgent need for integrated watershed management and investment in sanitation infrastructure.