Articles published on Entropy Criterion
- Research Article
- 10.3390/jmse13112055
- Oct 27, 2025
- Journal of Marine Science and Engineering
- Zhou Zhou + 4 more
This study proposes a hybrid sea level prediction model by coupling a dynamically optimized variational mode decomposition (VMD) with a convolutional neural network and bidirectional gated recurrent unit (CNN-BiGRU) model. The VMD decomposition is fine-tuned using the grey wolf optimizer and evaluated via entropy criteria to minimize mode mixing. The resulting components are processed by the CNN-BiGRU to capture spatial features and temporal dependencies, and predictions are reconstructed from the integrated outputs. Validated on monthly sea level data from the Kanmen and Zhapo stations, the model achieves high accuracy, with RMSEs of 13.857 mm and 16.230 mm, MAEs of 10.659 mm and 13.129 mm, and NSEs of 0.986 and 0.980. With a 6-month time step, the proposed strategy effectively captures both periodic and trend signals, demonstrating strong dynamic tracking and error convergence. It significantly improves prediction accuracy and provides reliable support for storm surge warning and coastal management.
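The abstract does not specify which entropy criterion scores the VMD modes, so the following is only a hedged, numpy-only sketch of one common choice in GWO-VMD work: the envelope entropy of a decomposed mode, computed via an FFT-based Hilbert transform. The function name and the toy signals are my own, not the authors'.

```python
import numpy as np

def envelope_entropy(mode, eps=1e-12):
    """Envelope entropy of one decomposed mode (numpy-only Hilbert transform).

    A spiky envelope (clear impulsive or trend features) gives low entropy;
    a flat or noisy envelope gives high entropy, so an optimizer such as
    the grey wolf algorithm can minimize this value over VMD parameters.
    """
    n = len(mode)
    spec = np.fft.fft(mode)
    h = np.zeros(n)                       # analytic-signal frequency mask
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    env = np.abs(np.fft.ifft(spec * h))   # instantaneous envelope
    p = env / (env.sum() + eps)           # normalize to a distribution
    return float(-np.sum(p * np.log(p + eps)))

# A constant-envelope tone maximizes the entropy (ln n); an impulsive
# mode concentrates its envelope and scores much lower.
t = np.arange(1024) / 1024.0
tone = np.cos(2 * np.pi * 50 * t)
impulses = np.zeros(1024)
impulses[100] = 1.0
impulses[600] = 1.0
```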
- Research Article
- 10.1080/02533839.2025.2574012
- Oct 26, 2025
- Journal of the Chinese Institute of Engineers
- M Anisha Vergin + 2 more
The secure storage of sensitive medical data in cloud environments presents significant engineering challenges related to privacy, integrity, and computational efficiency. This paper introduces a novel hybrid encryption framework that integrates Deoxyribonucleic Acid (DNA) based cryptographic encoding with Homomorphic Encryption (HE), enhanced by the Enhanced Seahorse Optimizer (ESHO) for optimized cryptographic key generation. The proposed system transforms medical images into DNA sequences and encrypts them with HE, enabling secure computation without exposing the original content. ESHO selects optimal encryption keys based on entropy and sensitivity criteria, which improves key strength and minimizes computation time. Experimental evaluations using medical datasets demonstrate high entropy (7.99), NPCR (99%), and UACI (33%), as well as strong resilience against noise and cropping attacks. The results indicate that the proposed DNA-HE-ESHO framework offers a robust, efficient, and scalable solution for secure medical image storage in cloud computing environments.
- Research Article
- 10.1038/s41598-025-19360-5
- Oct 9, 2025
- Scientific Reports
- Jay Vora + 5 more
Titanium alloys have exceptional hardness and high toughness, which can pose significant challenges in traditional machining. The wire electrical discharge machining (WEDM) process offers excellent accuracy and high precision compared to conventional machining. The design of experiments (DOE) technique provides a systematic way to conduct experimental runs with the fewest trials, saving time and cost. Thus, the current work focuses on modelling the WEDM process at numerous input process settings using the Taguchi and BBD-RSM approaches. The variable input factors of the WEDM process were pulse-on-time (Ton), pulse current (Ip), and pulse-off-time (Toff), whereas material removal rate (MRR) and surface roughness (SR) were taken as the response measures. The performance and adequacy of the Taguchi and BBD-RSM models were assessed using ANOVA, the coefficient of determination (R2), and residual plots. The effect of the WEDM factors on the performance measures was studied using main effect plots. Based on the entropy criterion, the weights of the MRR and SR responses were computed as 0.52 and 0.48, respectively. The practical tests defined in the DOE, along with the MRR and SR values, were used as inputs to a Naive Bayes (NB) predictive model. The prediction findings indicated the appropriate performance of the NB algorithm. The authors believe that the present study, which compares DOE techniques and their application in predicting process outcomes using a Naive Bayes classifier, will be useful for users in different domains and various applications.
- Research Article
- 10.23939/ictee2025.02.140
- Oct 1, 2025
- Information and communication technologies, electronic engineering
- I Horbatyi + 1 more
The article examines modern methods for recognizing and analyzing radio signals with partial and complete spectral overlap, which represent one of the key and most complex problems in the field of radio monitoring, telecommunications, and technical intelligence. It describes the theoretical foundations and practical aspects of applying the Fast Fourier Transform method, the principal component analysis method, and the independent component analysis method for separating, identifying, and classifying signals in complex conditions. The fast Fourier transform method demonstrated high efficiency in processing partially overlapping signals, as it enables the determination of main frequency components even at a low signal-to-noise ratio and in the presence of significant interference. For complete spectral overlap, a combined approach is proposed that integrates principal component analysis and independent component analysis, providing preliminary signal decorrelation and subsequent separation according to the statistical independence criterion. A key improvement is the introduction of the spectral entropy criterion, which is based on assessing the level of randomness of the signal’s energy distribution in the frequency domain. High entropy values indicate significant noisiness or random structure, while low values indicate the presence of pronounced frequency components. Using this criterion makes it possible to automatically select the most informative components, removing insignificant noise components and reducing computational costs. A series of numerical experiments was carried out, a quantitative assessment of signal separation accuracy was performed, and the influence of noise level and degree of spectrum overlap on the final results was analyzed. The proposed approach can be adapted for a wide range of tasks, including automated technical monitoring systems, radio intelligence systems, and tools for detecting low-visibility signals. 
The results confirm the feasibility of its use in adaptive next-generation digital signal processing systems and its potential for the development of intelligent radio monitoring algorithms.
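The spectral entropy criterion described above is compact enough to state directly. The following numpy-only sketch (function name and test signals are mine, not from the article) computes the Shannon entropy of a signal's normalized power spectrum, high for noise-like spectra and low for signals with pronounced frequency components:

```python
import numpy as np

def spectral_entropy(signal, eps=1e-12):
    """Normalized spectral entropy of a 1-D signal.

    Shannon entropy of the normalized power spectrum: values near 1 mean
    a noise-like (flat) spectrum, values near 0 mean a few pronounced
    frequency components.
    """
    power = np.abs(np.fft.rfft(signal)) ** 2
    p = power / (power.sum() + eps)        # spectral probability distribution
    h = -np.sum(p * np.log2(p + eps))      # Shannon entropy in bits
    return float(h / np.log2(len(p)))      # normalize to [0, 1]

# A pure tone concentrates energy in one bin; white noise spreads it.
t = np.arange(1024) / 1024.0
tone = np.sin(2 * np.pi * 50 * t)
noise = np.random.default_rng(0).standard_normal(1024)
```

Thresholding this value is one way to automate the selection of informative components after PCA/ICA separation, as the abstract describes.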
- Research Article
- 10.3390/e27100998
- Sep 24, 2025
- Entropy
- Pavel Lysenko + 3 more
The paper considers the problem of detecting and classifying acoustic signals based on information (entropy) criteria. A number of new information features based on time-frequency distributions are proposed, including the spectrogram and its upgraded version, the reassigned spectrogram. To confirm and verify the proposed characteristics, modeling on synthetic signals and numerical verification of the solution of the multiclass classification problem based on machine learning methods on real hydroacoustic recordings are carried out. The high classification results obtained demonstrate the advantages of using the proposed characteristics.
- Research Article
- 10.33096/ilkom.v17i2.2585.186-195
- Aug 20, 2025
- ILKOM Jurnal Ilmiah
- Mardewi Mardewi + 3 more
This study evaluates the performance of Decision Tree methods in classification, utilizing three different criteria: Entropy, Gini, and Log Loss. The objective is to determine which criterion is most effective in achieving high classification accuracy using prescription data from the UCI repository, comprising 3,424 prescription records with 67 variables. The analysis results show that the Entropy criterion delivers the best performance with an accuracy of 79.1%, followed by the Gini criterion at 78%, and the Log Loss criterion at 77.9%. These findings indicate that the Entropy criterion is superior in reducing uncertainty and capturing the underlying data structure, while both Gini and Log Loss criteria also provide competitive, though slightly lower, results. The main contribution of this research is a comparative evaluation of decision tree criteria using real-world prescription data to support accurate classification of medication adherence, which can be beneficial for developing intelligent pharmacy systems. This research offers valuable insights into the effectiveness of various criteria within the Decision Tree method and can aid in selecting the most appropriate criterion for future classification applications.
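For reference, the entropy and Gini criteria compared in this study can be computed directly from the class labels at a tree node. A minimal numpy sketch (function names are mine; in scikit-learn these correspond to `criterion='entropy'` and `criterion='gini'` of `DecisionTreeClassifier`):

```python
import numpy as np

def entropy_impurity(labels, eps=1e-12):
    """Shannon entropy (bits) of the class distribution at a node."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p + eps)))

def gini_impurity(labels):
    """Gini impurity of the class distribution at a node."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(1.0 - np.sum(p ** 2))

# A pure node scores zero under both criteria; a maximally mixed binary
# node scores 1.0 (entropy) and 0.5 (Gini).
pure = np.array([1, 1, 1, 1])
mixed = np.array([0, 0, 1, 1])
```

A split is chosen to maximize the reduction of the selected impurity from parent to children, which is where the criteria can diverge in practice.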
- Research Article
- 10.3390/rs17152699
- Aug 4, 2025
- Remote Sensing
- Jianmin Hu + 7 more
In order to achieve 0.1 m resolution and fully polarimetric observation capabilities for airborne SAR systems, the adoption of stepped-frequency modulation waveform combined with the polarization time-division transmit/receive (T/R) technique proves to be an effective technical approach. Considering the issue of range resolution degradation and paired echoes caused by multichannel amplitude–phase mismatch in fully polarimetric airborne SAR with 0.1 m resolution, an amplitude–phase error estimation algorithm based on echo data is proposed in this paper. Firstly, the subband amplitude spectrum correction curve is obtained by the statistical average of the subband amplitude spectrum. Secondly, the paired-echo broadening function is obtained by selecting high-quality sample points after single-band imaging and the nonlinear phase error within the subbands is estimated via Sinusoidal Frequency Modulation Fourier Transform (SMFT). Thirdly, based on the minimum entropy criterion of the synthesized compressed pulse image, residual linear phase errors between subbands are quickly acquired. Finally, two-dimensional cross-correlation of the image slice is utilized to estimate the positional deviation between polarization channels. This method only requires high-quality data samples from the echo data, then rapidly estimates both intra-band and inter-band amplitude/phase errors by using SMFT and the minimum entropy criterion, respectively, with the characteristics of low computational complexity and fast convergence speed. The effectiveness of this method is verified by the imaging results of the experimental data.
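The minimum entropy criterion used here for residual phase errors rates image focus by the entropy of the normalized image intensity. A minimal numpy sketch of that scoring function (not the authors' implementation; the toy patches are hypothetical):

```python
import numpy as np

def image_entropy(img, eps=1e-12):
    """Entropy of the normalized intensity of a (possibly complex) image patch.

    A well-focused image concentrates energy in few pixels (low entropy);
    defocus from residual phase errors spreads it (high entropy), so the
    minimum-entropy criterion selects the phase estimate that minimizes
    this value.
    """
    power = np.abs(img) ** 2
    p = power / (power.sum() + eps)   # intensity as a probability distribution
    return float(-np.sum(p * np.log(p + eps)))

# Toy comparison: a single bright scatterer vs. a fully smeared patch.
focused = np.zeros((32, 32))
focused[16, 16] = 1.0
defocused = np.ones((32, 32))
```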
- Research Article
- 10.1016/j.vaccine.2025.127536
- Aug 1, 2025
- Vaccine
- Polya Genova + 3 more
Trust in government, science, and vaccine confidence in Southeast Asia: A latent profile analysis.
- Research Article
- 10.5875/ausmt.v7i2.1201
- Jul 9, 2025
- International Journal of Automation and Smart Technology
- Vahid Kazemi Golbaghi + 3 more
There are several techniques that can be used to determine the condition of a rolling element bearing. In this paper, vibration analysis is used to conduct fault diagnosis of a bearing. Vibration signal noise was eliminated using hard-thresholding wavelet analysis. The best mother wavelet for the denoising process was selected using the minimum Shannon entropy criterion. Statistical parameters and other signal properties such as energy and entropy are powerful tools for analyzing vibration signals. These features were calculated in the time and wavelet domains and applied to Artificial Neural Networks (ANNs) as the feature vector to classify the condition of a bearing into one healthy and three faulty conditions. The ANN parameters were separately optimized using three optimization algorithms. Comparison of the results shows that, when the ANN parameters are properly optimized, the statistical parameters in the time-frequency domain yield the best accuracy.
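The minimum Shannon entropy selection rule above can be sketched without committing to a wavelet library: score each candidate decomposition by the entropy of its coefficient energy distribution and keep the minimum. The candidate coefficient vectors below are hypothetical stand-ins for the output of a wavelet transform (e.g. `pywt.wavedec`), not data from the paper:

```python
import numpy as np

def coeff_entropy(coeffs, eps=1e-12):
    """Shannon entropy of the energy distribution of wavelet coefficients.

    Lower entropy means the decomposition concentrates signal energy in
    few coefficients, i.e. the mother wavelet matches the signal better.
    """
    energy = np.asarray(coeffs, dtype=float) ** 2
    p = energy / (energy.sum() + eps)
    return float(-np.sum(p * np.log2(p + eps)))

def select_wavelet(candidates):
    """Pick the candidate whose coefficients minimize Shannon entropy."""
    return min(candidates, key=lambda name: coeff_entropy(candidates[name]))

# Hypothetical coefficient vectors for two candidate mother wavelets:
candidates = {
    "db4":  np.array([9.0, 0.1, 0.1, 0.1]),   # energy concentrated -> low entropy
    "sym5": np.array([2.0, 2.1, 1.9, 2.0]),   # energy spread -> high entropy
}
```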
- Research Article
- 10.1038/s41598-025-07644-9
- Jul 3, 2025
- Scientific Reports
- Serena Castellotti + 3 more
The human visual system processes a massive amount of visual information very rapidly, requiring efficient coding mechanisms to handle such data within physiological constraints. The biological foundations of these mechanisms remain poorly understood. One hypothesis suggests that the visual system prioritizes the encoding of specific features of natural scenes optimized to maximize information transfer while minimizing computational costs (constrained-maximum entropy criteria). This study aims to identify a possible neural marker of this prioritizing mechanism. Participants were briefly shown stimuli with varying proportions of such optimal features, while EEG visual evoked response was recorded. Analysis focused on the C1 component, the earliest visual evoked component commonly considered to mainly reflect the first cortical response in the primary visual cortex (V1). Results revealed that the C1 component peaks earlier when elicited by optimal features, with a proportional speed-up following the gradual increase of their number. This provides evidence for an early efficient selection of optimally informative visual features in humans.
- Research Article
- 10.3390/machines13070558
- Jun 27, 2025
- Machines
- Ruibin Gao + 4 more
To address the issues of information redundancy, limited feature representation, and empirically set parameters in rolling bearing fault diagnosis, this paper proposes a Multi-Entropy Screening and Optimization Temporal Convolutional Network (MESO-TCN). The method integrates feature filtering, network modeling, and parameter optimization into a unified diagnostic framework. Specifically, ensemble empirical mode decomposition (EEMD) is combined with a hybrid entropy criterion to preprocess the raw vibration signals and suppress redundant noise. A kernel-extended temporal convolutional network (ETCN) is designed with multi-scale dilated convolution to extract diverse temporal fault patterns. Furthermore, an improved whale optimization algorithm incorporating a firefly-inspired mechanism is introduced to adaptively optimize key hyperparameters. Experimental results on datasets from Xi’an Jiaotong University and Southeast University demonstrate that MESO-TCN achieves average accuracies of 99.78% and 95.82%, respectively, outperforming mainstream baseline methods. These findings indicate the method’s strong generalization ability, feature discriminability, and engineering applicability in intelligent fault diagnosis of rotating machinery.
- Research Article
- 10.36622/1729-6501.2025.21.2.011
- Jun 25, 2025
- ВЕСТНИК ВОРОНЕЖСКОГО ГОСУДАРСТВЕННОГО ТЕХНИЧЕСКОГО УНИВЕРСИТЕТА
- Г.В Петрухнова + 2 more
The article considers the optimization of control tests for digital integrated circuits by the random search method. The tests are aimed at detecting "stuck-at" and "short circuit" faults. We present an algorithm based on generating random test sequences and evaluating their effectiveness using entropy criteria. The digital circuit is treated as a "black box" model whose outputs are determined from logical equations or specifications. We describe an algorithm for optimizing the length of pseudorandom control tests for digital circuits based on the maximum entropy principle and a random search method that backtracks after an unsuccessful step. One of the criteria has not been used in such problems before and constitutes the scientific novelty of the work. In this problem, the objective function depends on the input probabilities and is a random function whose explicit form is unknown. Since its values and its derivative cannot be computed directly, zero-order numerical methods, which rely only on observed values of the objective function, must be used to solve the optimization problem. Optimization proceeds by selecting the weights of the input signals, which determine how often a logical one is applied to each circuit input, and analyzing the corresponding output responses. The presented approach finds tests that maximally cover possible "stuck-at" and "short circuit" faults, thereby improving the reliability of defect detection. The analysis was carried out on several digital circuits of varying complexity. The article contains a description of the proposed algorithm, an analysis of its effectiveness, and comparative tables of optimization and testing results. The algorithm can be used to study the optimization of control tests for various digital circuits.
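The maximum-entropy random search described above can be sketched in a few lines: treat the circuit as a black box, estimate the Shannon entropy of its output empirically for a given vector of input-1 probabilities, and accept random steps only when the entropy improves (backtracking otherwise). This is a hedged illustration with my own names and a toy 3-input AND gate, not the article's algorithm or circuits:

```python
import numpy as np

def output_entropy(circuit, weights, n_trials=2000, rng=None):
    """Empirical Shannon entropy (bits) of a black-box circuit's output
    when input i is driven to logic 1 with probability weights[i]."""
    if rng is None:
        rng = np.random.default_rng(0)
    inputs = rng.random((n_trials, len(weights))) < weights
    _, counts = np.unique([circuit(row) for row in inputs], return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)))

def random_search(circuit, n_inputs, steps=50, sigma=0.1, seed=1):
    """Zero-order random search over input weights, backtracking
    (keeping the old point) whenever a step fails to raise the entropy."""
    rng = np.random.default_rng(seed)
    w = np.full(n_inputs, 0.5)
    best = output_entropy(circuit, w, rng=rng)
    for _ in range(steps):
        cand = np.clip(w + rng.normal(0.0, sigma, n_inputs), 0.01, 0.99)
        h = output_entropy(circuit, cand, rng=rng)
        if h > best:          # accept only improving steps
            w, best = cand, h
    return w, best

# Toy black box: a 3-input AND gate. Its output entropy peaks at 1 bit
# when each input weight is near 0.5 ** (1/3), about 0.794.
and3 = lambda bits: bool(bits.all())
```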
- Research Article
- 10.3390/stats8030049
- Jun 20, 2025
- Stats
- Jianping Hao + 1 more
Traditional methods for mission reliability assessment under operational testing conditions exhibit several limitations, including coarse modeling granularity, significant parameter estimation biases, and inadequate adaptability for handling heterogeneous test data. To address these challenges, this study establishes an assessment framework using a vehicular missile launching system (VMLS) as a case study. The framework constructs phase-specific reliability block diagrams based on mission profiles and establishes mappings between data types and evaluation models. It integrates the maximum entropy criterion with monotonically decreasing reliability constraints, develops a covariate-embedded Bayesian data fusion model, and proposes a multi-path weight adjustment assessment method. Simulation and physical testing demonstrate that, compared with conventional methods, the proposed approach achieves superior accuracy and precision in parameter estimation. It enables mission reliability assessment under practical operational testing constraints while providing methodological support to overcome the traditional assessment paradigm, which overemphasizes performance verification while neglecting operational capability development.
- Research Article
- 10.1109/taes.2025.3529410
- Jun 1, 2025
- IEEE Transactions on Aerospace and Electronic Systems
- Xuxin Wang + 3 more
Robust Kalman Filter and Smoother based on the Student's t Minimum Error Entropy Criterion
- Research Article
- 10.3390/en18071855
- Apr 7, 2025
- Energies
- Bao Wang + 5 more
Short-term load is influenced by multiple external factors and shows strong nonlinearity and volatility, which increases the forecasting difficulty. However, most existing short-term load forecasting methods rely solely on the original load data or take into account a single external factor, which results in significant forecasting errors. To improve the forecasting accuracy, this paper proposes a short-term load forecasting method considering multiple contributing factors based on VAR and CEEMDAN-CNN-BiLSTM. Firstly, multiple contributing factors strongly correlated with the short-term load are selected based on Spearman correlation analysis, the vector autoregressive (VAR) model with multivariate input is derived, and the Levenberg–Marquardt algorithm is introduced to estimate the model parameters. Secondly, the complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN) algorithm and the permutation entropy (PE) criterion are combined to decompose and reconstruct the original load data into multiple relatively stationary mode components, which are respectively input into the CNN-BiLSTM network for forecasting. Finally, the sine–cosine and Cauchy mutation sparrow search algorithm (SCSSA) is used to optimize the parameters of the combined model to improve the forecasting accuracy. Simulation results on actual Australian data validate the forecasting accuracy of the proposed model, achieving reductions in root mean square error of 31.21% and 18.04% compared to the VAR and CEEMDAN-CNN-BiLSTM models, respectively.
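The permutation entropy criterion used to judge whether reconstructed mode components are stationary enough has a standard definition (Bandt and Pompe). A self-contained numpy sketch, with my own function name and toy series:

```python
import numpy as np
from math import factorial

def permutation_entropy(x, order=3, delay=1):
    """Normalized permutation entropy of a 1-D series.

    Counts ordinal patterns of length `order`; returns a value in [0, 1],
    low for regular series and high for noise-like ones.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (order - 1) * delay
    patterns = np.array([np.argsort(x[i:i + order * delay:delay])
                         for i in range(n)])
    _, counts = np.unique(patterns, axis=0, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log2(p)) / np.log2(factorial(order)))

# A monotone ramp has one ordinal pattern (entropy 0); white noise
# visits all order! patterns almost uniformly (entropy near 1).
ramp = np.arange(100.0)
noise = np.random.default_rng(0).standard_normal(1000)
```

Grouping CEEMDAN modes with similar permutation entropy is one common way to perform the decompose-and-reconstruct step the abstract describes.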
- Research Article
- 10.1007/s00467-025-06764-8
- Apr 3, 2025
- Pediatric nephrology (Berlin, Germany)
- Jian-An Wang + 5 more
Intradialytic hypotension (IDH) is associated with mortality in adults undergoing intermittent hemodialysis, but this relationship is unclear in critically ill children receiving continuous kidney replacement therapy (CKRT). We aimed to evaluate the relationship between IDH and hospital mortality and whether pressure data from dialysis machines could predict IDH. We conducted a retrospective cohort study in a tertiary pediatric intensive care unit and NICU from December 2019 to July 2022, including 23 patients across 38 admissions (median age 10 years). IDH proportion was significantly associated with mortality (risk ratio [RR]: 4.40, 95% confidence interval [CI]: 1.22-15.90, p = 0.02). Random Forest models using entropy or Gini criteria demonstrated high sensitivity. The CatBoost model achieved the highest average F1-score and area under the receiver operating characteristic (ROC) curve (AUC) (88.18% and 86.6% with and without dialysis settings, respectively). Local Interpretable Model-agnostic Explanations (LIME) indicated that dialysis machine-derived time-series pressure parameters were critical predictive features for IDH, whereas blood pressure-related variables were not among the top predictors. Dialysis machine-derived pressure parameters may serve as effective predictive markers for IDH, which is associated with increased mortality. These findings support the potential of integrating pressure data in the early detection and management of IDH in pediatric CKRT patients.
- Research Article
- 10.1109/tcbbio.2025.3531938
- Mar 1, 2025
- IEEE transactions on computational biology and bioinformatics
- Cheng Wang + 6 more
Computational methods for predicting drug-target binding affinity (DTA) are critical for large-scale screening of prospective therapeutic compounds during drug discovery. Deep neural networks (DNNs) have recently shown significant promise for DTA prediction. By leveraging available data for training, DNNs can expand the use of DTA prediction to situations where only sequence information is available for potential drug molecules and their targets, and there is no prior knowledge regarding the molecular geometric conformations. We propose DHAG-DTA, a general dynamic hierarchical affinity graph DNN approach, for DTA prediction using molecular sequence information and already known drug-target interactions. DHAG-DTA introduces a two-level hierarchical graph structure: at the upper level, interactions between drug and target molecules are represented via an affinity graph and at the lower level, embedded molecular graphs represent interactions within the individual molecules. This allows for integration of information from both inter and intra molecular interactions for DTA prediction, which has also been addressed in other recent independent work. The fundamental innovations introduced by DHAG-DTA include: (a) a single overall hierarchical graph that allows better assimilation of information during the learning process compared with loosely-coupled individual graphs, (b) dynamic determination of the affinity graph structure via the introduction of unlabeled edges and a maximum entropy criterion for active edge selection, (c) skip connections in the DNN for fusing intra and inter molecular information, and (d) fusion of both model-based and similarity-based feature embeddings to get robust embeddings of unseen molecules. Experimental results on two common benchmark datasets demonstrate that DHAG-DTA outperforms other existing models on multiple evaluation metrics, achieving state-of-the-art performance.
- Research Article
- 10.1016/j.geits.2025.100292
- Mar 1, 2025
- Green Energy and Intelligent Transportation
- Chen Chen + 5 more
State of charge estimation for lithium-ion batteries using an adaptive cubature Kalman filter based on improved generalized minimum error entropy criterion
- Research Article
- 10.1063/5.0244768
- Jan 8, 2025
- Journal of Applied Physics
- Zhenjun Shao + 9 more
The medium- or high-entropy strategy has emerged as a new paradigm for designing high-performance piezoelectric ceramics. However, the effectiveness of this approach for developing high Curie temperature (TC) piezo-/ferroelectric materials with outstanding performance remains unclear. To develop high-performance piezo-/ferroelectric materials suitable for high-temperature environments, in this work, we design a novel ceramic system based on a medium-entropy morphotropic phase boundary (ME-MPB) strategy. Piezo-/ferroelectric ceramics of the formula Pb(Yb1/2Nb1/2)O3–Pb(In1/2Nb1/2)O3–PbTiO3, meeting the medium-entropy criteria, were successfully synthesized using the conventional solid-state reaction method. The crystal structure, microstructure, dielectric, piezoelectric, and ferroelectric properties of the ceramics of the ME-MPB compositions were systematically investigated. X-ray diffraction and scanning electron microscopy analyses revealed that these ceramics possess a pure perovskite phase and dense microstructure. Notably, the prepared ceramics exhibited exceptional piezoelectric performance, with a high d33 up to 603 pC/N, a large strain of 0.20%, a high remanent polarization of 44.0 μC/cm2, and a high Curie temperature of 362 °C. This study demonstrates an effective design approach based on the ME-MPB strategy and points out a new pathway for developing high-performance materials for high-temperature applications such as sensors, thereby expanding the research perspective on the design of medium-entropy piezo-/ferroelectric ceramics.
- Research Article
- 10.1093/jas/skaf221
- Jan 4, 2025
- Journal of Animal Science
- Agnes Nyamiel + 7 more
The ability of ruminants to cope with energy imbalances through alternating body reserves (BR) mobilization and accretion is a key mechanism to improve animals’ resilience and/or robustness. This study aimed to characterize individual variability in BR dynamics using plasma concentrations of key biomarkers in productive ewes. Non-esterified fatty acids, β-hydroxybutyrate, triiodothyronine, insulin, and body condition score traits were monitored longitudinally throughout the productive cycles. The study included primiparous and multiparous ewes, reared under 2 contrasting farming systems (FS), indoor (173 ewes) and outdoor (234 ewes), belonging to 2 cohorts (Coh17/18). We used functional principal component analysis and unsupervised clustering to capture biomarker variation. The optimal number of clusters was selected using the Bayesian information criterion (BIC), integrated complete-data likelihood (ICL), normalized entropy criterion (NEC), and a minimum cluster size of >5% of ewes. The decrease in BR indicated that BR mobilization occurred from mid-pregnancy (P) until weaning (W), regardless of FS or parity (Par). On the contrary, the increase in BR suggested that BR accretion occurred from W until the next P in both FS. Between 2 and 3 distinct trajectories were identified for each biomarker (BIC < 3214.38; ICL < 3237.82; NEC < 6.450), depending on Par and/or FS. Most trajectories were characterized by transient increases in biomarker concentrations during the BR mobilization period, followed by declines that sometimes continued into the BR accretion phase. Such trajectories differed mainly in biomarker concentrations at different stages and/or the time point when peaks were observed. Greater individual variability in biomarker trajectories was particularly observed around lambing. Maintaining similar trajectories across cycles in major clusters for 53% to 100% of ewes suggested that biomarker trajectories might be repeatable. 
In addition to individual variability, BR levels, their temporal changes, and litter size contributed to the distribution of ewes across clusters for each of the traits, with low (P ≤ 0.05), moderate (P < 0.01), and high (P < 0.001) levels of significance. These findings highlight the potential of plasma biomarkers for characterizing individual variability in BR variations in ruminants reared in different FS conditions.