Articles published on Information Entropy
8073 Search results
- New
- Research Article
- 10.1088/1361-6501/ae2cb5
- Jan 7, 2026
- Measurement Science and Technology
- Dingshen Zhang + 4 more
Abstract With the development of three-dimensional laser scanning technology, high-density point clouds provide a reliable data source, but they also contain a large amount of redundant information, which increases the storage and computation burden of data processing. A point cloud simplification method that compresses the data while maintaining the integrity of the geometric structure is therefore urgently needed. In this paper, we propose a density-aware sampling strategy built on a grid structure, employing a density histogram to dynamically adjust the number of locally sampled points and thereby achieve both global and local density uniformity. To preserve the overall geometric characteristics of the original point cloud, a curvature-based weighting factor is incorporated into the local farthest point sampling (LFPS) framework, with the point of maximum curvature selected as the initial sampling point to guide the extraction of key features. To ensure geometric continuity in the simplified results, the point cloud surface is reconstructed using the moving least squares (MLS) method, followed by uniform sampling on the fitted surface. The outcomes of the two sampling strategies are then integrated to generate the final simplified point cloud. Experimental results show that, compared with existing methods such as AIVS, GF-Sim, GP-PCS, FPS, and curvature-based simplification, the proposed method offers better fidelity and robustness in terms of running time, information entropy, point spacing, geometric error, and reconstruction quality on multiple public point cloud datasets, making it suitable for large-scale point cloud processing and high-quality modeling tasks.
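The curvature-weighted farthest point sampling step lends itself to a short sketch. Below is an illustrative implementation of greedy farthest point sampling with an optional per-point weight standing in for the abstract's curvature factor; the function name and the multiplicative weighting form are assumptions, not the authors' code:

```python
import numpy as np

def farthest_point_sampling(points, k, weights=None, seed_idx=0):
    """Greedy farthest point sampling over an (N, 3) point cloud.

    `weights` (e.g. per-point curvature estimates) optionally scales each
    candidate's distance so high-curvature points are favoured, in the
    spirit of the curvature-weighted LFPS described in the abstract.
    """
    points = np.asarray(points, dtype=float)
    n = len(points)
    if weights is None:
        weights = np.ones(n)
    selected = [seed_idx]
    # distance of every point to the nearest selected point so far
    dist = np.linalg.norm(points - points[seed_idx], axis=1)
    for _ in range(k - 1):
        nxt = int(np.argmax(dist * weights))
        selected.append(nxt)
        dist = np.minimum(dist, np.linalg.norm(points - points[nxt], axis=1))
    return selected
```

With uniform weights this reduces to classic FPS; larger weights at high-curvature points bias selection toward sharp features.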
- New
- Research Article
- 10.58962/2708-4809.siuty.2026.05
- Jan 5, 2026
- Spiritual and intellectual upbringing and teaching of youth in the XXI century
- A V Fain + 1 more
The article is devoted to a comprehensive theoretical and methodological analysis of empathy and reflection as key metacompetencies within the system of spiritual and intellectual education for students. It is substantiated that, in the context of accelerated sociocultural dynamics, informational entropy, and increasing psycho-emotional load, these phenomena acquire the status of fundamental regulators of personal maturity, the capacity for moral autonomy, and constructive social interaction. The article explores interdisciplinary approaches to understanding empathy and reflection, encompassing classical philosophical and psychological concepts (C. Rogers, L.S. Vygotsky) and contemporary neurobiological arguments, specifically the functional significance of mirror neurons for affective resonance and the role of the prefrontal cortex in cognitive regulation and moral judgments. The synergistic interaction of empathy and reflection in the formation of Spiritual Intelligence (SQ) (D. Zohar, I. Marshall) is analyzed in detail. SQ serves as an integrative basis for value orientation and existential resilience. It is argued that empathy ensures decentration and sensitivity to the Other, while reflection creates an internal anchor for self-reflection, meaning-making, and behavioral correction, which is critical for the transition to the post-conventional level of morality (L. Kohlberg) and the prevention of professional burnout. The criteria and levels of formation for empathic-reflective competencies are operationalized, covering the cognitive (accuracy of emotion identification, Theory of Mind), affective (emotional regulation), and conative (prosocial activity, behavior correction) dimensions. The necessity of creating a psycho-emotionally safe educational environment is highlighted as a key psycho-pedagogical condition, especially for students facing social vulnerability. 
The empirical part of the research was conducted in a teaching and rehabilitation institution on a sample of 50 students, of which 80% (40 individuals) had a disability, and 20% (10 individuals) had the status of an orphan or were deprived of parental care. Diagnostic tools included an emotion recognition test, structured reflective diaries, and the emotional intelligence test by N. Gole. A compensatory mechanism of competency development was identified: students with disabilities showed higher indicators of affective empathy, while orphans demonstrated a higher level of reflection and sensitivity to moral dilemmas, which is interpreted as the formation of an internal support system. The methodological toolkit for development is separately systematized and justified, including empathic communication training, moral case analysis, structured reflective sessions, project activities, and Mindfulness practices. The effectiveness of these methods was confirmed by the dynamics of growth in cognitive decentration and ethical autonomy among students. It is concluded that the seamless integration of these competencies into the educational process is a strategic prerequisite for preparing internally integral, socially responsible specialists and their successful professional realization. Future research perspectives involve developing longitudinal programs for Spiritual Intelligence development and validated diagnostic methodologies.
- New
- Research Article
- 10.1016/j.ins.2025.122713
- Jan 1, 2026
- Information Sciences
- Ze Yang + 6 more
Complex network evolution with node strategies driven by information entropy
- New
- Research Article
- 10.5267/j.ijiec.2025.12.006
- Jan 1, 2026
- International Journal of Industrial Engineering Computations
- Cenyu Hu + 1 more
In modern information warfare, the assessment of ammunition lethality has evolved from single-dimensional evaluations of hit accuracy to multidimensional, multiphase analyses of damage effectiveness. However, high-tech munition testing is hindered by exorbitant costs, limited sample sizes, and significant uncertainty, rendering traditional binomial or multinomial probability models inadequate. These conventional models either oversimplify damage states (compromising accuracy) or introduce prohibitive computational complexity (impeding practical application). To address these limitations, this paper proposes a Bayesian multi-stage binomial modeling approach for multi-level damage assessment under small-sample conditions. The multinomial representation of discrete damage categories is decomposed into a series of conditional binomial distributions aligned with progressive thresholds ("mild or above", "moderate or above", "severe or above", and "complete destruction"), thereby enabling low-dimensional modeling without sacrificing damage granularity and significantly enhancing computational tractability. To construct robust prior distributions, physical simulation results and expert domain knowledge are fused using Dempster–Shafer (D-S) evidence theory. The reliability of this fused information is further validated via a consistency test that integrates the Riemannian manifold of Fisher information and quantum entanglement entropy, mitigating the subjectivity biases inherent in expert judgments. Leveraging conjugate prior properties and Gibbs sampling within the Markov chain Monte Carlo (MCMC) framework, the posterior distribution of each damage level is obtained with high precision despite limited data availability.
Comparative experiments demonstrate that the proposed method achieves superior convergence stability, estimation accuracy, and computational efficiency over conventional binomial and multinomial approaches, and provides a more comprehensive and precise tool for evaluating ammunition damage effectiveness, with direct implications for operational decision-making in information warfare.
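The core decomposition of an ordered multinomial into conditional binomials with conjugate Beta updates can be illustrated briefly. This is a sketch of the general technique under an assumed flat Beta(a, b) prior per stage, not the paper's full D-S fusion or MCMC pipeline:

```python
from fractions import Fraction

def conditional_binomial_posteriors(counts, a=1, b=1):
    """Decompose ordered damage-level counts into conditional binomials.

    counts[i] = trials that ended at level i (0 = no damage ... worst).
    Stage j models P(level >= j | level >= j-1); with a Beta(a, b) prior
    each stage has conjugate posterior Beta(a + s_j, b + f_j), where s_j
    counts trials reaching level j or above and f_j those stopping at j-1.
    """
    total = sum(counts)
    posteriors = []
    at_least = total
    for j in range(1, len(counts)):
        s = sum(counts[j:])   # reached level j or above
        f = at_least - s      # stopped at level j-1
        posteriors.append((a + s, b + f))
        at_least = s
    return posteriors

def posterior_mean(ab):
    a, b = ab
    return Fraction(a, a + b)
```

Each stage is a low-dimensional Beta-binomial problem, which is what makes the decomposition tractable under small samples.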
- New
- Research Article
- 10.7498/aps.75.20251306
- Jan 1, 2026
- Acta Physica Sinica
- Song Run + 3 more
Recent advances in crosstalk simulation using integer-order memristive synapses have shown considerable progress. However, most existing models still employ a single-memristor structure, which constrains synaptic weight modulation and makes it difficult to represent both excitatory and inhibitory synaptic connections in a unified manner. These models also often fail to capture the memory effects and nonlocal dynamic properties inherent in biological neurons. To address these issues, this study introduces a fractional-order memristive bridge synapse model for crosstalk coupling. By combining Hindmarsh–Rose (HR) and FitzHugh–Nagumo (FN) neurons, we construct an 8D heterogeneous coupled neural network based on fractional calculus, designated the Fractional-Order Memristive Bridge Crosstalk-Coupled Neural Network (FMBCCNN). A major innovation is the incorporation of a fractional-order memristive bridge structure that mimics synaptic connections in a bridge configuration. This design provides both historical memory characteristics and bidirectional synaptic weight regulation, overcoming limitations of traditional coupling forms.
Using dynamical analysis tools such as phase portraits, bifurcation diagrams, and Lyapunov exponents, we systematically investigate how synaptic and crosstalk strengths influence system behavior under conventional fractional-order conditions. The results reveal diverse dynamical behaviors, including attractor coexistence, forward and reverse period-doubling bifurcations, and chaotic crises. Further analysis under the more generalized condition of non-uniform fractional orders shows that, compared with the conventional case, the system maintains continuous periodic motion over broader parameter ranges and exhibits clear parameter hysteresis. Although local dynamic patterns remain similar, the corresponding parameter intervals are substantially widened. In addition, the system displays more concentrated and marked alternation between periodic and chaotic behaviors. We also simulate the effect of varying the fractional-order derivative, offering a more general mathematical characterization of neuronal firing activity.
Finally, the chaotic sequences generated by the system are applied to an image encryption algorithm incorporating bit-plane decomposition and DNA encoding. Security analysis confirms that the encrypted images have pixel correlation coefficients below 0.01 in the horizontal, vertical, and diagonal directions, information entropy greater than 7.999, and a key space of 2^2080. These results verify the excellent encryption performance and reliability of the proposed scheme and the generated sequences.
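The information entropy figure quoted here (greater than 7.999 bits for an 8-bit image) is the standard Shannon entropy of the pixel histogram, which can be computed as follows (a generic sketch, not the authors' evaluation code):

```python
import math
from collections import Counter

def information_entropy(pixels):
    """Shannon entropy in bits per symbol of a pixel sequence.

    For an 8-bit image the maximum is 8 bits; a well-encrypted image
    should approach that bound, as the entropy > 7.999 result above does.
    """
    n = len(pixels)
    counts = Counter(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A perfectly uniform 8-bit histogram gives exactly 8 bits; a constant image gives 0.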
- New
- Research Article
- 10.1016/j.neunet.2025.108061
- Jan 1, 2026
- Neural networks : the official journal of the International Neural Network Society
- Deheng Zeng + 3 more
Data-free knowledge distillation via text-noise fusion and dynamic adversarial temperature.
- New
- Research Article
- 10.1016/j.solener.2025.114123
- Jan 1, 2026
- Solar Energy
- Junjie Lin + 3 more
Optimal PMU configuration method for distribution networks based on information entropy gain
- New
- Research Article
- 10.1109/tpami.2025.3609956
- Jan 1, 2026
- IEEE transactions on pattern analysis and machine intelligence
- Wenhao Mao + 5 more
Offering rich contexts to Large Language Models (LLMs) has shown to boost the performance in various tasks, but the resulting longer prompt would increase the computational cost and might exceed the input limit of LLMs. Recently, some prompt compression methods have been suggested to shorten the length of prompts by using language models to generate shorter prompts or by developing computational models to select important parts of original prompt. The generative compression methods would suffer from issues like hallucination, while the selective compression methods have not involved linguistic rules and overlook the global structure of prompt. To this end, we propose a novel selective compression method called PartPrompt. It first obtains a parse tree for each sentence based on linguistic rules, and calculates local information entropy for each node in a parse tree. These local parse trees are then organized into a global tree according to the hierarchical structure such as the dependency of sentences, paragraphs, and sections. After that, the root-ward propagation and leaf-ward propagation are proposed to adjust node values over the global tree. Finally, a recursive algorithm is developed to prune the global tree based on the adjusted node values. The experiments show that PartPrompt receives the state-of-the-art performance across various datasets, metrics, compression ratios, and target LLMs for inference. The in-depth ablation studies confirm the effectiveness of designs in PartPrompt, and other additional experiments also demonstrate its superiority in terms of the coherence of compressed prompts and in the extreme long prompt scenario.
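A per-node "local information entropy" of the kind PartPrompt computes can be approximated by token self-information under a unigram model. This is an illustrative stand-in, with add-one smoothing, and is not the paper's exact formulation:

```python
import math

def local_entropy(tokens, corpus_counts, total):
    """Self-information in bits of each token under a unigram model.

    Rare tokens carry more information and would be kept during pruning;
    frequent function words score low and are candidates for removal.
    Unseen tokens receive add-one (Laplace) smoothing.
    """
    vocab = len(corpus_counts)
    return {
        t: -math.log2((corpus_counts.get(t, 0) + 1) / (total + vocab))
        for t in set(tokens)
    }
```

In the full method these per-node scores are adjusted by root-ward and leaf-ward propagation over the global tree before pruning.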
- New
- Research Article
- 10.1016/j.neucom.2025.131904
- Jan 1, 2026
- Neurocomputing
- Yajing Wang + 2 more
DeinfoAttack: A heuristic graph adversarial attack algorithm leveraging graph topological information entropy
- New
- Research Article
- 10.1016/j.oceaneng.2025.123764
- Jan 1, 2026
- Ocean Engineering
- Lan Qi + 2 more
Feature extraction and classification of underwater targets based on multi-feature multi-scale information entropy, CEEMDAN and BOA-SVM
- New
- Research Article
- 10.53022/oarjet.2025.9.2.0100
- Dec 31, 2025
- Open Access Research Journal of Engineering and Technology
- Siddhant Hanumant Kale + 2 more
Crop yield prediction (CYP) at the field level plays a crucial role in evaluating agricultural commodity plans for import-export strategies, in agricultural production, and in increasing farmer incomes. Traditional methods, which depend on historical data, weather conditions, and agronomic models, have been greatly improved through the application of machine learning technologies. This paper proposes an innovative M-Let and CNN-based Fusion model for Crop Yield Prediction (MCF-CYP). The input data undergoes preprocessing, where a Three-tier Data Normalization (TDN) technique is applied to standardize the values, ensuring all features are on a consistent scale for improved model effectiveness. Next, feature extraction is performed to identify the key characteristics of the data. Features such as Bekenstein-Hawking with Deng Belief-based Rényi Entropy (BHDB-RE) features, information gain, and various statistical measures are extracted to capture essential insights related to crop yield. These features are then passed to the prediction model, which utilizes two deep learning architectures: a Multi-head LeNet (M-Let) and a Convolutional Neural Network (CNN). Both models analyze the complex patterns in the data and generate accurate crop yield predictions. Finally, the predicted yield is output, providing valuable insights for crop management and other agricultural decisions.
- New
- Research Article
- 10.55220/2576-6821.v9.823
- Dec 29, 2025
- Journal of Banking and Financial Dynamics
- Sheng Wang
Digital finance represents the trend of financial development in the 21st century, with its own distinctive basic concepts and theoretical foundations, and is currently a hot research field in academia. In constructing the theory of digital finance, a large body of literature has conducted empirical studies on various topics with the help of the Digital Inclusive Finance Index developed by the Digital Finance Research Center of Peking University, which offers a convenient operational definition of digital finance. However, the validity of many of these empirical results merits further investigation. This article studies this point comprehensively, defining new concepts for the first time and building a new analytical framework. Using a logistic regression model, 1137 empirical results were examined, and it was found that less than 30% of the empirical models met the criteria for low risk and could be adopted. A new contribution of this article is the first definition of an empirical-model risk level index, together with a specific expression for the interval estimation of the model-fitting risk level index, namely calculation formulas for the upper and lower bounds of the interval. Shannon's information entropy is applied to supplement the reliability test of the regression models' empirical results and to further conduct variance tests. Therefore, in order to avoid modern financial risks, especially financial operational risks, empirical models should be constructed strictly according to the testing standards in this article, which is highly likely to prevent the occurrence of financial operational risks.
- New
- Research Article
- 10.1049/ipr2.70270
- Dec 28, 2025
- IET Image Processing
- Muhammad Hanif + 6 more
ABSTRACT In today's digital era, images play a vital role across diverse fields, including healthcare, banking, defence, traffic monitoring, and weather forecasting. As digital footprints, their use is growing rapidly, but they are also increasingly vulnerable to unauthorised access and misuse. To address this challenge, we propose a novel encryption scheme for multiple red, green, blue (RGB) images. The scheme takes an arbitrary number of images, overlays them to form a three-dimensional (3D) image, and then divides it into four subparts. A five-dimensional multi-wing hyperchaotic map is employed for random selection of parts, images, rows, columns, and key images. Rows and columns from selected images are repeatedly swapped to produce scrambled 3D images, which are further XORed to generate the final encrypted outputs. The encrypted images are then combined into a large 3D RGB cipher image. To enhance security, the scheme integrates a 256-bit salt key with SHA-256 hash codes, ensuring a strong key space and plaintext sensitivity. Experimental results demonstrate that the proposed approach provides robustness against multiple threats, real-world applicability, and high security. Notably, the scheme achieved a highly competitive information entropy value of 7.99994, confirming its effectiveness. Extensive experiments further show that the ciphertexts exhibit high randomness and robustness: average correlation coefficients (CCs) between adjacent pixels are close to zero, the number of pixels change rate (NPCR) is 99.63%, and the unified average changing intensity (UACI) is 33.45%. The decrypted images achieve peak signal-to-noise ratio (PSNR) values of ∞ (O–D) and 7.9982 (O–C), confirming lossless reconstruction. Moreover, the scheme demonstrates strong resistance to chosen-plaintext as well as noise and cropping attacks, while maintaining competitive computational efficiency.
Comparative analysis with recent chaos‐based algorithms verifies that the proposed approach provides superior security, randomness, and robustness for secure image transmission.
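The NPCR and UACI figures cited above (99.63% and 33.45%) are standard differential-attack metrics; a minimal sketch over flat 8-bit pixel lists, independent of any particular cipher:

```python
def npcr_uaci(img1, img2):
    """NPCR and UACI between two equal-size 8-bit images (flat lists).

    NPCR: percentage of pixel positions that differ. UACI: mean absolute
    pixel difference normalised by 255, as a percentage. For a strong
    cipher the ideal values are roughly 99.6% and 33.46%.
    """
    n = len(img1)
    diff = sum(1 for a, b in zip(img1, img2) if a != b)
    npcr = 100.0 * diff / n
    uaci = 100.0 * sum(abs(a - b) for a, b in zip(img1, img2)) / (255 * n)
    return npcr, uaci
```

The two images compared are typically ciphertexts of plaintexts differing in a single pixel.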
- New
- Research Article
- 10.1038/s41598-025-31776-7
- Dec 27, 2025
- Scientific reports
- Lei Wei + 4 more
In the cloud-edge-end communication architecture of the new power system, heterogeneous perception services face a fundamental, long-standing mismatch between their demands and the supply of multi-dimensional resources (computing, storage, spectrum/bandwidth, and power) under QoS constraints such as delay, reliability, and accuracy. To uniformly measure and minimize this mismatch under resource-limited and time-varying network conditions, thereby enabling precise and efficient perception, this paper proposes an intelligent perception-service efficiency evaluation and optimization method for electric power information and communication networks based on fit entropy. First, based on information entropy theory, fit entropy is defined to measure the degree of matching between the requirements of perception services (such as delay and reliability) and the resources provided. Then, based on fit entropy, a three-layer matching model spanning the business, logical, and physical domains is constructed, yielding a many-to-many matching optimization problem among businesses, service function chains, and physical devices. Furthermore, a dynamic hypergraph neural network based on a gated attention mechanism is designed to solve this problem: multi-type perception service requests are dynamically mapped to cross-domain hyperedges, and fit entropy is used as the hyperedge weight to quantify the global fit among the three domains. The fit entropy is optimized by adaptively adjusting the hypergraph structure and the hyperedge weights. Simulation results show that this method can significantly improve the quality of service of perception services and effectively balance network resource utilization and service adaptability.
- New
- Research Article
- 10.3390/electronics15010114
- Dec 25, 2025
- Electronics
- Lina Zhang + 2 more
Color-to-grayscale conversion is a fundamental preprocessing task with widespread applications in digital printing, electronic ink displays, medical imaging, and artistic photo stylization. A primary challenge in this domain is to simultaneously preserve the global luminance distribution and local contrast. To address this, we propose an adaptive conversion method centered on a novel objective function that integrates information entropy with Edge Content (EC), a metric for local gradient information. The key advantage of our approach is its ability to generate grayscale results that maintain both rich overall contrast and fine-grained local details. Compared with previous adaptive linear methods, our approach demonstrates superior qualitative and quantitative performance. Furthermore, by eliminating the need for computationally expensive edge detection, the proposed algorithm provides an efficient and effective solution to color-to-grayscale conversion.
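An objective combining histogram entropy with an edge-content term, of the general kind described above, can be sketched as follows; the gradient-based EC definition and the unit weighting are assumptions for illustration, not the paper's exact formulation:

```python
import math
from collections import Counter

def entropy(gray):
    """Shannon entropy in bits of a grayscale pixel list."""
    n = len(gray)
    return -sum((c / n) * math.log2(c / n) for c in Counter(gray).values())

def edge_content(gray, w):
    """Mean absolute horizontal + vertical gradient of a row-major image."""
    h = len(gray) // w
    g = 0.0
    for y in range(h):
        for x in range(w):
            i = y * w + x
            if x + 1 < w:
                g += abs(gray[i] - gray[i + 1])
            if y + 1 < h:
                g += abs(gray[i] - gray[i + w])
    return g / len(gray)

def objective(gray, w, lam=1.0):
    """Entropy plus lam * EC: a combined global-contrast / local-detail
    score that candidate channel weightings could be ranked by."""
    return entropy(gray) + lam * edge_content(gray, w)
```

A conversion method would evaluate this score for candidate linear combinations of the R, G, B channels and keep the best-scoring grayscale result.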
- New
- Research Article
- 10.11648/j.ajmcm.20251004.13
- Dec 24, 2025
- American Journal of Mathematical and Computer Modelling
- Parthasarathy Srinivasan
One of the most pervasive applications in computing is the generation of random numbers belonging to a given probability distribution, such as the Gaussian (normal) distribution. These probability distributions possess statistical properties such as the expected value (mean), variance (standard deviation), p-value, and entropy, of which entropy is significant for quantifying the amount of (useful) information that a particular instance of a distribution embodies. This quantification of entropy is valuable as a characterizing metric that determines the amount of randomness/uncertainty and/or redundancy achievable with a particular distribution instance, which is particularly useful for communication, cryptographic, and astronomical applications. In the present work the author introduces an alternate way to calculate the approximate value of the information entropy (a variation on Claude Shannon's formulation of information entropy as known to the scientific community), observing that a Takens embedding of the probability distribution yields a simple measure of the entropy by taking into consideration only four critical/representative points of the embedding. Through comparative experimentation, the author empirically verified that this alternate formulation is consistently valid. The baseline experiment relates to Discrete Task-Oriented Joint Source-Channel Coding (DT-JSCC), which uses entropy computation to perform efficient and reliable task-oriented communication (transmission and reception), as elaborated further in the paper. The author ran the baseline DT-JSCC experiment with the Shannon formulation of entropy and then repeated it with the entropy formulation introduced in this work; the accuracy of the results (the data models generated) was almost identical, differing by only about 1% overall.
Thus, the alternate formulation introduced in this work provides a reliable means of validating the random numbers obtained under the Shannon formulation and potentially serves as a simpler, faster, and more computationally economical method. This is particularly useful in applications where computational resources are constrained, such as mobile and limited devices. The method is also useful for uniquely identifying and characterizing random probability sources, such as those arising from astronomical and/or optical (photonic) phenomena. The author also investigates the impact of incorporating the above notion of entropy into the Mars Rover ICER software and confirms the conclusions of the original article from the Jet Propulsion Laboratory, NASA, which describes the ICER progressive wavelet image compressor.
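The delay-coordinate (Takens) embedding that the formulation above starts from is standard and easy to sketch; the paper's four-representative-point entropy rule is specific to that work and is not reproduced here:

```python
def takens_embedding(series, dim, delay):
    """Delay-coordinate (Takens) embedding of a scalar series.

    Returns the list of dim-dimensional delay vectors
    [x(i), x(i + delay), ..., x(i + (dim-1)*delay)].
    """
    n = len(series) - (dim - 1) * delay
    return [[series[i + j * delay] for j in range(dim)] for i in range(n)]
```

An entropy estimate would then be read off a small set of representative points of this embedded trajectory rather than the full histogram.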
- New
- Research Article
- 10.1088/2752-5724/ae3098
- Dec 23, 2025
- Materials Futures
- Zi-Yuan Cheng + 7 more
Abstract Single-element amorphous metals are ideal model systems for understanding glass formation, yet their creation remains extremely challenging due to strong crystallization tendencies. Here, we introduce a previously unexplored bottom-up route to synthesize amorphous silver nanostructures at ambient conditions by using a DNA origami template with near-fivefold symmetry. The polyanionic DNA scaffold selectively concentrates Ag+ ions, which are subsequently reduced in situ to trigger localized nucleation. Importantly, the pentagonal geometry imposes strong spatial confinement and geometric frustration, providing an effective and experimentally validated mechanism for suppressing crystallization in a monatomic metal. Transmission electron microscopy confirms the formation of stable amorphous Ag domains near the symmetry center, while molecular dynamics simulations show that fivefold symmetry increases the information entropy and inhibits long-range order. This work demonstrates the first DNA-templated strategy for producing monometallic amorphous nanostructures and provides direct evidence that geometric symmetry can be used to induce amorphization in crystallization-prone metals. The resulting platform offers a powerful model system for probing amorphization mechanisms and expands the design space for structural control in nanoscale metals.
- New
- Research Article
- 10.1007/s42452-025-08115-6
- Dec 22, 2025
- Discover Applied Sciences
- Chun Fu + 2 more
Abstract The incorporation of waste rubber into concrete is a beneficial solution: it can improve the frost resistance of concrete and reduce environmental pollution, and it also plays a non-negligible role in reducing carbon emissions. The frost resistance of concrete can be effectively evaluated via the relative dynamic elastic modulus. Hence, accurately predicting the frost resistance of rubber concrete is urgently needed for its popularization and application in cold regions. In this paper, 153 sets of relative dynamic elastic modulus data for rubber concrete were collected from the published literature, and by combining information entropy with a back-propagation neural network (BPNN), a two-stage hybrid BPNN (5-7-1) prediction model is proposed. In this model, information entropy is used for input feature selection, and the BPNN is used to predict frost resistance. The number of neurons in the hidden layer of the BPNN was determined comprehensively from an empirical formula and the model's minimum mean square error. The prediction results show that, after feature selection, the MSE (mean square error), RMSE (root mean square error), MAE (mean absolute error), MAPE (mean absolute percentage error), and R² (coefficient of determination) are all superior to those of the BPNN without feature selection. This research provides a new approach to frost-resistance prediction for rubberized concrete.
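Entropy-driven input feature selection of the kind used in the first stage can be illustrated with a plain information-gain score over discretized features; the paper's exact criterion is not specified in the abstract, so this is an assumed stand-in:

```python
import math
from collections import Counter

def entropy(values):
    """Shannon entropy in bits of a discrete value list."""
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in Counter(values).values())

def information_gain(feature, target):
    """Entropy reduction in `target` from conditioning on discrete `feature`.

    Candidate input features can be ranked by this score and the
    lowest-scoring ones dropped before training the predictor.
    """
    n = len(target)
    gain = entropy(target)
    for v in set(feature):
        subset = [t for f, t in zip(feature, target) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain
```

A feature that perfectly determines the target scores the full target entropy; an irrelevant feature scores zero.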
- New
- Research Article
- 10.1038/s41598-025-33051-1
- Dec 19, 2025
- Scientific reports
- Xintao Zhou + 3 more
Accurate identification of weak fault signals is critical for gear fault detection, yet particularly challenging. This study proposes a gear fault diagnosis method that utilizes Mutual Information (MI) and an Improved False Nearest Neighbor (IFNN) algorithm to optimize the delay time (τ) and embedding dimension (m) for Multiscale Permutation Entropy (MPE) calculation. The MPE values of various fault samples are computed using this optimized approach. The minimum Mahalanobis distance (min-MDMaha) for each sample achieves a fault identification accuracy of 76.87%. Information entropy is then employed to extract useful information from different fault samples, serving as weights for the MDMaha. Experiments on gear pitting and wear faults validate the method. The weighted MDMaha significantly improves accuracy to 99.72%. The results demonstrate the superior effectiveness of the proposed weighted MDMaha-enhanced MPE framework in characterizing vibration signatures induced by gear faults.
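Permutation entropy, the quantity underlying the MPE features above, is computed from the distribution of ordinal patterns (Bandt-Pompe). A single-scale sketch follows; the paper computes it across multiple scales with MI/IFNN-optimized τ and m:

```python
import math
from collections import Counter

def permutation_entropy(series, m=3, tau=1):
    """Normalised permutation entropy of a scalar time series.

    Ordinal patterns of length m at delay tau are counted, and the
    Shannon entropy of their distribution is normalised by log2(m!),
    giving a value in [0, 1].
    """
    n = len(series) - (m - 1) * tau
    patterns = Counter(
        tuple(sorted(range(m), key=lambda j: series[i + j * tau]))
        for i in range(n)
    )
    probs = [c / n for c in patterns.values()]
    h = -sum(p * math.log2(p) for p in probs)
    return h / math.log2(math.factorial(m))
```

A monotone signal yields a single ordinal pattern and entropy 0; an irregular signal spreads mass over many patterns and scores closer to 1.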
- New
- Research Article
- 10.3390/math14010005
- Dec 19, 2025
- Mathematics
- Jianhua Qiu + 7 more
In communication environments with limited computing resources, securely and efficiently transmitting image data has become a challenging problem. However, most existing image data protection schemes are based on high-dimensional chaotic systems as key generators, which suffer from issues such as high algorithmic complexity and large computational overhead. To address this, this paper presents new designs for a 1D Sine Fractional Chaotic Map (1D-SFCM) as a random sequence generator and provides mathematical proofs related to the boundedness and fixed points of this model. Furthermore, this paper improves the traditional 2D compressive sensing (2DCS) algorithm by using the newly designed 1D-SFCM map to generate a chaotic measurement matrix, which can effectively enhance the quality of image recovery and reconstruction. Moreover, referring to the principle of gene mutation in biogenetics, this paper designs an image encryption algorithm based on DNA base substitution. Finally, the security of the proposed encryption scheme and the quality of image compression and reconstruction are verified through indicators such as key space, information entropy, and Number of Pixel Change Rate (NPCR).