Articles published on Functional decomposition
2990 Search results
- New
- Research Article
- 10.3390/universe12010027
- Jan 19, 2026
- Universe
- Li Han + 4 more
With the rapid expansion of pulsar survey data driven by advanced radio telescopes such as FAST, automated detection methods have become crucial for the efficient and accurate identification of single-pulse signals. A key challenge in this task is the extreme class imbalance between genuine pulsar pulses and radio frequency interference (RFI), which significantly hampers classifier performance, particularly in low signal-to-noise ratio (S/N) environments. To address this issue and improve detection accuracy, we propose Pulsar-WRecon, a Wasserstein GAN with Gradient Penalty (WGAN-GP)-based framework designed to generate realistic single-pulse profiles. The synthetic samples generated by Pulsar-WRecon are used to augment training data and alleviate class imbalance. Building upon the enhanced dataset, we further introduce the Convolutional Kolmogorov–Arnold Network (CKAN), a novel hybrid model that integrates convolutional layers with KAN-based functional decomposition to better capture complex patterns in pulse signals. On the three-channel pulsar images from the HTRU1 dataset, our method achieves a recall of 97.5% and a precision of 98.5%. On the DM time series image dataset, FAST-DATASET, it achieves a recall of 93.2% and a precision of 92.5%. These results validate that combining generative data augmentation with an improved model architecture can effectively enhance the precision of single-pulse detection in large-scale pulsar surveys, especially in challenging, real-world conditions.
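The gradient-penalty term that gives WGAN-GP its name can be sketched in a few lines. A minimal illustration, assuming a toy critic whose gradient is known analytically; all names and shapes here are illustrative, not taken from Pulsar-WRecon:

```python
import numpy as np

def gradient_penalty(critic_grad, real, fake, rng):
    """WGAN-GP penalty: E[(||grad f(x_hat)|| - 1)^2] on random interpolates
    between real and generated samples."""
    eps = rng.uniform(size=(real.shape[0], 1))   # per-sample mixing weight
    x_hat = eps * real + (1.0 - eps) * fake      # interpolate real/fake pairs
    grads = critic_grad(x_hat)                   # critic gradient at x_hat
    norms = np.linalg.norm(grads, axis=1)
    return np.mean((norms - 1.0) ** 2)

# Toy critic f(x) = 0.5 * ||x||^2, so grad f(x) = x.
rng = np.random.default_rng(0)
real = rng.normal(size=(64, 8))   # stand-in for real pulse profiles
fake = rng.normal(size=(64, 8))   # stand-in for generated profiles
gp = gradient_penalty(lambda x: x, real, fake, rng)
```

In a real training loop this penalty is added (weighted by a coefficient, typically 10) to the Wasserstein critic loss, softly enforcing the 1-Lipschitz constraint.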
- New
- Research Article
- 10.1088/1361-6560/ae37c2
- Jan 13, 2026
- Physics in medicine and biology
- Shengzi Zhao + 3 more
X-ray diffraction (XRD) is a non-destructive technique capable of obtaining molecular structural information of materials. It has great potential in medical and security applications, such as rapid breast cancer screening, calculi composition analysis, and detection of drugs and explosives. Among various X-ray diffraction tomography (XRDT) systems, snapshot coded aperture XRDT (SCA-XRDT) achieves the fastest scanning speed, making it well-suited for practical medical imaging and security inspection. However, SCA-XRDT suffers from poorly conditioned data and an ill-posed reconstruction problem, leading to significant challenges in accurate image reconstruction. In this work, we explore the inherent characteristics of XRD patterns and incorporate a novel and effective prior accordingly into an iterative reconstruction algorithm, thereby improving the reconstruction performance.
Approach: By analyzing the key physical factors that shape XRD patterns, we represent XRD patterns as a linear combination of basis functions. Building upon this, we propose a novel basis-function-decomposition reconstruction (BFD-Recon) method that incorporates the basis function representation as a prior into a model-based SCA-XRDT reconstruction framework. This method transforms the optimization target from entire XRD patterns to parameters of basis functions. We further impose smoothness and sparsity constraints on the parameters to restrict the solution space. We employ the Split Bregman algorithm to iteratively solve the optimization problem. Both simulation and experimental results demonstrate the effectiveness of the proposed BFD-Recon method.
Main Results: Compared with a conventional MBIR method, the proposed BFD-Recon method results in more accurate reconstruction of XRD patterns, especially the sharp peaks that closely match the ground truth. It substantially suppresses the noise and the impact of background signals on the reconstructed XRD patterns. Since the proposed basis function decomposition and the prior align well with the characteristics of XRD patterns, its value is well manifested along the spectral dimension of the reconstructed images. Quantitatively, BFD-Recon increases the correlation coefficients between the reconstructed and ground-truth XRD patterns by up to 10% and the average PSNR by 20%.
Significance: Through theoretical analysis and experiments, we propose a basis function decomposition method for XRD patterns and demonstrate its effectiveness and general applicability. Incorporating the basis-function-decomposition into the model-based iterative reconstruction can significantly enhance the XRDT reconstruction performance. The method provides prior information on XRD patterns and reduces the number of unknowns by at least one order of magnitude by transforming the optimization target to basis function parameters, which effectively alleviates the ill-posedness of the reconstruction problem.
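The core idea of BFD-Recon, fitting a small set of basis-function coefficients instead of the full pattern, can be illustrated with plain least squares. A minimal sketch: the paper additionally imposes smoothness and sparsity constraints solved via Split Bregman, which are omitted here, and the Gaussian basis, grid, and synthetic two-peak pattern are assumptions:

```python
import numpy as np

def gaussian_basis(q, centers, width):
    """Columns are Gaussian basis functions evaluated on the grid q."""
    return np.exp(-((q[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Synthetic "XRD pattern": two sharp peaks on a 400-point grid.
q = np.linspace(0.0, 4.0, 400)
truth = (1.0 * np.exp(-((q - 1.2) ** 2) / 0.005)
         + 0.6 * np.exp(-((q - 2.5) ** 2) / 0.005))

# 80 basis functions: the unknowns shrink from 400 pattern values to
# 80 coefficients (the paper reports reductions of >= one order of magnitude).
centers = np.linspace(0.0, 4.0, 80)
A = gaussian_basis(q, centers, 0.05)

coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
recon = A @ coef
rel_err = np.linalg.norm(recon - truth) / np.linalg.norm(truth)
```

Because the basis matches the peaked character of diffraction patterns, even unconstrained least squares reconstructs the sharp peaks accurately; the priors in the paper further restrict the solution space under noisy, ill-posed measurements.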
- New
- Research Article
- 10.1016/j.dsp.2025.105564
- Jan 1, 2026
- Digital Signal Processing
- Rui Xue + 1 more
A novel unambiguous acquisition algorithm based on decomposition and reconstruction of sub-correlation functions for semi-integer CPM signals
- Research Article
- 10.1021/acs.jctc.5c01686
- Dec 23, 2025
- Journal of chemical theory and computation
- Zui Tao + 2 more
We introduce the Wannier function decomposition of excitons (WFDX) method to quantify exciton localization in solids within the ab initio Bethe-Salpeter equation framework. By decomposing each Bloch exciton wave function into products of single-particle electron and hole maximally localized Wannier functions, this real-space approach provides well-defined orbital- and spatial-resolved measures of both Frenkel and charge-transfer excitons at low computational cost. We apply WFDX to excitons in acene crystals, quantifying how the number of rings, the exciton spin state, and the center-of-mass momentum affect spatial localization. Additionally, we show how this real-space representation reflects structural nonsymmorphic symmetries that are hidden in standard reciprocal-space descriptions. We demonstrate how the WFDX framework can be used to efficiently interpolate exciton expansion coefficients in reciprocal-space and outline how it may facilitate evaluation of observables involving position operators, highlighting its potential as a general tool for both analyzing and computing excitonic properties in solids.
- Research Article
- 10.34123/icdsos.v2025i1.727
- Dec 22, 2025
- Proceedings of The International Conference on Data Science and Official Statistics
- Muhammad Zaki Azhari + 2 more
The cryptocurrency market, characterized by high volatility, has evolved into a significant financial asset class, attracting both retail and institutional investors. Understanding its interconnectedness with macroeconomic factors is crucial for risk management and financial stability. This study empirically analyzes the dynamic relationships between two primary crypto assets, Bitcoin (BTC) and Ethereum (ETH), and the monetary policy shifts of the U.S. Federal Reserve (The Fed). Using a Vector Autoregression (VAR) model on daily time-series data from January 1, 2022, to June 16, 2025, this research investigates the short-term dynamics, Granger causality, and shock transmissions within this system. The findings reveal a significant one-way causal relationship from The Fed's interest rate changes to both Bitcoin and Ethereum returns, challenging the weak-form Efficient Market Hypothesis. Furthermore, Impulse Response Function (IRF) and Forecast Error Variance Decomposition (FEVD) analyses provide robust evidence of Bitcoin's market leadership, with shocks in Bitcoin explaining nearly 70% of the variance in Ethereum's movements. These results highlight a clear hierarchical structure: The Fed influences broad market sentiment, while Bitcoin leads internal market dynamics, offering critical insights for investors and policymakers navigating the digital asset ecosystem.
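The VAR machinery used in this study can be sketched on synthetic data. A minimal VAR(1) fit by equation-wise OLS, with a simulated "leader" series driving a "follower" (the variable names, coefficients, and sample size are illustrative, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 2000
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t-1] + rng.normal(scale=0.1)                   # leader (cf. BTC)
    y[t] = 0.3 * y[t-1] + 0.6 * x[t-1] + rng.normal(scale=0.1)    # follower (cf. ETH)

# VAR(1) by equation-wise OLS: z_t = A @ z_{t-1} + e_t
Z = np.column_stack([x, y])
Y, X = Z[1:], Z[:-1]
A = np.linalg.lstsq(X, Y, rcond=None)[0].T   # A[i, j]: effect of z_j(t-1) on z_i(t)
```

One-way Granger-style causality shows up in the coefficient matrix: the lagged cross term A[1, 0] is large (x leads y) while A[0, 1] is near zero, mirroring the paper's finding that Bitcoin shocks explain most of Ethereum's variance but not vice versa.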
- Research Article
- 10.1146/annurev-statistics-042324-040052
- Dec 22, 2025
- Annual Review of Statistics and Its Application
- Xiuyuan Cheng + 2 more
Spatiotemporal point processes model discrete events distributed in space and time, with applications in criminology, seismology, epidemiology, and social networks. Classical models rely on parametric kernels, limiting their ability to capture heterogeneous, nonstationary dynamics. Recent advances integrate deep neural architectures, either by modeling the conditional intensity directly or by learning flexible, data-driven influence kernels. This article reviews the deep influence kernel approach, which balances statistical interpretability by retaining explicit kernels to capture event propagation, with expressive power from neural architectures. We outline key components, including functional basis decomposition, graph neural networks for encoding spatial or network structures, and both likelihood-based and likelihood-free estimation methods, while addressing scalability for large data. We also highlight theoretical results on kernel identifiability. Applications in crime analysis, earthquake aftershock prediction, and sepsis modeling demonstrate the framework's effectiveness. We conclude with promising directions for developing explainable and scalable deep kernel point processes.
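The explicit influence kernels the review retains generalize the classical exponential (Hawkes-type) kernel. A minimal sketch of the resulting conditional intensity, with illustrative parameter values:

```python
import numpy as np

def intensity(t, events, mu=0.2, alpha=0.8, beta=1.5):
    """Conditional intensity with an exponential influence kernel:
    lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)).
    Deep-kernel models replace the fixed exponential with a learned,
    possibly nonstationary kernel."""
    past = events[events < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

events = np.array([1.0, 2.5, 2.7])
lam = intensity(3.0, events)        # elevated: two recent events
lam_far = intensity(10.0, events)   # decayed back toward the baseline mu
```

Replacing the exponential with a neural parameterization keeps this additive, interpretable structure while letting the data determine how influence decays over space and time.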
- Research Article
- 10.1002/joc.70236
- Dec 19, 2025
- International Journal of Climatology
- Zengyuan Guo + 3 more
Accurate seasonal forecasting of wind energy resources is critical for optimising renewable energy integration. This study assesses the prediction skill of winter 10 m wind speed anomalies over East China using the Copernicus C3S Multi‐Model Ensemble forecasts, revealing that ECMWF and UK Met Office (UKMO) models demonstrate superior performance. We conduct comprehensive analyses of spatial anomaly correlations, temporal correlations, and empirical orthogonal function (EOF) decompositions by dividing the study area into northern (Zone 1), central (Zone 2) and southern (Zone 3) subregions. Results indicate strong forecast skill in Zone 1 and Zone 3, attributable to their tight correlation with well‐resolved large‐scale circulation patterns, whereas Zone 2 exhibited limited skill due to complex local influences. A hybrid dynamical‐statistical model is used to reconstruct EOF1, which is then combined with the original model outputs of EOF2 and EOF3. The result shows a substantial increase in temporal correlation coefficient in Zone 2. These findings establish that dynamical‐statistical integration effectively enhances regional wind predictions, offering actionable insights for grid operators and energy planners seeking to mitigate renewable integration challenges in evolving climate regimes.
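An EOF decomposition of the kind used here is a singular value decomposition of the anomaly field. A minimal sketch on a synthetic space-time field with one planted spatial mode (the grid sizes and mode shape are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
n_time, n_space = 120, 50
pattern = np.sin(np.linspace(0, np.pi, n_space))   # one dominant spatial mode
pc = rng.normal(size=n_time)                       # its principal-component time series
field = np.outer(pc, pattern) + 0.1 * rng.normal(size=(n_time, n_space))

anom = field - field.mean(axis=0)                  # anomalies: remove the time mean
U, s, Vt = np.linalg.svd(anom, full_matrices=False)
eofs = Vt                                          # rows: spatial EOF patterns
var_frac = s**2 / np.sum(s**2)                     # explained variance per mode
```

EOF1 (`eofs[0]`) recovers the planted pattern up to sign, and `var_frac` shows how much variance each mode explains; the hybrid scheme in the paper replaces the forecast EOF1 time series with a statistically reconstructed one before recombining modes.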
- Research Article
- 10.3389/frai.2025.1706566
- Dec 19, 2025
- Frontiers in Artificial Intelligence
- Johan Pena-Campos + 5 more
Black-box models, particularly Support Vector Machines (SVM), are widely employed for identifying dynamic systems due to their high predictive accuracy; however, their inherent lack of transparency hinders the understanding of how individual input variables contribute to the system output. Consequently, retrieving interpretability from these complex models has become a critical challenge in the control and identification community. This paper proposes a post-hoc functional decomposition algorithm based on Non-linear Oblique Subspace Projections (NObSP). The method decomposes the output of an already identified SVM regression model into a sum of partial (non)linear dynamic contributions associated with each input regressor. By operating in the non-linear feature space, NObSP utilizes oblique projections to mitigate cross-contributions from correlated regressors. Furthermore, an efficient out-of-sample extension is introduced to improve scalability. Numerical simulations performed on benchmark Wiener and Hammerstein structures demonstrate that the proposed method effectively retrieves the underlying partial nonlinear dynamics of each sub-system. Additionally, the computational analysis confirms that the proposed extension reduces the arithmetic complexity from 𝒪(N³) to 𝒪(Nd²), where d is the number of support vectors. These findings indicate that NObSP is a robust geometric framework for interpreting non-linear dynamic models, offering a scalable solution that successfully decouples blended dynamics without sacrificing the predictive power of the black-box model.
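The oblique projections at the heart of NObSP can be illustrated in a finite-dimensional setting; the actual method works in the kernel feature space, and the matrices below are random stand-ins:

```python
import numpy as np

def oblique_projector(A, B):
    """P = A (B^T A)^{-1} B^T projects onto range(A) along the nullspace of B^T.
    It is idempotent (P @ P = P) and leaves range(A) fixed (P @ A = A).
    Unlike an orthogonal projector, the direction of projection is set by B,
    which is how NObSP-style methods suppress cross-contributions from
    correlated regressors."""
    return A @ np.linalg.solve(B.T @ A, B.T)

rng = np.random.default_rng(3)
A = rng.normal(size=(6, 2))   # stand-in: subspace of one regressor's contribution
B = rng.normal(size=(6, 2))   # stand-in: subspace defining the projection direction
P = oblique_projector(A, B)
```

Choosing B so that the projection direction is "oblique" to the other regressors' subspaces is what disentangles each regressor's partial contribution from a blended model output.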
- Research Article
- 10.3390/jmse13122386
- Dec 16, 2025
- Journal of Marine Science and Engineering
- Sarat Chandra Mohapatra + 2 more
A theoretical model of the interaction between a following current and a semi-infinite floating ice sheet under compressive stress near a vertical impermeable wall is developed, within the scope of linear water wave theory, to study the hydroelastic behavior. The floating ice sheet is modeled using elastic beam theory, and the fluid motion is governed by potential flow. The undetermined parameters appearing in the Fourier series decomposition of the potential functions are resolved by systematically applying higher-order conditions that couple the modes. The current results are compared with a specific case available in the literature, and a convergence analysis of the analytical solution is performed to verify computational accuracy. Further, free edge conditions are applied at the edge of the floating ice sheet, and the effects of current speed, compressive stress, ice sheet thickness, flexural rigidity, and water depth on the strain, displacements, reflected wave amplitude, and the horizontal force on the rigid vertical wall are analyzed in detail. It is found that higher values of the following current heighten the strain, displacements, reflection amplitude, and force on the wall. The study’s outcomes are expected to benefit not only cold-region design applications but also the engineering of resilient floating and marine structures for oceanic and offshore environments.
- Research Article
- 10.12737/szf-114202501
- Dec 10, 2025
- Solnechno-Zemnaya Fizika
- Leonid Kitchatinov
The paper presents a mean-field model for large-scale flows in convection zones of the Sun and solar-type stars. The model extends former differential rotation models by allowance for variations of the flow with time and its deviation from axial symmetry. The model is realized as a numerical code, which combines the spectral method of decomposition in spherical functions with second-order accurate finite-difference method in time and radius. First computations show close agreement of the axially symmetric part of the computed flow with helioseismological detections of differential rotation and meridional circulation. Patterns of the time-decaying non-axisymmetric flow computed with the model qualitatively agree with the Rossby waves observed on the Sun. The paper also formulates a problem for further development of the large-scale flow theory.
- Research Article
- 10.12737/stp-114202501
- Dec 10, 2025
- Solar-Terrestrial Physics
- Leonid Kitchatinov
The paper presents a mean-field model for large-scale flows in convection zones of the Sun and solar-type stars. The model extends former differential rotation models by allowance for variations of the flow with time and its deviation from axial symmetry. The model is realized as a numerical code, which combines the spectral method of decomposition in spherical functions with second-order accurate finite-difference method in time and radius. First computations show close agreement of the axially symmetric part of the computed flow with helioseismological detections of differential rotation and meridional circulation. Patterns of the time-decaying non-axisymmetric flow computed with the model qualitatively agree with the Rossby waves observed on the Sun. The paper also formulates a problem for further development of the large-scale flow theory.
- Research Article
- 10.54254/2977-5701/2025.30303
- Dec 9, 2025
- Journal of Applied Economics and Policy Studies
- Tianyue Wu
This paper examines the impact of Fed interest rate hikes on the returns and volatility of Chinese financial and export industry indexes. In the empirical analysis, the explanatory variable (the Fed rate hike), the dependent variable (index returns), and macro control variables together form a variable matrix, on which stationarity and cointegration tests are performed. A Vector Autoregression (VAR) model is then used to determine the direction and significance of the fitted coefficients, and Impulse Response Function (IRF) and Forecast Error Variance Decomposition (FEVD) analyses are further used to characterize the direction, persistence, and relative contribution of the dynamic responses in future periods. In addition, by comparing GARCH, EGARCH, and GJR-GARCH models with the Fed rate hike as an exogenous variable X, together with the parameter significance of the Auto Regressive Moving Average (ARMA) mean equation, the most suitable GARCH-family model for each index is selected and its volatility analyzed. Finally, possible reasons for the above results are discussed.
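The GARCH(1,1) conditional-variance recursion underlying all three compared models can be sketched directly; the paper's models additionally include exogenous rate-hike terms and ARMA mean equations, which are omitted here, and the parameter values are illustrative:

```python
import numpy as np

def garch11_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """Conditional variance recursion: h_t = omega + alpha*r_{t-1}^2 + beta*h_{t-1}.
    alpha + beta < 1 gives a covariance-stationary process whose unconditional
    variance is omega / (1 - alpha - beta)."""
    h = np.empty_like(returns)
    h[0] = omega / (1.0 - alpha - beta)   # initialize at the unconditional variance
    for t in range(1, len(returns)):
        h[t] = omega + alpha * returns[t-1]**2 + beta * h[t-1]
    return h

rng = np.random.default_rng(4)
r = rng.normal(scale=0.5, size=200)       # stand-in return series
h = garch11_variance(r)
```

EGARCH and GJR-GARCH modify this recursion to let negative shocks raise variance more than positive ones, which is why the paper selects the best-fitting variant per index.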
- Research Article
- 10.1063/5.0285412
- Dec 1, 2025
- AIP Advances
- Zhan Zhang + 3 more
Variational Mode Decomposition (VMD) has been widely used for harmonic detection. However, it is sensitive to noise, requires prior knowledge of the number of decomposition modes K, and suffers from end point effects. To address noise interference, an improved wavelet threshold function is applied to denoise the original signal. For the predetermined K requirement, the property of minimal inter-mode correlation during optimal VMD decomposition is utilized to achieve adaptive K selection. Regarding end point effects, a fixed-window-length waveform matching extension method is implemented to extend the signal, which effectively suppresses end point effects. Upon completion of optimal signal decomposition, both the frequency and amplitude of each harmonic component can be precisely extracted using the Hilbert transform. Simulation results demonstrate that the proposed algorithm achieves three key improvements: effective noise reduction, adaptive selection of the optimal number of decomposition modes K, and successful suppression of end point effects. These characteristics show that the algorithm has high application value in harmonic detection scenarios.
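An "improved wavelet threshold function" typically interpolates between soft and hard thresholding; one common form is sketched below. The paper's exact function is not reproduced here, so treat this as an assumed variant:

```python
import numpy as np

def soft_threshold(w, thr):
    """Classical soft threshold: shrinks every surviving coefficient by thr."""
    return np.sign(w) * np.maximum(np.abs(w) - thr, 0.0)

def improved_threshold(w, thr, a=2.0):
    """An assumed 'improved' compromise: behaves like soft thresholding for
    coefficients just above thr (continuity, no hard-threshold jump) but
    converges to the identity for |w| >> thr, avoiding soft thresholding's
    constant bias on large coefficients."""
    out = np.zeros_like(w)
    big = np.abs(w) > thr
    shrink = thr * np.exp(a * (thr - np.abs(w[big])))   # decays as |w| grows
    out[big] = np.sign(w[big]) * (np.abs(w[big]) - shrink)
    return out

w = np.array([0.1, 1.05, 10.0])
s = soft_threshold(w, 1.0)
out = improved_threshold(w, 1.0)
```

Denoising then amounts to applying such a function to the detail coefficients of a wavelet transform before reconstruction, after which VMD operates on the cleaner signal.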
- Research Article
- 10.1101/2025.11.26.690831
- Dec 1, 2025
- bioRxiv
- Norman Scheel + 14 more
The Risk Reduction for Alzheimer’s Disease (rrAD) trial included 513 cognitively normal, sedentary, hypertensive older adults (aged 60 to 85 years) with dementia risk factors. We utilized 420 high-quality baseline resting-state functional MRI (rs-fMRI) scans from this cohort to develop a functional atlas tailored for aging populations. Typical rs-fMRI atlases derived from healthy young adults do not account for age-related changes, such as cortical atrophy, enlarged ventricles, and altered connectivity. To address this gap, we created a cohort-specific MNI-adjacent anatomical template, rrAD420, using SPM12’s DARTEL registration. In this space, we derived a comprehensive functional atlas using both group independent component analysis (GICA) and probabilistic functional mode decomposition (PROFUMO). The rrAD420 atlas offers detailed representations of Resting-State Network (RSN) connectivity, encompassing unique configurations and overlapping interactions. It features two Default-Mode Network (DMN)-specific seed-based maps (DMN24 with cerebellum, DMN18 without) and data-driven components resembling the major RSNs. Furthermore, PROFUMO allowed for the identification of multimodal and combinatory networks, capturing connections within and between RSNs. While optimized for hypertensive older adults, the rrAD420 atlas serves as a versatile tool for broader aging populations, aiding in the study of neurodegenerative processes and biomarker discovery.
- Research Article
- 10.1016/j.soilbio.2025.110067
- Dec 1, 2025
- Soil Biology and Biochemistry
- Ramesha H Jayaramaiah + 8 more
Plant functional groups shape microbial colonization and decomposition dynamics in grassland soils
- Research Article
- 10.3390/jmse13122261
- Nov 27, 2025
- Journal of Marine Science and Engineering
- Jingju Wang + 7 more
The evaporation duct, formed above the ocean surface by sharp vertical gradients of humidity, significantly influences electromagnetic wave propagation. It is a quasi-permanent feature over the sea, and its strength is quantified by the evaporation duct height (EDH). While previous studies have focused on how local factors influence evaporation ducts, the impact of El Niño–Southern Oscillation (ENSO) on EDH in the South China Sea (SCS) remains undocumented. Using correlation analysis, empirical orthogonal function (EOF) decomposition, and wavelet transform, this study shows that evaporation is the dominant environmental factor controlling EDH variability across seasonal and inter-annual timescales in the SCS, while wind speed and relative humidity play secondary roles with contrasting effects between the northern and southern regions. ENSO drives the inter-annual variability of EDH by modulating evaporation. During El Niño events, anomalous anticyclonic circulations near the Philippine Sea, which weaken (strengthen) the evaporation in the northern (southern) SCS, alter EDH and contribute to the formation of the meridional dipole structure, particularly within the 2-to-6-year ENSO band. These results provide new insights into the mechanisms controlling EDH in the SCS and highlight the critical role of ENSO in shaping its spatial distribution.
- Research Article
- 10.3390/systems13121054
- Nov 23, 2025
- Systems
- Xiaoliang Xie + 5 more
As global warming intensifies, extreme weather phenomena such as heatwaves, flash droughts, torrential floods, cold waves, and blizzards are becoming increasingly frequent. Against this backdrop, traditional static food security assessment methods fail to capture the dynamic transmission patterns of agricultural productivity risks and their regional heterogeneity. Therefore, it is imperative to reconstruct a resilience analysis paradigm for food production systems, dynamically investigate the mechanisms through which climate change affects China’s agricultural productivity and discern the interactive effects between technological evolution and climate constraints. This will provide theoretical foundations for building a climate-resilient food security system. Accordingly, this study establishes a multidimensional resilience measurement index system for China’s grain productivity by integrating agricultural factor elasticity analysis with disaster impact response modeling. Through production function decomposition and hybrid forecasting models, we reveal the evolutionary patterns of China’s grain productivity under climate risk shocks and trace the transmission pathways of risk fluctuations. Key findings indicate the following: (1) Extreme climate events exhibit significant negative correlations with grain production, with drought and flood impacts demonstrating pronounced regional heterogeneity. (2) A dynamic game relationship exists between agricultural technological progress and climate risk constraints, where the marginal contribution of resource efficiency improvements to productivity growth shows diminishing returns. (3) Climate-sensitive factors vary substantially across agricultural zones: Northeast China faces dominant cold damage, North China experiences drought stress, while South China contends with humid-heat disasters as primary regional risks. 
Consequently, strengthening foundational agricultural infrastructure and optimizing regionally differentiated risk mitigation strategies constitute critical pathways for enhancing food security resilience. (4) Future research should leverage higher-resolution, county-level data and incorporate a wider range of socio-economic variables to enhance granular understanding and predictive accuracy.
- Research Article
- 10.3390/electronics14224500
- Nov 18, 2025
- Electronics
- Rahul Razdan + 3 more
Autonomy is enabled by the close connection of traditional mechanical systems with information technology. Historically, both communities have built norms for validation and verification (V&V), but with very different properties for safety and associated legal liability. Thus, combining the two in the context of autonomy has exposed unresolved challenges for V&V, and without a clear V&V structure, demonstrating safety is very difficult. Today, both traditional mechanical safety and information technology rely heavily on process-oriented mechanisms to demonstrate safety. In contrast, a third community, the semiconductor industry, has achieved remarkable success by inserting design artifacts which enable formally defined mathematical abstractions. These abstractions combined with associated software tooling (Electronics Design Automation) provide critical properties for scaling the V&V task, and effectively make an inductive argument for system correctness from well-defined component compositions. This article reviews the current methods in the mechanical and IT spaces, the current limitations of cyber-physical V&V, identifies open research questions, and proposes three directions for progress inspired by semiconductors: (i) guardian-based safety architectures, (ii) functional decompositions that preserve physical constraints, and (iii) abstraction mechanisms that enable scalable virtual testing. These perspectives highlight how principles from semiconductor V&V can inform a more rigorous and scalable safety framework for autonomous systems.
- Research Article
- 10.20527/epsilon.v19i2.15392
- Nov 7, 2025
- EPSILON: JURNAL MATEMATIKA MURNI DAN TERAPAN
- Habill Putra Sangnandha + 1 more
Coffee production in South Sumatra Province plays an important role in Indonesia’s economy, both as a source of foreign exchange and as a livelihood for farmers. During the 2020–2023 period, coffee production exhibited fluctuations that reflected instability and were suspected to be influenced by environmental factors such as rainfall and land area. This study aims to analyze the influence of rainfall and land area on coffee production using the Vector Error Correction Model (VECM), which is capable of examining both short-term and long-term relationships among cointegrated time series variables. The data used consist of monthly records of coffee production, land area, and rainfall obtained from the Central Bureau of Statistics for the 2020–2023 period. The analysis was conducted through a series of statistical tests, including stationarity testing, cointegration, determination of optimal lag, VECM estimation, Granger causality test, as well as impulse response function (IRF) and variance decomposition (VD). The results reveal the existence of a long-term relationship among the variables, where rainfall significantly affects coffee production, while land area does not show a meaningful effect. The VD analysis also emphasizes that rainfall’s contribution to production variation increases up to 10% in the long term, while the applied model is validated through the Portmanteau test. These findings confirm that climatic factors, particularly rainfall, play an essential role in maintaining the stability and sustainability of coffee production in South Sumatra.
- Research Article
- 10.1115/1.4070328
- Nov 6, 2025
- Journal of Mechanical Design
- Soheyl Massoudi + 1 more
Early-stage engineering design involves complex, iterative reasoning, yet existing large language model (LLM) workflows struggle to maintain task continuity and generate executable models. We evaluate whether a structured multi-agent system (MAS) can more effectively manage requirements extraction, functional decomposition, and simulator code generation than a simpler two-agent system (2AS). The target application is a solar-powered water filtration system as described in a cahier des charges. We introduce the Design-State Graph (DSG), a JSON-serializable representation that bundles requirements, physical embodiments, and Python-based physics models into graph nodes. A nine-role MAS iteratively builds and refines the DSG, while the 2AS collapses the process to a Generator-Reflector loop. Both systems run a total of 60 experiments (2 LLMs, Llama 3.3 70B vs. reasoning-distilled DeepSeek R1 70B, × 2 agent configurations × 3 temperatures × 5 seeds). We report JSON validity, requirement coverage, embodiment presence, code compatibility, workflow completion, runtime, and graph size. Across all runs, both MAS and 2AS maintained perfect JSON integrity and embodiment tagging. Requirement coverage remained minimal (less than 20%). Code compatibility peaked at 100% under specific 2AS settings but averaged below 50% for MAS. Only the reasoning-distilled model reliably flagged workflow completion. Powered by DeepSeek R1 70B, the MAS generated more granular DSGs (average 5-6 nodes) whereas the 2AS mode-collapsed. Structured multi-agent orchestration enhanced design detail, and the reasoning-distilled LLM improved completion rates, yet low requirement coverage and fidelity gaps in the generated code persisted.
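Since the DSG is defined as a JSON-serializable bundle of requirements, embodiments, and physics models per graph node, a single node might look like the following; the field names and contents are hypothetical illustrations, not the authors' actual schema:

```python
import json

# One assumed Design-State Graph (DSG) node: requirements, a physical
# embodiment, a Python physics model stored as source text, and edges to
# neighboring nodes. All values are illustrative.
node = {
    "id": "filtration_pump",
    "requirements": ["flow_rate >= 2.0 L/min", "power <= 50 W"],
    "embodiment": "solar-driven diaphragm pump",
    "physics_model": (
        "def flow(power_w, head_m):\n"
        "    return 0.05 * power_w / max(head_m, 0.1)"
    ),
    "edges": ["solar_panel", "filter_stage"],
}

serialized = json.dumps(node)     # the DSG requires JSON serializability
restored = json.loads(serialized)
```

Keeping the physics model as embedded source lets agents pass the whole design state between roles as plain JSON and regenerate or execute models on demand.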