An Approximate Bayesian Approach to Optimal Input Signal Design for System Identification

Abstract

The design of informative input signals is essential for accurate system identification, yet classical Fisher-information-based methods are inherently local and often inadequate in the presence of significant model uncertainty and non-linearity. This paper develops a Bayesian approach that uses the mutual information (MI) between observations and parameters as the utility function. To address the computational intractability of the MI, we maximize a tractable MI lower bound. The method is then applied to the design of an input signal for the identification of quasi-linear stochastic dynamical systems. Evaluating the MI lower bound requires the inversion of large covariance matrices whose dimensions scale with the number of data points N. To overcome this problem, an algorithm that reduces the dimension of the matrices to be inverted by a factor of N is developed, making the approach feasible for long experiments. The proposed Bayesian method is compared with the average D-optimal design method, a semi-Bayesian approach, and its advantages over the latter are demonstrated. The effectiveness of the proposed method is further illustrated through four examples, including atomic sensor models, where input signals that generate a large amount of MI are especially important for reducing the estimation error.
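
For intuition, the Gaussian special case makes both the MI utility and the dimension-reduction idea concrete. The sketch below is illustrative, not the paper's algorithm: it assumes a linear-in-parameters FIR model y = Φ(u)θ + e with Gaussian prior θ ~ N(0, Σθ) and noise e ~ N(0, σ²I), for which I(y; θ) = ½ log det(I_N + ΦΣθΦᵀ/σ²); Sylvester's determinant identity yields the same value from a p×p matrix, with p ≪ N the parameter dimension.

```python
import numpy as np

def regressor(u, p):
    """FIR regressor matrix: row t holds u[t], u[t-1], ..., u[t-p+1]."""
    N = len(u)
    Phi = np.zeros((N, p))
    for j in range(p):
        Phi[j:, j] = u[: N - j]
    return Phi

def mi_full(Phi, Sigma_theta, sigma2):
    """I(y; theta) = 0.5 * logdet(I_N + Phi Sigma Phi^T / sigma2): an N x N determinant."""
    N = Phi.shape[0]
    return 0.5 * np.linalg.slogdet(np.eye(N) + Phi @ Sigma_theta @ Phi.T / sigma2)[1]

def mi_reduced(Phi, Sigma_theta, sigma2):
    """Same value via Sylvester's identity: only a p x p determinant (p << N)."""
    p = Phi.shape[1]
    return 0.5 * np.linalg.slogdet(np.eye(p) + Phi.T @ Phi @ Sigma_theta / sigma2)[1]

rng = np.random.default_rng(0)
p, N, sigma2 = 3, 200, 1.0
Sigma_theta = np.eye(p)                      # prior covariance of the parameters

u_white = rng.standard_normal(N)             # persistently exciting candidate
u_const = np.ones(N)                         # poorly exciting candidate
mi_white = mi_reduced(regressor(u_white, p), Sigma_theta, sigma2)
mi_const = mi_reduced(regressor(u_const, p), Sigma_theta, sigma2)
# mi_white exceeds mi_const: the richer input is more informative about theta.
```

Comparing candidate inputs this way (white noise versus a constant) shows why persistently exciting signals are preferred: they yield a larger MI between the data and the parameters.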

Similar Papers
  • Research Article
  • Citations: 2
  • 10.1109/access.2020.3011421
Joint Optimization Method of Airborne MIMO Radar Track and Radiated Power Based on Mutual Information
  • Jan 1, 2020
  • IEEE Access
  • Xiangyu Fan + 4 more

The mutual information (MI) is used as the objective function to jointly optimize the space trajectory and radiated power of an airborne MIMO radar. By adjusting the spatial position of the radar and its radiated energy in real time, the detection efficiency of the MIMO radar is improved. First, a cooperative detection model for an aviation-swarm MIMO radar is constructed to quantitatively describe the relationship between the radar position and radiated-power parameters and the echo. The MI between the transmitted and received signals at the same time is then derived, along with the MI between the radar echo at the current time and at the next time. Maximizing the MI between transmitted and received signals increases the amount of information detected, while minimizing the MI between adjacent moments improves the quality of the information. This paper designs a time-sharing optimization algorithm and improves the Artificial Bee Colony (ABC) algorithm to optimize these two MI quantities, achieving real-time adjustment of radar position and power. Simulation and algorithm comparisons demonstrate the advantages of this algorithm.

  • Research Article
  • Citations: 5
  • 10.3389/fninf.2013.00053
Mutual information spectrum for selection of event-related spatial components. Application to eloquent motor cortex mapping
  • Jan 20, 2014
  • Frontiers in Neuroinformatics
  • Alexei Ossadtchi + 4 more

Spatial component analysis is often used to explore multidimensional time series data whose sources cannot be measured directly. Several methods may be used to decompose the data into a set of spatial components with temporal loadings. Component selection is of crucial importance and should be supported by objective criteria. In some applications, a well-defined component selection criterion may allow the analysis to be automated. In this paper we describe a novel approach for ranking spatial components calculated from EEG or MEG data recorded within an evoked-response paradigm. Our method, called the Mutual Information (MI) Spectrum, is based on gauging the amount of MI between the temporal loadings of spatial components and a synthetically created reference signal. We also describe an appropriate randomization-based statistical assessment scheme that can be used to select components with a statistically significant amount of MI. Using simulated data with realistic trial-to-trial variations and SNR corresponding to real recordings, we demonstrate the superior performance characteristics of the described MI-based measure compared to a more conventionally used power-driven gauge. We also demonstrate the application of the MI Spectrum to the selection of task-related independent components from real MEG data. We show that the MI Spectrum identifies task-related components reliably and consistently, yielding stable results even from a small number of trials. We conclude that the proposed method fits naturally with the information-driven nature of ICA and can be used for routine, automatic ranking of independent components calculated from functional neuroimaging data collected within event-related paradigms.

  • Research Article
  • 10.3182/20120913-4-it-4027.00034
Optimal Input Design for Dynamic Model Prediction Accuracy
  • Jan 1, 2012
  • IFAC Proceedings Volumes
  • Ke Fang + 1 more

  • Research Article
  • Citations: 45
  • 10.1016/j.neucom.2014.03.006
A methodology for training set instance selection using mutual information in time series prediction
  • Apr 8, 2014
  • Neurocomputing
  • Miloš B Stojanović + 3 more

  • Research Article
  • Citations: 36
  • 10.1016/j.visres.2007.04.004
Mutual information of image fragments predicts categorization in humans: Electrophysiological and behavioral evidence
  • May 17, 2007
  • Vision Research
  • Assaf Harel + 3 more

  • Research Article
  • Citations: 47
  • 10.1088/0957-0233/26/7/074001
Particle image pattern mutual information and uncertainty estimation for particle image velocimetry
  • Jun 5, 2015
  • Measurement Science and Technology
  • Zhenyu Xue + 2 more

In this work we introduce a new measure for particle image velocimetry (PIV) cross-correlation quality and establish analytically its connection to the basic PIV theory. This metric, which we term ‘mutual information’ (MI), can be used to estimate the number of correlated particles and connect to the PIV measurement uncertainty quantification. In PIV the number of particles in common between two consecutive frames forms the basis of the cross-correlation operation that yields the velocity measurement. Since the particle image pattern intensity distribution within each image represents the available signal, the inherent number of common particle pairs between the cross-correlated images, which can be thought of as the amount of mutual information, governs the potential accuracy of the PIV measurement. The number of common particle pairs between the images can be expressed by the product of the image density NI, and the fraction of particles that leave the frame due to in-plane and out-of-plane motion FI and FO, respectively. It has previously been shown that this parameter, NIFIFO, directly relates to the validity of a PIV measurement. However, in real experiments, NIFIFO is unknown and difficult to calculate. Here we propose to overcome this limitation by introducing a new metric (MI), which directly computes the apparent amount of common information contained in the particle patterns of two consecutive images without prior knowledge of the particle field. Both theoretical derivation and experimental results are provided to show that MI and NIFIFO represent the same characteristics of a PIV measurement. Subsequently, MI is used to develop a model for PIV uncertainty estimation. This metric and the corresponding uncertainty model presented herein are applied to both standard and a filtered phase-only (robust phase correlation) correlation methods. 
These advancements lead to robust uncertainty estimation models, which are tested against both synthetic benchmark data and real experimental measurements. For all cases considered here, the uncertainties demonstrated coverage factors approximately equal to the theoretically expected values of 68.5% and 95%, which correspond to the 1σ and 2σ levels of a normal distribution, respectively.

  • Research Article
  • Citations: 2
  • 10.1097/wnr.0000000000001829
A neural basis of rational inattention models: consistency of cognitive cost with the mutual information criterion.
  • Sep 28, 2022
  • NeuroReport
  • Qi Wu + 3 more

The rational inattention (RI) model has recently attracted much attention as a promising candidate for modeling bounded rationality in the field of decision-making and game theory. However, in contrast to this vigorous theoretical development, empirical verification of the validity of the RI model has not progressed much. Furthermore, to our knowledge, the central assumption of the RI model, that the amount of mutual information obtained from signals adequately represents the cognitive cost of information, has not been tested from a neuroscientific perspective. The purpose of the present study was to test, from a neuroscientific perspective, whether the amount of mutual information adequately represents the cognitive cost of information. We proposed a sequential investment task in which the two main RI models can be treated simultaneously in a more realistic experimental environment. We used a model-fitting approach to analyze the subjective information cost and compared the model parameters representing the information cost with the concentration of oxygenated hemoglobin in cerebral blood. Our results showed that the cost parameter λ of the stochastic-choice-type model, which fit the behavioral data of the present experiment better than the Kalman-filter-type model, was significantly positively correlated with activation of the rostral prefrontal cortex and dorsolateral prefrontal cortex. The cognitive cost represented by the amount of mutual information employed in the RI model is thus consistent with activation of brain regions associated with cognitive cost, which indirectly supports the assumption of the RI model.

  • Conference Article
  • Citations: 4
  • 10.1109/med.2007.4433794
Identification of dynamical systems under multiple operating conditions via functionally pooled ARMAX models
  • Jun 1, 2007
  • J.S Sakellariou + 1 more

In a companion paper [1], a novel framework for the identification of stochastic dynamical systems under multiple operating conditions, with each condition characterized by a measurable variable, is introduced and used for the identification of postulated functionally pooled autoregressive with exogenous input (FP-ARX) models. The present paper focuses on the use of this framework for the identification of FP-ARMAX models, which additionally incorporate a moving average (MA) part. FP-ARMAX models are conceptual extensions of their conventional ARMAX counterparts, with the important difference that their parameters and innovations variance are functions of the measurable variable and that they account for cross-correlations among the operating conditions. FP-ARMAX model identification is, however, more complicated, and is presently achieved via prediction error and maximum likelihood type methods. The asymptotic properties of the prediction error estimator are established, and the estimators' performance characteristics are assessed via a Monte Carlo study.

  • Research Article
  • Citations: 18
  • 10.1186/s12868-015-0168-0
Mutual information against correlations in binary communication channels
  • May 19, 2015
  • BMC Neuroscience
  • Agnieszka Pregowska + 2 more

Background: Explaining how brain processing can be so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). The analysis of neural transmission processes (Shannon CE, Weaver W., 1963) therefore focuses on searching for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, the most common approach taken in studying neuronal processes in the brain.

Results: We present binary communication channels for which mutual information and correlation coefficients behave differently, both quantitatively and qualitatively. Despite this difference in behavior, we show that noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals.

Conclusions: Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature than can be captured by straightforward correlations between input and output signals, since the mutual information takes into account the structure and patterns of the signals.
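
Both claims in this abstract can be checked numerically from small joint probability tables. In the sketch below (distributions chosen by me purely for illustration), a binary joint distribution with zero covariance is necessarily a product distribution, so its MI vanishes, while a ternary example (X uniform on {-1, 0, 1}, Y = X²) has zero covariance but strictly positive MI.

```python
import numpy as np

def mutual_info_bits(joint):
    """MI in bits from a joint pmf given as a 2-D array."""
    joint = joint / joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / np.outer(px, py)[mask])))

def covariance(joint, xs, ys):
    """Cov(X, Y) from a joint pmf with supports xs (rows) and ys (columns)."""
    joint = joint / joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    Exy = sum(joint[i, j] * xs[i] * ys[j]
              for i in range(len(xs)) for j in range(len(ys)))
    return float(Exy - (px @ xs) * (py @ ys))

b01 = np.array([0.0, 1.0])                            # binary support
base = np.outer([0.6, 0.4], [0.3, 0.7])               # product pmf: cov = 0, MI = 0
tilt = np.array([[1.0, -1.0], [-1.0, 1.0]])           # shifts mass, marginals unchanged
tilted = base + 0.1 * tilt                            # now Cov(X, Y) = 0.1 and MI > 0

# Non-binary counterexample: X uniform on {-1, 0, 1}, Y = X**2.
xs, ys = np.array([-1.0, 0.0, 1.0]), np.array([0.0, 1.0])
ternary = np.array([[0.0, 1/3], [1/3, 0.0], [0.0, 1/3]])  # cov = 0 yet MI = H(Y) > 0
```

For binary variables, any deviation from independence shows up in the single covariance number, which is why zero covariance forces zero MI; the ternary table shows this breaks down as soon as a third state is available.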

  • Research Article
  • Citations: 98
  • 10.1103/physreve.77.011901
Mutual information in random Boolean models of regulatory networks
  • Jan 3, 2008
  • Physical Review E
  • Andre S Ribeiro + 4 more

The amount of mutual information contained in the time series of two elements gives a measure of how well their activities are coordinated. In a large, complex network of interacting elements, such as a genetic regulatory network within a cell, the average of the mutual information over all pairs, ⟨I⟩, is a global measure of how well the system can coordinate its internal dynamics. We study this average pairwise mutual information in random Boolean networks (RBNs) as a function of the distribution of Boolean rules implemented at each element, assuming that the links in the network are randomly placed. Efficient numerical methods for calculating ⟨I⟩ show that as the number of network nodes, N, approaches infinity, the quantity N⟨I⟩ exhibits a discontinuity at parameter values corresponding to critical RBNs. For finite systems it peaks near the critical value, but slightly inside the disordered regime for typical parameter variations. The source of high values of N⟨I⟩ is the indirect correlations between pairs of elements from different long chains with a common starting point. The contribution from directly linked pairs approaches zero for critical networks and peaks deep in the disordered regime.
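
A toy version of the average pairwise mutual information ⟨I⟩ can be estimated directly from simulated time series of a small random Boolean network. The sketch below uses arbitrary parameters and a brute-force pairwise estimate, not the authors' efficient numerical methods:

```python
import numpy as np

rng = np.random.default_rng(1)

def pair_mi(a, b):
    """Mutual information (bits) between two binary time series."""
    joint = np.zeros((2, 2))
    for x, y in zip(a, b):
        joint[x, y] += 1.0
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log2(joint[mask] / np.outer(px, py)[mask])))

def simulate_rbn(n=12, k=2, steps=400, transient=100):
    """Synchronous random Boolean network: each node reads k random
    inputs through its own random truth table."""
    inputs = rng.integers(0, n, size=(n, k))          # random wiring
    rules = rng.integers(0, 2, size=(n, 2 ** k))      # truth table per node
    state = rng.integers(0, 2, size=n)
    series = []
    for t in range(steps):
        idx = (state[inputs] * (2 ** np.arange(k))).sum(axis=1)
        state = rules[np.arange(n), idx]
        if t >= transient:
            series.append(state.copy())
    return np.array(series)                           # shape (steps - transient, n)

X = simulate_rbn()
n_nodes = X.shape[1]
avg_I = np.mean([pair_mi(X[:, i], X[:, j])
                 for i in range(n_nodes) for j in range(i + 1, n_nodes)])
```

For binary time series each pairwise MI lies between 0 and 1 bit, so ⟨I⟩ does too; the scaling behavior of N⟨I⟩ described in the abstract requires much larger ensembles than this toy run.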

  • Research Article
  • Citations: 11
  • 10.1016/j.neuroscience.2020.11.031
Enhanced Discriminative Abilities of Auditory Cortex Neurons for Pup Calls Despite Reduced Evoked Responses in C57BL/6 Mother Mice
  • Nov 28, 2020
  • Neuroscience
  • Juliette Royer + 4 more

  • Research Article
  • Citations: 8
  • 10.5075/epfl-thesis-2344
Modeling diversity by strange attractors with application to temporal pattern recognition
  • Jan 1, 2001
  • O De Feo

  • Research Article
  • 10.1103/physreve.110.064152
Path-integral approach to mutual information calculation for nonlinear channel with small dispersion at large signal-to-noise ratio.
  • Dec 30, 2024
  • Physical review. E
  • A V Reznichenko + 1 more

We consider the information fiber-optic channel modeled by the nonlinear Schrödinger equation with additive Gaussian noise. Using a path-integral approach and perturbation theory in the small dimensionless second-dispersion parameter associated with the input signal bandwidth, we calculate the conditional probability density functional in the leading and next-to-leading order in this parameter. Taking into account the specific filtering of the output signal by the receiver, we calculate the mutual information in the leading and next-to-leading order in the dispersion parameter at large signal-to-noise ratio (SNR). We then find an explicit analytical expression for the mutual information for a modified Gaussian input signal distribution that accounts for the limited frequency bandwidth of the input signal. We explain the behavior of the mutual information as a function of the average input signal power and the input signal bandwidth in connection with frequency broadening in the presence of small dispersion.

  • Conference Article
  • Citations: 3
  • 10.1109/acc.2000.878746
Using mutual information to pre-process input data for a virtual sensor
  • Jan 1, 2000
  • P.B Deignan + 4 more

Mutual information can be used to determine appropriate input signals and input transformations for a black-box model that represents a virtual sensor. Mutual information is a measure of the strength of the input-output relationship, and can be computed without any assumptions about the linearity of the input-output process. The power of the method is demonstrated with an incompressible flow process. Results using mutual information clearly show that all three input signals are necessary to capture the input-output mapping. In addition, nonlinear transformations, such as raising each input to a power, can be evaluated by checking the corresponding mutual information. Results show that raising two of the inputs to the exponent 1/2 gives the highest mutual information, which is consistent with the governing equation.
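
The same screening idea can be sketched on synthetic data: estimate the MI between each candidate input and the output with a simple histogram estimator and discard inputs whose score sits near the estimator's bias floor. The signals and model below are my own illustration, not the paper's incompressible-flow process:

```python
import numpy as np

def hist_mi(x, y, bins=16):
    """Histogram (plug-in) estimate of MI in nats between two scalar signals."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / np.outer(px, py)[mask])))

rng = np.random.default_rng(42)
n = 20000
x1, x2, x3 = rng.uniform(size=(3, n))               # x3 is a deliberate decoy input
y = np.sqrt(x1) + x2 + 0.05 * rng.standard_normal(n)

scores = {name: hist_mi(x, y) for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
# Inputs that actually enter the governing equation score well above the decoy.
```

Note that because the continuous MI is invariant under invertible transformations of a single input, transformation rankings like the paper's square-root result depend on the discretized estimator being applied to the multivariate input-output relationship, not on one-dimensional MI alone.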

  • Conference Article
  • 10.1109/apeie.1998.768984
Active identification of stochastic dynamic systems
  • Jan 1, 1998
  • A.Zh Abdenov

Estimating the optimal state vector requires knowledge of the measurement-noise and system-noise covariance matrices, as well as the state and control matrices. In this paper, algorithmic aspects of the active identification of linear dynamic systems for the optimal solution of the Kalman filtering problem are considered. It is proposed to solve the input design task by using the input signal's autocorrelation function in the time domain and its spectral density in the frequency domain.

More from: Entropy
  • Addendum
  • 10.3390/e27111135
Correction to Temperature and Bekenstein–Hawking Entropy of Kiselev Black Hole Surrounded by Quintessence
  • Nov 4, 2025
  • Entropy
  • Cong Wang

  • Research Article
  • 10.3390/e27111134
Prediction Method for Fault-Induced Frequency Response Characteristics in Wind-Integrated Power Systems Using Wide-Area Measurement Data
  • Nov 2, 2025
  • Entropy
  • Yi Hu + 6 more

  • Research Article
  • 10.3390/e27111123
Dynamic Modeling and Analysis of Rotary Joints with Coupled Bearing Tilt-Misalignment Faults
  • Oct 31, 2025
  • Entropy
  • Jun Lu + 6 more

  • Research Article
  • 10.3390/e27111125
Information Entropy of Biometric Data in a Recurrent Neural Network with Low Connectivity
  • Oct 31, 2025
  • Entropy
  • David Dominguez-Carreta + 4 more

  • Research Article
  • 10.3390/e27111130
Review of the Use of Entropy to Understand the Thermodynamics of Pure-Substance PCMs
  • Oct 31, 2025
  • Entropy
  • Harald Mehling

  • Research Article
  • 10.3390/e27111127
Kicked General Fractional Lorenz-Type Equations: Exact Solutions and Multi-Dimensional Discrete Maps
  • Oct 31, 2025
  • Entropy
  • Vasily E Tarasov

  • Research Article
  • 10.3390/e27111121
MVIB-Lip: Multi-View Information Bottleneck for Visual Speech Recognition via Time Series Modeling
  • Oct 31, 2025
  • Entropy
  • Yuzhe Li + 3 more

  • Research Article
  • 10.3390/e27111131
Effect of Running Speed on Gait Variability in Individuals with Functional Ankle Instability
  • Oct 31, 2025
  • Entropy
  • Wenhui Mao + 6 more

  • Research Article
  • 10.3390/e27111133
FLACON: An Information-Theoretic Approach to Flag-Aware Contextual Clustering for Large-Scale Document Organization
  • Oct 31, 2025
  • Entropy
  • Sungwook Yoon

  • Research Article
  • 10.3390/e27111128
Interactions Among Morphology, Word Order, and Syntactic Directionality: Evidence from 55 Languages
  • Oct 31, 2025
  • Entropy
  • Wenchao Li + 1 more
