Abstract

Evaluating the performance of Bayesian classification for a high-dimensional random tensor is a fundamental problem, usually difficult and under-studied. In this work, we consider two Signal to Noise Ratio (SNR)-based binary classification problems of interest. Under the alternative hypothesis, i.e., for a non-zero SNR, the observed signal is either a noisy rank-$R$ tensor admitting a $Q$-order Canonical Polyadic Decomposition (CPD) with large factors of size $N_q \times R$, $1 \le q \le Q$, where $R$ and $N_q$ grow large with a ratio converging towards a finite constant, or a noisy tensor admitting a Tucker Decomposition (TKD) of multilinear rank $(M_1, \ldots, M_Q)$ with large factors of size $N_q \times M_q$, $1 \le q \le Q$, where $N_q$ and $M_q$ grow large with a ratio converging towards a finite constant. The classification of the random entries (coefficients) of the core tensor in the CPD/TKD is hard to study since an exact derivation of the minimal Bayes' error probability is mathematically intractable. To circumvent this difficulty, the Chernoff Upper Bound (CUB) for larger SNR and the Fisher information at low SNR are derived and studied, based on information geometry theory. The tightest CUB is reached for the value of $s$ minimizing the error exponent, denoted by $s^\star$. In general, due to the asymmetry of the $s$-divergence, the Bhattacharyya Upper Bound (BUB), that is, the Chernoff information calculated at $s = 1/2$, cannot solve this problem effectively. As a consequence, one usually has to rely on a costly numerical optimization strategy to find $s^\star$. However, thanks to powerful random matrix theory tools, a simple analytical expression of $s^\star$ is provided with respect to the SNR in the two schemes considered. This work shows that the BUB is the tightest bound at low SNRs; however, this property no longer holds at higher SNRs.
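For reference, one standard way of writing the bounds referred to above is sketched below (a textbook formulation under equal priors; the sign convention is chosen so that the tightest CUB minimizes the error exponent, consistent with the abstract, but the paper's own notation may differ):

```latex
% Chernoff and Bhattacharyya upper bounds on the minimal Bayes' error probability
% (equal priors; sign convention chosen so that the tightest bound minimizes mu).
\[
  P_e \;\le\; \tfrac{1}{2}\, e^{\mu(s)}, \qquad
  \mu(s) = \log \int p_0(\mathbf{x})^{s}\, p_1(\mathbf{x})^{1-s}\, \mathrm{d}\mathbf{x},
  \qquad s \in (0,1),
\]
\[
  s^{\star} = \arg\min_{s \in (0,1)} \mu(s) \;\;\text{(tightest CUB)},
  \qquad
  s = \tfrac{1}{2} \;\;\text{(BUB)}.
\]
```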

Highlights

  • State-of-the-Art and Problem Statement: Evaluating the performance limit for the “Gaussian information plus noise” binary classification problem is a challenging research topic; see for instance [1,2,3,4,5,6,7]

  • The tightest Chernoff Upper Bound (CUB) is reached for the value of $s$ minimizing the error exponent, denoted by $s^\star$

  • This work shows that the Bhattacharyya Upper Bound (BUB) is the tightest bound at low Signal to Noise Ratio (SNR)


Summary

State-of-the-Art and Problem Statement

Evaluating the performance limit for the “Gaussian information plus noise” binary classification problem is a challenging research topic; see for instance [1,2,3,4,5,6,7]. The optimal decision relies on the posterior-odds ratio [3], but an exact calculation of the minimal Bayes’ error probability Pe is often intractable [3,8]. To circumvent this problem, it is standard to exploit well-known bounds on Pe based on information theory [9,10,11,12,13]. The Chernoff information bound can be made tight by minimizing the s-divergence over the parameter s ∈ (0, 1); this step requires solving a numerical optimization problem [24] and often leads to a complicated and uninformative expression for the optimal value of s. Since the exact derivation of the error probability is intractable, the performance of the classification of the core tensor random entries is hard to evaluate. To circumvent this difficulty, based on computational information geometry theory, we consider the Chernoff Upper Bound (CUB). In the CPD, the core tensor is assumed to be diagonal.
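As an illustration of this numerical optimization step, the following sketch (a toy example of our own, not taken from the paper) finds s* for a zero-mean “Gaussian information plus noise” test with an arbitrary positive semi-definite signal covariance, and compares the resulting Chernoff Upper Bound with the Bhattacharyya choice s = 1/2. The dimension N, noise variance sigma2, SNR value, and covariance S are illustrative assumptions.

```python
# Minimal numerical sketch (illustrative only, not the paper's derivation):
# find the Chernoff parameter s* for a zero-mean "Gaussian information plus
# noise" binary test and compare the bound with the Bhattacharyya choice s = 1/2.
#   H0: x ~ N(0, sigma2 * I)              (noise only)
#   H1: x ~ N(0, sigma2 * I + snr * S)    (information plus noise)
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
N, sigma2, snr = 20, 1.0, 2.0
A = rng.standard_normal((N, N))
S = (A @ A.T) / N                          # arbitrary PSD "signal" covariance (assumption)
Sigma0 = sigma2 * np.eye(N)
Sigma1 = Sigma0 + snr * S

_, logdet0 = np.linalg.slogdet(Sigma0)
_, logdet1 = np.linalg.slogdet(Sigma1)

def error_exponent(s):
    """mu(s) = log of the Chernoff coefficient  int p0(x)^s p1(x)^(1-s) dx  (<= 0),
    in closed form for zero-mean Gaussians."""
    # Note: the Gaussian integral puts the weight s on Sigma1 in the covariance
    # mixture and in the log-determinant terms, even though s is the exponent of p0.
    _, logdet_mix = np.linalg.slogdet(s * Sigma1 + (1.0 - s) * Sigma0)
    return -0.5 * (logdet_mix - s * logdet1 - (1.0 - s) * logdet0)

# Tightest Chernoff Upper Bound: minimize the error exponent over s in (0, 1);
# this is the numerical optimization step the text refers to.
res = minimize_scalar(error_exponent, bounds=(1e-6, 1.0 - 1e-6), method="bounded")
s_star = res.x
print(f"s*            = {s_star:.4f}  ->  CUB: Pe <= {0.5 * np.exp(res.fun):.4e}")
print(f"s = 1/2 (BUB)             ->  Pe <= {0.5 * np.exp(error_exponent(0.5)):.4e}")
```

At low SNR the two bounds essentially coincide (s* is close to 1/2), while at higher SNR the minimizer drifts away from 1/2 and the BUB is no longer the tightest choice, which is the behavior discussed in the abstract.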

Paper Organisation
Preliminary Definitions
The Marchenko-Pastur Distribution
Formulation Based on a SNR-Type Criterion
The Expected Log-likelihood Ratio in Geometry Perspective
Fisher Information
Formulation of the Observation Vector as a Structured Linear Model
The CPD Case
Result
Small SNR Deviation Scenario
SNR ≪ 1
Numerical Illustrations
Conclusions

