Abstract

Granular computing is well suited to discovering knowledge in complex data, and manifold alignment has proven valuable in many areas of machine learning. We propose a novel fuzzy granule manifold alignment (FGMA) algorithm, for which we define new operations, measurements, and a local topology of fuzzy granular vectors in fuzzy granular space. Unlike semi-supervised and Procrustes alignment methods, the algorithm does not require a predetermined correspondence. It learns a projection that maps instances described by two types of features into a common low-dimensional space, while preserving and matching, within each set, the local topology of the fuzzy granular vector induced by each instance. This makes it possible to compare data instances from different spaces directly. In effect, we convert an alignment problem over data in feature space into a fuzzy granular manifold alignment problem in granular space. Specifically, we first define fuzzy granules, fuzzy granular vectors, and the associated operations and measurements in fuzzy granular space, and prove the relevant theorems and deductions. Next, we introduce the local topology around a fuzzy granular vector and show that the optimal local topology matching can be achieved by minimizing the Frobenius norm of their difference. Finally, the two manifolds are connected and the optimal mapping is computed to obtain a dimensionality reduction of the joint structure, yielding the correspondence between the two sets of data instances. We verify the algorithm on the Oxford image dataset and an Alzheimer's disease voice dataset. Theoretical analysis and experiments demonstrate that the proposed algorithm is robust and effective.
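The central step of the abstract (matching the local topology around two instances from different spaces by minimizing a Frobenius-norm discrepancy) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names are hypothetical, and plain Euclidean k-nearest-neighbor distance matrices stand in for the paper's fuzzy granular measurements. Note also that comparing the matrices entrywise assumes a consistent neighbor ordering; the full algorithm would need to handle permutations of neighbors.

```python
import numpy as np

def local_topology(X, i, k):
    """Normalized pairwise-distance matrix of the k nearest neighbors of
    point i in X. Stands in for the local topology of the fuzzy granular
    vector induced by instance i (here: ordinary Euclidean distances)."""
    d = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(d)[1:k + 1]           # k nearest neighbors, excluding i itself
    P = X[nbrs]
    D = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=2)
    return D / D.max()                      # scale-normalize so different spaces are comparable

def topology_discrepancy(Xa, i, Xb, j, k=4):
    """Frobenius-norm mismatch between the local topology of instance i in
    space A and that of instance j in space B; small values suggest the two
    instances are candidates for alignment."""
    Ta = local_topology(Xa, i, k)
    Tb = local_topology(Xb, j, k)
    return np.linalg.norm(Ta - Tb, ord="fro")
```

Because the pairwise-distance matrices are invariant to rotation and, after normalization, to uniform scaling, corresponding points in two differently embedded copies of the same manifold yield a near-zero discrepancy.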

Highlights

  • The term manifold learning was first proposed by Bregler and Omohundro in 1995 [1], [2]

  • Because it rests on the manifold distribution assumption about the dataset, any feature learning method aimed at learning the internal rules and structural characteristics of a dataset can be regarded as falling within the category of manifold learning

  • The differences between manifold learning methods lie mainly in how they collect neighborhood information and which structural characteristics of the neighborhood they preserve


Summary

INTRODUCTION

The term manifold learning was first proposed by Bregler and Omohundro in 1995 [1], [2]. Representative global algorithms include ISOMAP, Maximum Variance Unfolding (MVU), and Diffusion Maps (DM). This idea, based on global geometric structure, is simple and easy to understand, and gives an accurate description of the global distribution structure of the data. The local approach is very different: it uses local geometry to ensure that data points that are close within a local neighborhood are assigned similar positions in the low-dimensional space. Methods of this type construct a low-dimensional description by preserving how the distribution structure of the manifold data changes within each local neighborhood. Representative algorithms include local tangent space alignment (LTSA), Laplacian eigenmaps (LE), and locally linear embedding (LLE).

RELATED WORK
CONTRIBUTIONS
THE PROBLEM
THE MAIN ALGORITHM
LOCAL TOPOLOGY OF FUZZY GRANULAR VECTOR
MANIFOLD ALIGNMENT WITHOUT PREDETERMINING CORRESPONDENCE
Findings
EXPERIMENTAL ANALYSIS

