Abstract
With the development of information technology, the data to be processed have become so complex that humans can hardly grasp their inner structure through direct visual inspection. The Locally Linear Embedding (LLE) algorithm achieves dimensionality reduction by recovering the locally linear low-dimensional manifold hidden in a high-dimensional space. However, LLE is sensitive to noise, and its stability degrades sharply under strong noise. This paper proposes a sparse-constraint improvement: an L1-norm penalty is added to the reconstruction error function so that the optimal reconstruction weight matrix becomes sparser. The sparse-constrained reconstruction error function is first transformed into a general quadratic programming problem by regularization, and an interior-point iteration method is then used to find the optimal solution quickly. Simulation experiments on typical high-dimensional data sets show that, under different levels of noise, the dimensionality reduction results of the sparse-constrained LLE algorithm are significantly better than those of the classic LLE algorithm, and that the proposed method is more robust to noise.
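To make the construction concrete, the following is a minimal sketch, not the authors' implementation, of the per-point weight problem described above: the L1-penalized reconstruction error is rewritten as a smooth quadratic programme by splitting the weights into nonnegative positive and negative parts, and solved here with SciPy's trust-constr solver as a stand-in for the interior-point iteration mentioned in the abstract. The function name sparse_lle_weights, the neighbourhood size n_neighbors, and the penalty weight lam are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds
from sklearn.neighbors import NearestNeighbors

def sparse_lle_weights(X, n_neighbors=10, lam=1e-3):
    """Illustrative L1-penalized LLE reconstruction weights (one row per sample).

    For each point x_i, the weights over its k nearest neighbours solve
        min ||x_i - sum_j w_j x_j||^2 + lam * ||w||_1,  s.t.  sum_j w_j = 1.
    Splitting w = w_plus - w_minus with w_plus, w_minus >= 0 makes the L1 term
    linear, so the problem becomes a smooth quadratic programme.
    """
    n = X.shape[0]
    k = n_neighbors
    # k nearest neighbours; the first returned column is the point itself
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    W = np.zeros((n, n))

    # Equality constraint sum_j (w_plus_j - w_minus_j) = 1 on u = [w_plus; w_minus]
    A = np.concatenate([np.ones(k), -np.ones(k)]).reshape(1, -1)
    sum_to_one = LinearConstraint(A, 1.0, 1.0)
    nonneg = Bounds(np.zeros(2 * k), np.full(2 * k, np.inf))

    for i in range(n):
        neigh = idx[i, 1:]            # indices of the k neighbours of x_i
        Z, xi = X[neigh], X[i]

        def objective(u):
            w = u[:k] - u[k:]         # w = w_plus - w_minus
            r = xi - w @ Z            # reconstruction residual
            # u.sum() upper-bounds ||w||_1 and is tight at the optimum
            return r @ r + lam * u.sum()

        # mildly feasible starting point: uniform positive part, tiny negative part
        u0 = np.concatenate([np.full(k, 1.0 / k), np.full(k, 1e-6)])
        res = minimize(objective, u0, method='trust-constr',
                       bounds=nonneg, constraints=[sum_to_one])
        W[i, neigh] = res.x[:k] - res.x[k:]
    return W
```

In the full pipeline, the resulting weight matrix W would then be used exactly as in classic LLE, i.e., the low-dimensional embedding is obtained from the bottom eigenvectors of (I - W)^T (I - W); only the weight-computation step changes.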