Abstract

This paper examines evolutionary nonlinear projection (NLP), a form of multidimensional scaling (MDS) performed with an evolutionary algorithm. MDS is a family of techniques for producing a low-dimensional data set whose points correspond one-to-one with those of a higher-dimensional data set, while preserving, as far as possible, the distances or dissimilarities of the higher-dimensional space. The goal is typically visualization, but may also be clustering or other forms of analysis. In this paper, we review current methods of NLP and go on to characterize NLP as an evolutionary computation problem, gaining insight into MDS as an optimization problem. Two mutation operators, one introduced in this paper, are compared, and parameter studies are performed on mutation rate and population size. The new mutation operator is found to be superior, and NLP is found to be a problem in which small population sizes exhibit superior performance. It is demonstrated experimentally that NLP is a multimodal optimization problem. Two broad classes of projection problems are identified: one yields consistently high-quality results, while the other has many optima, all of low quality. A number of applications of the technique are presented, including projections of feature vectors for polyominoes, of vectors that are members of an error-correcting code, of behavioral assessments of a collection of agents, and of features derived from DNA sequences.
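The core idea, evolving a low-dimensional layout so that its pairwise distances match those of the original data, can be sketched as a toy evolutionary algorithm. Everything below (the function names `evolve_projection` and `stress`, the truncation-plus-Gaussian-mutation scheme, and the parameter values) is an illustrative assumption for exposition, not the paper's actual operators or settings.

```python
import math
import random

def pairwise_dists(points):
    # Euclidean distance between every pair of points, keyed by index pair.
    n = len(points)
    return {(i, j): math.dist(points[i], points[j])
            for i in range(n) for j in range(i + 1, n)}

def stress(high_d, low_points):
    # Sum of squared differences between high-dim and low-dim distances:
    # the quantity MDS tries to minimize.
    low_d = pairwise_dists(low_points)
    return sum((high_d[k] - low_d[k]) ** 2 for k in high_d)

def evolve_projection(data, dim=2, pop_size=8, generations=1000,
                      mutation_sigma=0.1, seed=0):
    # Simple elitist EA: each individual is a complete low-dim layout.
    rng = random.Random(seed)
    high_d = pairwise_dists(data)
    n = len(data)
    pop = [[[rng.uniform(-1, 1) for _ in range(dim)] for _ in range(n)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: stress(high_d, ind))
        survivors = pop[:pop_size // 2]
        # Refill the population with Gaussian-perturbed copies of survivors.
        children = [[[x + rng.gauss(0, mutation_sigma) for x in pt]
                     for pt in parent]
                    for parent in survivors]
        pop = survivors + children
    best = min(pop, key=lambda ind: stress(high_d, ind))
    return best, stress(high_d, best)
```

Because the best half of the population always survives, the best stress value is non-increasing across generations; real NLP implementations differ mainly in the choice of mutation operator and population size, which is what the parameter studies in the paper examine.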
