Abstract
In the typical approach to instance-based learning, random data (the training set of patterns) are collected and used to design a decision rule (classifier). One of the best-known such rules is the k-nearest-neighbor decision rule, in which an unknown pattern is classified into the majority class among its k nearest neighbors in the training set. Over the past fifty years many approaches have been proposed to improve the performance of this rule, and more recently geometric methods have been found to be the best. Here we mention a variety of open problems of a computational-geometric nature that arise in these methods. To provide some context and motivation for these open problems, we briefly describe the methods and list some key references.

Keywords: Decision Rule, Voronoi Diagram, Steiner Tree, Delaunay Triangulation, Decision Boundary
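The k-nearest-neighbor rule described above can be sketched in a few lines. This is a minimal illustration, not code from the paper; the function name, data layout (a list of (point, label) pairs), and the use of Euclidean distance are assumptions for the sake of the example.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify `query` into the majority class among its k nearest
    neighbors in `train`, a list of (point, label) pairs.
    Hypothetical helper using Euclidean distance."""
    # Sort the training set by distance to the query and keep the k closest.
    neighbors = sorted(train, key=lambda pl: math.dist(pl[0], query))[:k]
    # Majority vote among the k neighbors' labels.
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Example: three 'a' points near the origin, two 'b' points far away.
train = [((0, 0), 'a'), ((1, 0), 'a'), ((0, 1), 'a'),
         ((5, 5), 'b'), ((6, 5), 'b')]
print(knn_classify(train, (0.5, 0.5), k=3))  # majority of 3 nearest -> 'a'
```

A naive implementation like this scans the whole training set per query; the geometric methods the paper surveys (Voronoi diagrams, Delaunay triangulations) are precisely what make such queries and training-set condensation more efficient.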