Computer Science > Machine Learning
[Submitted on 30 Jan 2022]
Title: Empirical complexity of comparator-based nearest neighbor descent
Abstract: A Java parallel streams implementation of the K-nearest neighbor descent algorithm is presented, using a natural statistical termination criterion. Input data consist of a set S of n objects of type V, and a Function<V, Comparator<V>>, which enables any x ∈ S to decide which of y, z ∈ S ∖ {x} is more similar to x. Experiments with the Kullback-Leibler divergence Comparator support the prediction that the number of rounds of K-nearest neighbor updates need not exceed twice the diameter of the undirected version of a random regular out-degree-K digraph on n vertices. Overall complexity was O(n K² log_K(n)) in the class of examples studied. When objects are sampled uniformly from a d-dimensional simplex, accuracy of the K-nearest neighbor approximation is high up to d = 20, but declines in higher dimensions, as theory would predict.
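The Function<V, Comparator<V>> interface described above can be illustrated with a minimal Java sketch. The class and method names here are hypothetical, not taken from the paper's implementation; it assumes objects are strictly positive discrete distributions and uses the Kullback-Leibler divergence, as in the experiments, together with a brute-force neighbor query in place of the full descent algorithm:

```java
import java.util.Comparator;
import java.util.List;
import java.util.function.Function;
import java.util.stream.Collectors;

public class KLComparatorSketch {

    // Kullback-Leibler divergence D(p || q); assumes strictly positive entries.
    static double kl(double[] p, double[] q) {
        double d = 0.0;
        for (int i = 0; i < p.length; i++) {
            d += p[i] * Math.log(p[i] / q[i]);
        }
        return d;
    }

    // The Function<V, Comparator<V>> of the abstract: given x, return a
    // Comparator that ranks y, z by which is more similar to x under KL.
    static final Function<double[], Comparator<double[]>> similarityTo =
            x -> Comparator.comparingDouble(y -> kl(x, y));

    // Brute-force K nearest neighbors of x in S \ {x}; only the Comparator
    // is consulted, no numeric distances are exposed to the caller.
    static List<double[]> kNearest(double[] x, List<double[]> s, int k) {
        return s.stream()
                .filter(y -> y != x)          // exclude x itself (reference equality suffices here)
                .sorted(similarityTo.apply(x))
                .limit(k)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        double[] a = {0.70, 0.20, 0.10};
        double[] b = {0.65, 0.25, 0.10};      // close to a under KL
        double[] c = {0.10, 0.20, 0.70};      // far from a under KL
        List<double[]> nn = kNearest(a, List.of(a, b, c), 1);
        System.out.println(nn.get(0) == b);   // prints true: b is a's nearest neighbor
    }
}
```

In nearest neighbor descent proper, kNearest would be replaced by iterative rounds of neighbor-of-neighbor updates over a K-out digraph; this sketch only shows how the comparator-based interface lets the algorithm run without ever seeing raw distance values.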
Submission history
From: R. W. R. Darling, Ph.D. [v1] Sun, 30 Jan 2022 21:37:53 UTC (167 KB)