Title: FHC: an adaptive fast hybrid method for k-NN classification
Authors: Dervos, Dimitris A.
Subjects: FRASCATI::Natural sciences::Computer and information sciences
Source: Logic Journal of IGPL
Abstract: A popular and easy-to-implement classifier is the k-Nearest Neighbour (k-NN). However, sequentially searching for nearest neighbours in large datasets leads to inefficient classification because of the high computational cost involved. This article presents an adaptive, hybrid, cluster-based method for speeding up the k-NN classifier. The proposed method reduces the computational cost as far as possible while keeping classification accuracy high. The method is based on the well-known k-means clustering algorithm and consists of two main parts: (i) a pre-processing algorithm that builds a two-level, cluster-based data structure, and (ii) a hybrid classifier that classifies new items by accessing either the first or the second level of the data structure. The proposed approach was tested on seven real-life datasets and the experimental measurements were statistically validated by the Wilcoxon signed-ranks test. The results show that the proposed classification method can be used either to achieve high accuracy at a slightly higher cost or to reduce the cost to a minimum with slightly lower accuracy.
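The two-part scheme in the abstract (cluster-based pre-processing, then level selection at classification time) can be sketched as follows. This is a hypothetical illustration, not the authors' exact FHC algorithm: the purity threshold, the nearest-centroid level-selection rule, and all function names are assumptions introduced here for clarity.

```python
# Hypothetical sketch of a two-level, cluster-based hybrid k-NN classifier.
# The purity threshold and the rule for choosing between levels are
# illustrative assumptions, not the published FHC rules.
import numpy as np
from collections import Counter


def kmeans(X, k, iters=20, seed=0):
    """Plain Lloyd's k-means: returns centroids and point-to-cluster labels."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centroids[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels


def build_structure(X, y, k=3):
    """Pre-processing: first level holds one centroid plus a majority class
    per cluster; second level holds the cluster's raw training items."""
    centroids, assign = kmeans(X, k)
    clusters = []
    for j in range(k):
        idx = np.where(assign == j)[0]
        if len(idx) == 0:  # skip clusters that ended up empty
            continue
        majority, count = Counter(y[idx]).most_common(1)[0]
        clusters.append({
            "centroid": centroids[j],
            "X": X[idx], "y": y[idx],          # second level: member items
            "majority": majority,              # first level: cheap answer
            "purity": count / len(idx),
        })
    return clusters


def classify(x, clusters, k=3, purity_threshold=0.9):
    """Hybrid classification: answer at the first level when the nearest
    cluster is (nearly) class-homogeneous, else run k-NN inside it."""
    d = [np.linalg.norm(x - c["centroid"]) for c in clusters]
    c = clusters[int(np.argmin(d))]
    if c["purity"] >= purity_threshold:        # first level suffices
        return c["majority"]
    dists = np.linalg.norm(c["X"] - x, axis=1)  # second level: local k-NN
    nearest = c["y"][np.argsort(dists)[:k]]
    return Counter(nearest).most_common(1)[0][0]
```

The cost saving comes from replacing a full sequential scan with one distance computation per centroid, plus (only when a cluster is mixed) a k-NN search restricted to that cluster's members.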
Appears in Collections: Department of Applied Informatics
This item is licensed under a Creative Commons License