Enhanced Rank-Based Search for Reducing Dimensional Dependence in Nearest Neighbor Search

Silla Srinivasrao, K Sameer

Abstract


k-nearest neighbor (k-NN) classification assigns a label to an object based on the nearest training examples in the feature space. It is among the simplest classification techniques in data mining, and it can be applied even when nothing is known about the distribution of the data objects. Classification performance depends strongly on the choice of K, which determines the neighborhood size, and on the distance metric used for the query; selecting a suitable K is a key issue for classification. This paper presents a data structure for k-NN search, the Rank Cover Tree (RCT), designed to reduce the computational cost of k-NN search. In the RCT, the pruning tests involve only comparisons of objects' similarity values relative to the query: each object is assigned a rank, and the search selects the objects relevant to a given query according to that rank. This ordinal approach controls the overall query execution cost, yields a non-metric pruning technique for similarity search, and remains effective when high-dimensional data is processed. The RCT returns correct query results in time that depends on the intrinsic dimensionality of the data set, and it can exceed the performance of methods that rely on metric pruning or on other selection tests that place numerical constraints on distance values.
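To make the baseline concrete, the following is a minimal sketch of brute-force k-NN classification by majority vote. The function name and the use of Euclidean distance are illustrative assumptions, not part of the paper; the Rank Cover Tree described above replaces the exhaustive scan with a rank-based index, exploiting the fact that only the *order* of the distances below matters to the final vote.

```python
from collections import Counter
import math

def knn_classify(query, training_points, labels, k=3):
    """Classify `query` by majority vote among its k nearest training points."""
    # Brute force: measure the distance from the query to every training point.
    distances = [
        (math.dist(query, point), label)
        for point, label in zip(training_points, labels)
    ]
    # Only the rank order of these distances matters for the vote -- the
    # observation that rank-based structures such as the RCT exploit.
    distances.sort(key=lambda pair: pair[0])
    nearest_labels = [label for _, label in distances[:k]]
    return Counter(nearest_labels).most_common(1)[0][0]
```

This brute-force scan costs O(n) distance computations per query; the point of an index such as the RCT is to answer the same query while examining far fewer objects.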


Keywords


Nearest neighbor search, intrinsic dimensionality, rank-based search.





Copyright © 2013 ijseat.com. All rights reserved.

International Journal of Science Engineering and Advance Technology is licensed under a Creative Commons Attribution 3.0 Unported License, based on a work at IJSEAT. Permissions beyond the scope of this license may be available at http://creativecommons.org/licenses/by/3.0/deed.en_GB.