We propose to use the Zipfian distribution as a kernel for the design of a nonparametric classifier, in contrast to the Gaussian distribution used in most kernel methods. We show that the Zipfian distribution takes into account the multifractal nature of data and gives a true picture of the scaling properties inherent in the data. We also show that this new look at data structure can lead to a simple classifier that can, for some tasks, outperform more complex systems.
Journal of Classification – Springer Journals
Published: Apr 12, 2015
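The abstract only names the approach, so the sketch below is a rough illustration of what a Zipfian kernel could look like in a nearest-neighbor setting: each of the k nearest training points contributes a weight proportional to 1/rank**s, compared against a Gaussian-weighted variant. This is an assumption-laden toy, not the authors' algorithm; the function names, the rank-based weighting, and all parameter choices are hypothetical.

# Illustrative sketch (not the paper's method): a nearest-neighbor classifier
# whose i-th closest training point gets a Zipfian weight 1/i**s, alongside a
# Gaussian-kernel baseline that weights by distance instead of rank.
import numpy as np

def zipfian_nn_predict(X_train, y_train, x_query, k=10, s=1.0):
    """Return the class whose k nearest neighbors carry the largest total
    Zipfian weight 1/rank**s (rank = 1 for the closest training point)."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(d)[:k]               # indices of the k nearest neighbors
    weights = 1.0 / np.arange(1, k + 1)**s  # Zipfian (power-law) rank weights
    classes = np.unique(y_train)
    scores = [weights[y_train[order] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

def gaussian_nn_predict(X_train, y_train, x_query, k=10, h=1.0):
    """Baseline: same voting scheme, but with Gaussian weights
    exp(-d^2 / (2 h^2)) of the actual neighbor distances."""
    d = np.linalg.norm(X_train - x_query, axis=1)
    order = np.argsort(d)[:k]
    weights = np.exp(-d[order]**2 / (2.0 * h**2))
    classes = np.unique(y_train)
    scores = [weights[y_train[order] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

if __name__ == "__main__":
    # Two synthetic Gaussian blobs, one per class, just to exercise the code.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    print(zipfian_nn_predict(X, y, np.array([1.0, 1.0])))
    print(gaussian_nn_predict(X, y, np.array([1.0, 1.0])))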