We consider the problem of combining multiple dissimilarity representations via the Cartesian product of their embeddings. For concreteness, we take the inferential task at hand to be classification. The high dimensionality of this Cartesian product space necessitates dimensionality reduction before training a classifier. We propose a supervised dimensionality reduction method, which exploits the class label information, to help achieve a favorable combination. Results on simulated and real data show that our approach can improve classification accuracy over the alternatives of principal components analysis and no dimensionality reduction at all.
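The pipeline the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the paper's actual method: each dissimilarity matrix is embedded by classical multidimensional scaling, the embeddings are concatenated (the Cartesian product of the coordinate spaces), and a one-dimensional Fisher discriminant serves as a hypothetical stand-in for the paper's supervised dimensionality reduction. All data, dimensions, and the final nearest-class-mean rule are invented for the example.

```python
import numpy as np

def cmds_embed(D, d):
    """Classical multidimensional scaling: embed an n-by-n dissimilarity
    matrix D into d dimensions via double centering of squared distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J          # double-centered Gram matrix
    w, V = np.linalg.eigh(B)             # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]        # keep the top-d eigenpairs
    w_top = np.clip(w[idx], 0.0, None)   # guard against tiny negatives
    return V[:, idx] * np.sqrt(w_top)

def fisher_direction(X, y):
    """One-dimensional Fisher discriminant for two classes -- used here
    as a stand-in for the paper's supervised dimensionality reduction."""
    X0, X1 = X[y == 0], X[y == 1]
    Sw = np.cov(X0, rowvar=False) + np.cov(X1, rowvar=False)
    w = np.linalg.solve(Sw + 1e-8 * np.eye(X.shape[1]),
                        X1.mean(axis=0) - X0.mean(axis=0))
    return w / np.linalg.norm(w)

rng = np.random.default_rng(0)
n_per, p = 30, 4
y = np.repeat([0, 1], n_per)
# Two synthetic "views" of the same objects, each carrying class signal.
A = rng.normal(size=(2 * n_per, p)) + y[:, None] * 1.5
B = rng.normal(size=(2 * n_per, p)) + y[:, None] * 1.5

# Dissimilarity representations: Euclidean interpoint distances per view.
DA = np.linalg.norm(A[:, None] - A[None, :], axis=-1)
DB = np.linalg.norm(B[:, None] - B[None, :], axis=-1)

# Embed each view, then combine via concatenation (the Cartesian product).
Z = np.hstack([cmds_embed(DA, 3), cmds_embed(DB, 3)])

# Supervised reduction to one dimension, then nearest-class-mean labels.
w = fisher_direction(Z, y)
s = Z @ w
pred = (np.abs(s - s[y == 1].mean()) < np.abs(s - s[y == 0].mean())).astype(int)
acc = (pred == y).mean()
print(round(acc, 2))
```

In practice one would tune the embedding dimension per view (the abstract motivates supervised reduction precisely because the combined space is high-dimensional) and evaluate on held-out data rather than the training labels used here.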
Journal of Classification – Springer Journals
Published: Oct 8, 2010