A Scalable Kernel-Based Semisupervised Metric Learning Algorithm with Out-of-Sample Generalization Ability
Open Access
- 1 November 2008
- journal article
- Published by MIT Press in Neural Computation
- Vol. 20 (11), 2839-2861
- https://doi.org/10.1162/neco.2008.05-07-528
Abstract
In recent years, metric learning in the semisupervised setting has attracted considerable research interest. One line of semisupervised metric learning exploits supervisory information in the form of pairwise similarity or dissimilarity constraints. However, most methods proposed so far are either limited to linear metric learning or unable to scale well with the data set size. In this letter, we propose a nonlinear metric learning method based on the kernel approach. By applying low-rank approximation to the kernel matrix, our method can handle significantly larger data sets. Moreover, our low-rank approximation scheme naturally leads to out-of-sample generalization. Experiments performed on both artificial and real-world data show very promising results.
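The abstract does not spell out the particular low-rank scheme used in the letter. A common way to realize both ideas it mentions, a low-rank approximation of the kernel matrix and a natural out-of-sample extension, is the Nyström method: approximate the full kernel matrix from its columns at a small set of landmark points, yielding an explicit low-dimensional feature map that applies unchanged to unseen points. The sketch below (plain NumPy, RBF kernel, random landmarks; all parameter choices are illustrative assumptions, not the authors' method) shows the shape of such a scheme:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Gaussian (RBF) kernel between rows of X and rows of Y.
    d2 = (np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * d2)

def nystrom_features(X, landmarks, gamma=0.1):
    # Nystrom low-rank feature map: K ~= Phi @ Phi.T, with
    # Phi = K(X, landmarks) @ W^{-1/2} and W = K(landmarks, landmarks).
    # The same map evaluates on out-of-sample points at no extra cost.
    W = rbf_kernel(landmarks, landmarks, gamma)
    C = rbf_kernel(X, landmarks, gamma)
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)          # guard against tiny/negative eigenvalues
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return C @ W_inv_sqrt

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 5))                    # training set
landmarks = X[rng.choice(500, 20, replace=False)]    # 20 landmark points

Phi = nystrom_features(X, landmarks)   # 500 x 20 factor instead of 500 x 500 kernel
K_approx = Phi @ Phi.T                 # rank-20 approximation of the kernel matrix

# Out-of-sample generalization: apply the identical map to a new point.
x_new = rng.standard_normal((1, 5))
phi_new = nystrom_features(x_new, landmarks)
```

Because a Mahalanobis-type metric learned on the explicit features `Phi` costs O(n m) rather than O(n^2) in the number of points n (with m landmarks), this is one plausible route to the scalability and out-of-sample behavior the abstract claims.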