Classification and clustering via dictionary learning with structured incoherence and shared features
- 1 June 2010
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- p. 3501-3508
- https://doi.org/10.1109/cvpr.2010.5539964
Abstract
A clustering framework within the sparse modeling and dictionary learning setting is introduced in this work. Instead of searching for the set of centroids that best fits the data, as in k-means-type approaches that model the data as distributions around discrete points, we optimize for a set of dictionaries, one per cluster, under which the signals are best reconstructed in a sparse coding manner. We thereby model the data as a union of learned low-dimensional subspaces, and data points associated with subspaces spanned by just a few atoms of the same learned dictionary are clustered together. An incoherence-promoting term encourages dictionaries associated with different classes to be as independent as possible, while still allowing different classes to share features. Since this term acts directly on the dictionaries, it is applicable in both the supervised and unsupervised settings. Using learned dictionaries for classification and clustering makes the method robust and well suited to large datasets. The proposed framework uses a novel measure of the quality of the sparse representation, inspired by the robustness of the ℓ1 regularization term in sparse coding. For unsupervised classification and/or clustering, a new initialization based on combining sparse coding with spectral clustering is proposed. This initialization clusters the dictionary atoms, and therefore reduces to a low-dimensional eigen-decomposition problem, making it applicable to large datasets. We first illustrate the proposed framework on standard image and speech datasets in the supervised classification setting, obtaining results comparable to the state of the art with this simple approach. We then present experiments on fully unsupervised clustering of extended standard datasets and texture images, obtaining excellent performance.
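The core classification idea described above can be sketched in a few lines: a test signal is sparse-coded over each class's learned dictionary, and assigned to the class whose dictionary reconstructs it best. This is a minimal sketch only, assuming dictionaries have already been learned; it uses greedy orthogonal matching pursuit as the sparse coder with an illustrative sparsity level `k`, rather than the paper's exact ℓ1-based quality measure R(x, D). All function names here are hypothetical.

```python
import numpy as np

def omp(D, x, k):
    """Greedy orthogonal matching pursuit: sparse-code x over dictionary D
    (unit-norm columns assumed) with at most k active atoms."""
    residual = x.copy()
    support = []
    coef = np.zeros(D.shape[1])
    for _ in range(k):
        # Pick the atom most correlated with the current residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # Re-fit the coefficients on the active atoms by least squares.
        sol, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ sol
    coef[support] = sol
    return coef, residual

def classify(x, dictionaries, k=3):
    """Assign x to the class whose dictionary yields the smallest
    k-sparse reconstruction error (a simplified assignment rule)."""
    errors = [np.linalg.norm(omp(D, x, k)[1]) for D in dictionaries]
    return int(np.argmin(errors))
```

In the unsupervised (clustering) setting, the same assignment rule drives the alternating optimization: signals are re-assigned to the dictionary that best reconstructs them, and each dictionary is then re-learned from its current cluster.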