Information Theoretic Mean Shift Algorithm

Abstract
In this paper we introduce a new algorithm, information theoretic mean shift, built on a cost function that captures the "predominant structure" in the data. We formulate the problem as minimizing the entropy of the data subject to the constraint that the Cauchy-Schwarz distance between the new and the original data sets is fixed at a constant value. We show that Gaussian mean shift and Gaussian blurring mean shift are special cases of this generalized algorithm, giving a new perspective on the idea of mean shift. Furthermore, the algorithm can also capture the principal curve of the data, making it a versatile tool for manifold learning.
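Since the abstract positions Gaussian mean shift and Gaussian blurring mean shift as special cases of the proposed algorithm, a minimal sketch of those two baseline updates may help fix ideas. This is a standard NumPy implementation of the classical updates, not the paper's generalized cost function; the function names, the bandwidth `sigma`, and the iteration count are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_mean_shift(X, sigma=1.0, n_iter=50):
    """Gaussian mean shift: each point moves to the kernel-weighted
    mean of the ORIGINAL data set, which itself stays fixed."""
    Y = X.copy()
    for _ in range(n_iter):
        # pairwise squared distances between current points Y and data X
        d2 = ((Y[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))         # Gaussian kernel weights
        Y = W @ X / W.sum(axis=1, keepdims=True)     # mean-shift update
    return Y

def gaussian_blurring_mean_shift(X, sigma=1.0, n_iter=50):
    """Blurring variant: the data set is replaced at every step,
    so the whole point cloud contracts toward its modes."""
    Y = X.copy()
    for _ in range(n_iter):
        d2 = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        W = np.exp(-d2 / (2.0 * sigma ** 2))
        Y = W @ Y / W.sum(axis=1, keepdims=True)
    return Y
```

On two well-separated clusters, the first variant drives every point to a nearby mode of the kernel density estimate, while the blurring variant collapses each cluster onto a single point.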
