Weakly-Shared Deep Transfer Networks for Heterogeneous-Domain Knowledge Propagation

Abstract
In recent years, deep networks have been successfully applied to model image concepts and have achieved competitive performance on many data sets. Despite this impressive performance, conventional deep networks suffer degraded performance when training examples are insufficient. The problem becomes especially severe for deep networks with powerful representation structures, which are prone to overfitting by capturing nonessential or noisy information in a small data set. In this paper, to address this challenge, we develop a novel deep network structure capable of transferring labeling information across heterogeneous domains, especially from the text domain to the image domain. These weakly-shared Deep Transfer Networks (DTNs) mitigate the problem of insufficient image training data by bringing in rich labels from the text domain. Specifically, we present a novel DTN architecture that translates cross-domain information from text to image. To share labels between the two domains, we build multiple weakly-shared layers of features, which represent both shared inter-domain features and domain-specific features; this makes the structure more flexible and powerful in jointly capturing complex data from different domains than strongly-shared layers. Experiments on a real-world dataset show competitive performance compared with other state-of-the-art methods.
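The sketch below is a minimal, hypothetical illustration of the weakly-shared idea described in the abstract, not the authors' implementation: each domain keeps its own stack of layers, and a soft L2 penalty pulls the corresponding layer weights toward each other instead of forcing them to be identical (strong sharing). All names, dimensions, and the penalty form are assumptions for illustration.

```python
import torch
import torch.nn as nn

class WeaklySharedDTN(nn.Module):
    """Illustrative sketch: text and image branches with weakly shared layers."""

    def __init__(self, text_dim, image_dim, hidden_dim, num_labels, shared_depth=2):
        super().__init__()
        # Domain-specific input projections map each modality to a common width.
        self.text_in = nn.Linear(text_dim, hidden_dim)
        self.image_in = nn.Linear(image_dim, hidden_dim)
        # Parallel stacks of "weakly shared" layers, one per domain.
        self.text_shared = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(shared_depth)])
        self.image_shared = nn.ModuleList(
            [nn.Linear(hidden_dim, hidden_dim) for _ in range(shared_depth)])
        # A common label predictor on top of the shared representation.
        self.classifier = nn.Linear(hidden_dim, num_labels)

    def forward(self, x, domain):
        h = torch.relu(self.text_in(x) if domain == "text" else self.image_in(x))
        layers = self.text_shared if domain == "text" else self.image_shared
        for layer in layers:
            h = torch.relu(layer(h))
        return self.classifier(h)

    def weak_sharing_penalty(self):
        # Encourages, but does not force, corresponding layers of the two
        # branches to agree; setting the penalty weight very high would
        # approximate strongly shared layers.
        penalty = 0.0
        for t_layer, i_layer in zip(self.text_shared, self.image_shared):
            penalty = penalty + (t_layer.weight - i_layer.weight).pow(2).sum()
        return penalty

# Assumed usage: total loss = task loss on labeled text and image examples
#                + lambda * model.weak_sharing_penalty()
```

In this reading, labels available in the text domain supervise the text branch, and the soft coupling propagates that information to the image branch even when labeled images are scarce; the exact coupling used in the paper may differ.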
Funding Information
  • Program for New Century Excellent Talents in University (NCET-12-0632)
  • 973 Program of China (2014CB347600)
  • Natural Science Fund for Distinguished Young Scholars of Jiangsu Province (BK2012033)
  • National Natural Science Foundation of China (61402228)
