Unsupervised Video Summarization With Cycle-Consistent Adversarial LSTM Networks

Abstract
Video summarization is an important technique for browsing, managing and retrieving large collections of videos efficiently. Its main objective is to minimize the information loss incurred when selecting a subset of frames from the original video, so that the summary faithfully represents the overall story of the original video. Recently developed unsupervised video summarization approaches do not require tedious annotation of important frames to train a summarization model and are therefore practically attractive. However, their performance is still limited by the difficulty of minimizing the information loss between the summary and the original video. In this paper, we address unsupervised video summarization by developing a novel cycle-consistent adversarial LSTM architecture that effectively reduces the information loss in the summary video. The proposed model, named Cycle-SUM, consists of a frame selector and a cycle-consistent learning based evaluator. The selector is a bi-directional LSTM network that captures long-range relationships between video frames. To overcome the difficulty of specifying a suitable information-preserving metric between the original and summary videos, the evaluator is introduced to "supervise" the selector and improve summarization quality. Specifically, the evaluator is composed of two generative adversarial networks (GANs): the forward GAN learns to reconstruct the original video from the summary video, while the backward GAN learns to invert this process. We establish the relation between mutual information maximization and this cycle learning procedure, and further introduce a cycle-consistent loss to regularize the summarization. Extensive experiments on three video summarization benchmark datasets demonstrate state-of-the-art performance and show the superiority of the Cycle-SUM model over other unsupervised approaches.
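To make the described architecture concrete, the following is a minimal PyTorch-style sketch of the components named in the abstract: a bi-directional LSTM selector, LSTM generators for the forward and backward GANs, and a cycle-reconstruction term. All module names, feature and hidden dimensions, the soft frame-selection scheme, and the simplified loss are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of Cycle-SUM-style components (assumed structure, not the
# authors' released implementation). Adversarial losses are omitted for brevity.
import torch
import torch.nn as nn


class Selector(nn.Module):
    """Bi-directional LSTM that scores the importance of each frame."""
    def __init__(self, feat_dim=1024, hidden=256):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Sequential(nn.Linear(2 * hidden, 1), nn.Sigmoid())

    def forward(self, frames):                  # frames: (B, T, feat_dim)
        h, _ = self.lstm(frames)
        return self.score(h)                    # importance in [0, 1] per frame


class Generator(nn.Module):
    """LSTM generator used in both the forward and backward GANs."""
    def __init__(self, feat_dim=1024, hidden=256):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.out = nn.Linear(hidden, feat_dim)

    def forward(self, x):
        h, _ = self.lstm(x)
        return self.out(h)


def cycle_reconstruction_loss(frames, selector, g_fwd, g_bwd, l1=nn.L1Loss()):
    """One simplified cycle step: select frames, reconstruct the original
    video from the summary (forward direction), then invert the process
    (backward direction) and penalize the reconstruction error."""
    scores = selector(frames)                   # (B, T, 1)
    summary = frames * scores                   # soft frame selection (assumed)
    recon_original = g_fwd(summary)             # summary -> original
    recon_summary = g_bwd(recon_original)       # original -> summary
    return l1(recon_original, frames) + l1(recon_summary, summary)
```

In the full model, discriminators would additionally judge the reconstructed sequences, and the cycle term above would act as the consistency regularizer described in the abstract.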
Funding Information
  • National Natural Science Foundation of China (61872122, 61502131)
  • Zhejiang Provincial Natural Science Foundation of China (LY18F020015)
