Drift-Proof Tracking With Deep Reinforcement Learning
- 19 February 2021
- Research article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Multimedia
- Vol. 24, pp. 609-624
- https://doi.org/10.1109/tmm.2021.3056896
Abstract
Object tracking is an essential and challenging sub-domain of computer vision owing to its wide range of applications and the complexity of real-life situations. It has been studied extensively over the last decade, leading to the proposal of several tracking frameworks and approaches. Recently, the introduction of reinforcement learning and the Actor-Critic framework has effectively improved the tracking speed of deep learning trackers. However, most existing deep reinforcement learning trackers suffer a slight performance degradation, mainly owing to drift. Drift threatens tracking performance and may lead to loss of the tracked target. Herein, we propose a drift-proof tracker with deep reinforcement learning that aims to improve tracking performance by counteracting drift while maintaining its real-time advantage. We utilize a reward function based on the Distance-IoU (DIoU) metric to guide the reinforcement learning and alleviate the drift caused by the trained model. Furthermore, double negative samples (hard negative and drift samples) are constructed during tracking for network initialization, and the loss is then calculated with a small-error-friendly loss function. Therefore, our tracker can better discriminate between positive and negative samples and correct the predicted bounding boxes when drift occurs. Meanwhile, a generative adversarial network is adopted for positive sample augmentation. Extensive experimental results on multiple popular benchmarks show that our algorithm effectively reduces the occurrence of drift and boosts tracking performance compared to other state-of-the-art trackers.
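The DIoU metric referenced in the abstract penalizes the normalized distance between box centers in addition to the overlap term, which is what makes it useful for discouraging drifted predictions. A minimal sketch of the standard DIoU computation is below; the function name, the `(x1, y1, x2, y2)` box format, and its use here are illustrative assumptions — the paper's exact reward formulation is not given in this record.

```python
def diou(box_a, box_b):
    """Distance-IoU between two axis-aligned boxes given as (x1, y1, x2, y2).

    DIoU = IoU - rho^2 / c^2, where rho is the distance between the box
    centers and c is the diagonal length of the smallest enclosing box.
    Illustrative sketch only; not the paper's exact reward function.
    """
    # Intersection area
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Standard IoU term
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    iou = inter / (area_a + area_b - inter)

    # Squared distance between box centers (centers are midpoints of corners)
    rho2 = (((box_a[0] + box_a[2]) - (box_b[0] + box_b[2])) ** 2
            + ((box_a[1] + box_a[3]) - (box_b[1] + box_b[3])) ** 2) / 4.0

    # Squared diagonal of the smallest enclosing box
    ex1, ey1 = min(box_a[0], box_b[0]), min(box_a[1], box_b[1])
    ex2, ey2 = max(box_a[2], box_b[2]), max(box_a[3], box_b[3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2

    return iou - rho2 / c2
```

Unlike plain IoU, DIoU remains informative when the predicted box has drifted away from the target: even at zero overlap the center-distance term still supplies a gradient (or, in a reward, a graded penalty) pulling the prediction back toward the ground truth.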
Funding Information
- Science and Technology Major Project of Hubei Province (2019AEA170)
- Natural Science Foundation of Hubei Province (2018CFA050)