
Occlusion-Aware Fragment-Based Tracking With Spatial-Temporal Consistency


Indexed by: Journal paper

Date of Publication: 2016-08-01

Journal: IEEE TRANSACTIONS ON IMAGE PROCESSING

Included Journals: SCIE, EI, Scopus

Volume: 25

Issue: 8

Page Number: 3814-3825

ISSN No.: 1057-7149

Key Words: Visual tracking; fragment-based appearance model; spatial-temporal consistency; occlusion model

Abstract: In this paper, we present a robust tracking method that exploits a fragment-based appearance model and considers both temporal continuity and discontinuity information. From the perspective of probability theory, the proposed tracking algorithm can be viewed as a two-stage optimization problem. In the first stage, using the estimated occlusion state as a prior, the optimal state of the tracked object is obtained by solving an optimization problem whose objective function combines the classification score, the occlusion prior, and temporal continuity information. In the second stage, we propose a discriminative occlusion model that exploits both foreground and background information to detect possible occlusions and also models the consistency of occlusion labels across frames. In addition, a simple yet effective training strategy is introduced during model training (and updating), with which the effects of spatial-temporal consistency are properly weighted. The proposed tracker is evaluated on a recent benchmark dataset, and the results demonstrate that it performs favorably against other state-of-the-art tracking algorithms.
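The two-stage structure described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual implementation: the weights (`alpha`, `beta`, `gamma`, `lam`), the linear combination of terms, and the thresholded foreground/background comparison are all simplified assumptions introduced here to show how a state-estimation stage and an occlusion-label update stage could interact.

```python
import numpy as np

def track_frame(candidates, clf_score, occ_prior, prev_state,
                alpha=1.0, beta=0.5, gamma=0.5):
    """Stage 1 (sketch): pick the candidate state maximizing a combined
    objective of classification score, occlusion prior, and temporal
    continuity.

    candidates : (N, d) array of candidate states (e.g. box centers)
    clf_score  : (N,) classification scores per candidate
    occ_prior  : (N,) occlusion-prior scores (higher = less occluded)
    prev_state : (d,) previous frame's estimated state
    alpha, beta, gamma : hypothetical weights, not from the paper
    """
    # Temporal-continuity term: penalize distance from the previous state.
    continuity = -np.linalg.norm(candidates - prev_state, axis=1)
    objective = alpha * clf_score + beta * occ_prior + gamma * continuity
    return candidates[np.argmax(objective)]

def update_occlusion(fg_score, bg_score, prev_labels, lam=0.3):
    """Stage 2 (sketch): discriminative per-fragment occlusion labels.

    fg_score, bg_score : (M,) foreground/background evidence per fragment
    prev_labels        : (M,) occlusion labels (1 = occluded) from the
                         previous frame
    lam                : hypothetical temporal-consistency weight
    """
    # A fragment that looks more like background is likely occluded;
    # blending with the previous labels enforces temporal consistency.
    evidence = (bg_score > fg_score).astype(float)
    return (1 - lam) * evidence + lam * prev_labels
```

In each frame, stage 2's labels would feed back into stage 1 of the next frame as the occlusion prior, which is the coupling the abstract describes between the occlusion model and the state estimation.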
