Paper
Fusion of cues for occlusion handling in tracking with particle filters
30 October 2009
Xiaobo Chen, Xinting Pan, Wei Wang
Proceedings Volume 7495, MIPPR 2009: Automatic Target Recognition and Image Analysis; 749514 (2009) https://doi.org/10.1117/12.832478
Event: Sixth International Symposium on Multispectral Image Processing and Pattern Recognition, 2009, Yichang, China
Abstract
In this paper, a new approach is presented for tracking an object accurately and steadily when the target encounters occlusion in video sequences. First, we use the Canny algorithm to extract the edges of the object. The edge pixels are classified as foreground or background in each frame using background subtraction. In the next stage, a set of cues, including a motion model, an elliptical shape model, a spatial-color mixture-of-Gaussians appearance model, and an edge orientation histogram model, is fused in a principled manner. Each of these cues is modeled by a data likelihood function. A particle filter algorithm is then used for tracking, and the particles are re-sampled based on the fusion of the cues. Results from simulations and experiments with real video sequences show the effectiveness of our approach for tracking people under occlusion conditions.
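The fusion scheme described above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes a simplified 1-D state, models each cue's data likelihood as a Gaussian around a hypothetical per-cue measurement, multiplies the per-cue likelihoods to obtain the fused particle weights, and re-samples the particles from those weights, as the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

def fused_likelihood(particles, cue_measurements, cue_sigmas):
    # Each cue contributes a Gaussian data likelihood around its own
    # (hypothetical) measurement; fusing the cues multiplies the
    # per-cue likelihoods for every particle.
    w = np.ones(len(particles))
    for obs, sigma in zip(cue_measurements, cue_sigmas):
        w *= np.exp(-0.5 * ((particles - obs) / sigma) ** 2)
    return w

def particle_filter_step(particles, cue_measurements, cue_sigmas,
                         motion_std=1.0):
    # Predict: propagate particles with a random-walk motion model.
    particles = particles + rng.normal(0.0, motion_std, size=len(particles))
    # Update: weight particles by the fused multi-cue likelihood.
    w = fused_likelihood(particles, cue_measurements, cue_sigmas)
    w /= w.sum()
    # Resample: draw particles with probability proportional to weight.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Toy run: track a target moving +0.5 per frame with three noisy cues
# (stand-ins for e.g. shape, appearance, and edge-orientation cues).
true_pos = 0.0
particles = rng.normal(0.0, 5.0, size=500)
cue_sigmas = (0.5, 1.0, 1.5)
for _ in range(30):
    true_pos += 0.5
    cues = [true_pos + rng.normal(0.0, s) for s in cue_sigmas]
    particles = particle_filter_step(particles, cues, cue_sigmas)
estimate = particles.mean()
```

Multiplying the cue likelihoods corresponds to treating the cues as conditionally independent given the state; a cue that strongly contradicts the others (e.g. appearance during occlusion) then down-weights particles only where the remaining cues also disagree.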
© (2009) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Xiaobo Chen, Xinting Pan, and Wei Wang "Fusion of cues for occlusion handling in tracking with particle filters", Proc. SPIE 7495, MIPPR 2009: Automatic Target Recognition and Image Analysis, 749514 (30 October 2009); https://doi.org/10.1117/12.832478
KEYWORDS
Motion models; Particle filters; Particles; Detection and tracking algorithms; Video; Video surveillance; Data fusion