Tracking multiple people with recovery from partial and total occlusion

Charay Lerdsudwichai, Mohamed Abdel-Mottaleb, A. Nasser Ansari

Research output: Contribution to journal › Article › peer-review

43 Scopus citations


Robust tracking of multiple people in video sequences is a challenging task. In this paper, we present an algorithm for tracking the faces of multiple people even in cases of total occlusion. Faces are detected first; then a model for each person is built. The models are handed over to the tracking module, which is based on the mean shift algorithm, where each face is represented by the non-parametric distribution of the colors in the face region. The mean shift tracking algorithm is robust to partial occlusion and rotation, and is computationally efficient, but it does not deal with the problem of total occlusion. Our algorithm overcomes this problem by detecting the occlusion using an occlusion grid, and uses a non-parametric distribution of the color of the occluded person's clothing to re-identify that person after the occlusion ends. Our algorithm uses the speed and the trajectory of each occluded person to predict the locations that should be searched after occlusion ends. It integrates multiple features to handle tracking multiple people in cases of partial and total occlusion. Experiments on a large set of video clips demonstrate the robustness of the algorithm and its capability to correctly track multiple people even when faces are temporarily occluded by other faces or by other objects in the scene.
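The core tracking idea described above can be illustrated with a minimal sketch: a non-parametric color model (a normalized histogram) of the face region, a back-projection step that weights each pixel by the model's probability of its color, and a mean shift loop that moves the search window toward the weighted centroid until it converges on the mode. This is a simplified single-channel illustration of generic mean shift tracking, not the authors' implementation; the function names, bin count, and convergence threshold are illustrative assumptions.

```python
import numpy as np

def color_histogram(patch, bins=16):
    # Non-parametric color model: normalized histogram of quantized intensities.
    h, _ = np.histogram(patch, bins=bins, range=(0, 256))
    return h / max(h.sum(), 1)

def back_projection(image, hist, bins=16):
    # Weight every pixel by the model probability of its quantized color.
    idx = (image.astype(int) * bins) // 256
    return hist[idx]

def mean_shift(weights, box, n_iter=20, eps=1.0):
    # box = (x, y, w, h); repeatedly shift the window toward the
    # weighted centroid of the back-projected probabilities.
    x, y, w, h = box
    for _ in range(n_iter):
        roi = weights[y:y + h, x:x + w]
        total = roi.sum()
        if total == 0:
            break  # no model-colored pixels in the window (e.g. full occlusion)
        ys, xs = np.mgrid[0:h, 0:w]
        dx = (roi * xs).sum() / total - (w - 1) / 2
        dy = (roi * ys).sum() / total - (h - 1) / 2
        if np.hypot(dx, dy) < eps:
            break  # converged on the local mode
        x = min(max(int(round(x + dx)), 0), weights.shape[1] - w)
        y = min(max(int(round(y + dy)), 0), weights.shape[0] - h)
    return (x, y, w, h)
```

In a full tracker of the kind the paper describes, the same histogram machinery would also model each person's clothing, and a motion-based prediction would re-seed the search window after a total occlusion ends.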

Original language: English (US)
Pages (from-to): 1059-1070
Number of pages: 12
Journal: Pattern Recognition
Issue number: 7
State: Published - Jul 1 2005

Keywords

  • Face tracking
  • Multiple people
  • Occlusion recovery
  • Video

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Signal Processing
  • Electrical and Electronic Engineering

