Real-time body tracking using a Gaussian process latent variable model

Shaobo Hou, Aphrodite Galata, Fabrice Caillette, Neil Thacker, Paul Bromiley

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


    In this paper, we present a tracking framework for capturing articulated human motions in real time, without the need for attaching markers onto the subject's body. This is achieved by first obtaining a low dimensional representation of the training motion data, using a nonlinear dimensionality reduction technique called back-constrained GPLVM. A prior dynamics model is then learnt from this low dimensional representation by partitioning the motion sequences into elementary movements using an unsupervised EM clustering algorithm. The temporal dependencies between these elementary movements are efficiently captured by a Variable Length Markov Model. The learnt dynamics model is used to bias the propagation of candidate pose feature vectors in the low dimensional space. By combining this with an efficient volumetric reconstruction algorithm, our framework can quickly evaluate each candidate pose against image evidence captured from multiple views. We present results that show our system can accurately track complex structured activities such as ballet dancing in real time. ©2007 IEEE.
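The tracking loop the abstract describes — propagate candidate poses in the low dimensional latent space under a learnt dynamics prior, then weight each candidate against image evidence — can be sketched as a simple particle filter. This is a minimal, hypothetical illustration: the toy `dynamics_mean` and `likelihood` functions, the 2D latent space, and all parameters are stand-ins (the paper learns the dynamics via EM clustering plus a Variable Length Markov Model, and scores poses against a volumetric reconstruction from multiple views).

```python
# Hypothetical sketch of dynamics-biased candidate-pose propagation in a
# low-dimensional latent space (a stand-in for the paper's GPLVM space).
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 2      # assumed latent dimensionality, for illustration
N_PARTICLES = 100   # number of candidate pose feature vectors

def dynamics_mean(x):
    """Toy stand-in for the learnt dynamics prior: drift toward a
    cluster centre. The paper instead uses EM-clustered elementary
    movements whose ordering is captured by a VLMM."""
    centre = np.array([1.0, 0.0])
    return x + 0.1 * (centre - x)

def likelihood(x, observation):
    """Toy image-evidence score: Gaussian agreement with the observation.
    The paper evaluates each candidate against a volumetric
    reconstruction built from multiple camera views."""
    return np.exp(-0.5 * np.sum((x - observation) ** 2))

# Initialise candidates around the origin of the latent space.
particles = rng.normal(0.0, 0.5, size=(N_PARTICLES, LATENT_DIM))
observation = np.array([0.8, 0.1])  # fabricated "evidence" for the demo

for _ in range(10):  # a few tracking steps
    # Propagate: dynamics prior plus process noise.
    particles = dynamics_mean(particles) + rng.normal(0.0, 0.05, particles.shape)
    # Weight each candidate pose by the image evidence.
    weights = np.array([likelihood(p, observation) for p in particles])
    weights /= weights.sum()
    # Resample candidates in proportion to their weights.
    idx = rng.choice(N_PARTICLES, size=N_PARTICLES, p=weights)
    particles = particles[idx]

estimate = particles.mean(axis=0)  # tracked pose in latent space
```

Biasing propagation with the learnt dynamics, rather than diffusing candidates isotropically, is what keeps the number of candidates small enough for real-time evaluation.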
    Original language: English
    Title of host publication: Proceedings of the IEEE International Conference on Computer Vision
    Publication status: Published - 2007
    Event: 2007 IEEE 11th International Conference on Computer Vision, ICCV - Rio de Janeiro
    Duration: 1 Jul 2007 → …


    Conference: 2007 IEEE 11th International Conference on Computer Vision, ICCV
    City: Rio de Janeiro
    Period: 1/07/07 → …


    • Computer Science, Artificial Intelligence
    • Engineering, Electrical & Electronic
    • Imaging Science & Photographic Technology


