A real-time hand tracker using variable-length Markov models of behaviour

Nikolay Stefanov, Aphrodite Galata, Roger Hubbold

    Research output: Contribution to journal › Article › peer-review

    Abstract

    We present a novel approach for visual tracking of structured behaviour as observed in human-computer interaction. An automatically acquired variable-length Markov model is used to represent the high-level structure and temporal ordering of gestures. Continuous estimation of hand posture is handled by combining the model with annealed particle filtering. The stochastic simulation updates and automatically switches between different model representations of hand posture that correspond to distinct gestures. The implementation executes in real time and demonstrates significant improvement in robustness over comparable methods. We provide a measurement of user performance when our method is applied to a Fitts' law drag-and-drop task, and an analysis of the effects of latency that it introduces. © 2007 Elsevier Inc. All rights reserved.
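    The following is a minimal sketch, not the authors' implementation, of the two ingredients the abstract combines: a variable-length Markov model (VLMM) over discrete gesture symbols, and a set of particles whose gesture labels are switched according to the VLMM's predictive distribution. The gesture alphabet, training sequence, context order, and switching probability are all illustrative assumptions; the annealed likelihood evaluation of hand posture described in the paper is omitted.

```python
import random
from collections import Counter, defaultdict

class VLMM:
    """Variable-length Markov model: next-symbol counts for every context
    up to max_order, with fallback to shorter contexts at prediction time."""
    def __init__(self, max_order=3):
        self.max_order = max_order
        self.counts = defaultdict(Counter)  # context tuple -> Counter of next symbols

    def train(self, sequence):
        for i, sym in enumerate(sequence):
            for order in range(self.max_order + 1):
                if i - order < 0:
                    break
                context = tuple(sequence[i - order:i])
                self.counts[context][sym] += 1

    def predict(self, history):
        # Use the longest suffix of the history seen during training.
        for order in range(min(self.max_order, len(history)), -1, -1):
            context = tuple(history[len(history) - order:])
            if context in self.counts:
                total = sum(self.counts[context].values())
                return {s: c / total for s, c in self.counts[context].items()}
        return {}

# Toy gesture alphabet and training sequence (assumed for illustration).
training = ["point", "grab", "drag", "release"] * 20
vlmm = VLMM(max_order=2)
vlmm.train(training)

# Each particle carries a gesture label; switching between gesture-specific
# posture models is driven by the VLMM's predictive distribution.
def propagate_labels(particles, history, switch_prob=0.1):
    prior = vlmm.predict(history)
    out = []
    for label in particles:
        if prior and random.random() < switch_prob:
            labels, probs = zip(*prior.items())
            label = random.choices(labels, weights=probs)[0]
        out.append(label)
    return out

particles = ["grab"] * 50
history = ["point", "grab"]
particles = propagate_labels(particles, history)
print(Counter(particles))  # most particles stay on "grab"; switchers move to "drag"
```

    In the full method, the switched label would select the corresponding gesture-specific posture model, and the particle weights would then be computed by the annealed particle filter's image likelihood before resampling.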
    Original language: English
    Pages (from-to): 98-115
    Number of pages: 17
    Journal: Computer Vision and Image Understanding
    Volume: 108
    Issue number: 1-2
    DOIs
    Publication status: Published - Oct 2007

    Keywords

    • Behaviour modelling
    • Fitts' law
    • Gesture recognition
    • Hand tracking
    • Human-computer interaction
    • Latency
    • Particle filtering
    • Real-time
    • Variable length Markov models
