Abstract
We present a novel approach to visual tracking of the structured behaviour observed in human-computer interaction. An automatically acquired variable-length Markov model represents the high-level structure and temporal ordering of gestures. Continuous estimation of hand posture is handled by combining this model with annealed particle filtering. The stochastic simulation updates the posture estimate and automatically switches between different model representations of hand posture, each corresponding to a distinct gesture. The implementation executes in real time and demonstrates a significant improvement in robustness over comparable methods. We measure user performance when the method is applied to a Fitts' law drag-and-drop task and analyse the effects of the latency it introduces. © 2007 Elsevier Inc. All rights reserved.
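As a rough illustration of the combination the abstract describes, the sketch below pairs a toy stand-in for a variable-length Markov model (a small back-off context table over gesture labels) with a simplified annealed particle filter over a two-dimensional stand-in for hand posture. All names, constants, and the Gaussian observation model are hypothetical simplifications chosen for this sketch, not the paper's implementation.

```python
# Minimal, hypothetical sketch: an annealed particle filter whose discrete
# gesture label is proposed by a back-off context table standing in for a
# variable-length Markov model. Names, constants, and the toy Gaussian
# observation model are assumptions for illustration, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

GESTURES = ["point", "grasp", "release"]

# Toy "VLMM": longest matching context of recent gestures -> next-gesture
# distribution. A real VLMM stores contexts of varying length learnt from data.
CONTEXTS = {
    ("point",): [0.70, 0.25, 0.05],
    ("grasp",): [0.10, 0.60, 0.30],
    ("release",): [0.50, 0.20, 0.30],
}

def predict_gesture(history):
    """VLMM-style lookup: back off to shorter contexts until one matches."""
    for k in range(len(history), 0, -1):
        ctx = tuple(history[-k:])
        if ctx in CONTEXTS:
            return np.asarray(CONTEXTS[ctx])
    return np.full(len(GESTURES), 1.0 / len(GESTURES))

def annealed_step(particles, observation, layers=(0.25, 0.5, 1.0), noise=0.05):
    """One annealed particle filter update of a continuous posture vector."""
    for beta in layers:
        d2 = np.sum((particles - observation) ** 2, axis=1)
        w = np.exp(-beta * d2)           # likelihood raised to annealing power
        w /= w.sum()
        idx = rng.choice(len(particles), size=len(particles), p=w)  # resample
        particles = particles[idx] + rng.normal(0.0, noise, particles.shape)
    return particles

# Per-gesture posture models, reduced here to a different prior mean each;
# the VLMM's prediction selects which one the particles are steered towards.
POSTURE_PRIOR = {"point": [0.0, 0.0], "grasp": [1.0, 1.0], "release": [2.0, 0.0]}

history = ["point"]
particles = rng.normal(0.0, 0.2, size=(200, 2))  # 2-D stand-in for hand posture

for t in range(5):
    gesture = str(rng.choice(GESTURES, p=predict_gesture(history)))
    history.append(gesture)
    observation = np.asarray(POSTURE_PRIOR[gesture]) + rng.normal(0.0, 0.1, 2)
    particles = annealed_step(particles, observation)
    print(t, gesture, particles.mean(axis=0).round(2))
```

In the paper's setting the predicted gesture determines which posture model drives the particle propagation; here that switching is reduced to selecting a different prior mean per gesture. For reference, the drag-and-drop evaluation mentioned in the final sentence is assessed against Fitts' law, which models movement time as MT = a + b·log₂(D/W + 1) for target distance D and target width W.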
Original language | English |
---|---|
Pages (from-to) | 98-115 |
Number of pages | 17 |
Journal | Computer Vision and Image Understanding |
Volume | 108 |
Issue number | 1-2 |
DOIs | |
Publication status | Published - Oct 2007 |
Keywords
- Behaviour modelling
- Fitts' law
- Gesture recognition
- Hand tracking
- Human-computer interaction
- Latency
- Particle filtering
- Real-time
- Variable-length Markov models