Real-time human motion analysis and IK-based human figure control

S. Yonemoto, D. Arita, R. Taniguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

22 Citations (Scopus)

Abstract

The paper presents real-time human motion analysis based on real-time inverse kinematics. Our purpose is to realize a mechanism of human-machine interaction via human gestures, and, as a first step, we have developed a computer-vision-based human motion analysis system. In general, "smart" man-machine interaction requires a real-time full-body human motion capture system without special devices or markers. However, since such a vision-based human motion capture system is essentially unstable and can only acquire partial information because of self-occlusion, we have to introduce a robust pose estimation strategy, or an appropriate human motion synthesis based on motion filtering. To solve this problem, we have developed a method based on inverse kinematics, which can estimate human postures from limited perceptual cues such as the positions of the head, hands, and feet. We outline a real-time, on-line human motion capture system and demonstrate a simple interaction system built on top of it.
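The abstract describes estimating a full-body posture from a handful of end-effector positions (head, hands, feet) via inverse kinematics. The paper's specific IK formulation is not reproduced here; as a generic illustration of the idea, the sketch below solves a planar IK problem with cyclic coordinate descent (CCD), a common iterative technique that swings each joint in turn so the end effector approaches a target position (all function and variable names are my own, not the authors'):

```python
import math

def ccd_ik(lengths, target, iters=100, tol=1e-4):
    """Cyclic Coordinate Descent IK for a planar link chain rooted at the origin.

    lengths -- link lengths from base to end effector
    target  -- (x, y) goal position for the end effector (e.g. a tracked hand)
    Returns the list of relative joint angles (radians).
    """
    n = len(lengths)
    angles = [0.0] * n  # relative joint angles, initially a straight chain

    def forward(angles):
        # Forward kinematics: positions of each joint plus the end effector.
        pts = [(0.0, 0.0)]
        a = 0.0
        for ang, length in zip(angles, lengths):
            a += ang
            x, y = pts[-1]
            pts.append((x + length * math.cos(a), y + length * math.sin(a)))
        return pts

    for _ in range(iters):
        # Sweep joints from the end effector back to the base.
        for i in range(n - 1, -1, -1):
            pts = forward(angles)
            jx, jy = pts[i]          # joint being adjusted
            ex, ey = pts[-1]         # current end-effector position
            # Rotate joint i so the effector swings toward the target.
            cur = math.atan2(ey - jy, ex - jx)
            des = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += des - cur
        ex, ey = forward(angles)[-1]
        if math.hypot(ex - target[0], ey - target[1]) < tol:
            break
    return angles

# Example: a two-link "arm" (shoulder + elbow) reaching toward a hand cue.
arm_angles = ccd_ik([1.0, 1.0], (1.0, 1.0))
```

Because each joint update is a closed-form angle correction, CCD is cheap per iteration, which is why variants of iterative IK are attractive for real-time figure control driven by a small set of tracked points.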

Original language: English
Title of host publication: Proceedings - Workshop on Human Motion, HUMO 2000
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 149-154
Number of pages: 6
ISBN (Electronic): 0769509398, 9780769509396
DOIs
Publication status: Published - 2000
Event: Workshop on Human Motion, HUMO 2000 - Austin, United States
Duration: Dec 7 2000 - Dec 8 2000

Publication series

Name: Proceedings - Workshop on Human Motion, HUMO 2000

Other

Other: Workshop on Human Motion, HUMO 2000
Country: United States
City: Austin
Period: 12/7/00 - 12/8/00

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Signal Processing

