Real-time visually guided human figure control using IK-based motion synthesis

S. Yonemoto, D. Arita, R. I. Taniguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

This paper presents a real-time human figure motion control method using color blob tracking and human motion synthesis based on real-time inverse kinematics. Our purpose is to map human motion in the real world seamlessly into virtual environments. In general, virtual environment applications such as 'smart' man-machine interaction require real-time human full-body motion capturing systems that use no special devices or markers. However, since such vision-based human motion capturing systems are essentially unstable and can acquire only partial information because of self-occlusion, we have to introduce a robust pose estimation strategy or an appropriate human motion synthesis based on motion filtering. In this paper, we demonstrate a real-time, online real-virtual interaction system that realizes human full-body motion capturing and synthesis.
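
To illustrate the kind of per-frame inverse-kinematics solve that tracked end-effector positions (e.g. color-blob centroids) could drive, here is a minimal sketch of cyclic coordinate descent (CCD) IK for a planar joint chain. This is not the authors' implementation; the function, its parameters, and the example values are illustrative assumptions only.

```python
# Minimal CCD inverse-kinematics sketch (illustrative only, not the paper's method).
import math


def ccd_ik(joints, lengths, target, iterations=10, tolerance=1e-3):
    """Adjust joint angles so the chain's end effector approaches `target`.

    joints:  list of relative joint angles (radians), one per bone
    lengths: list of bone lengths, same size as `joints`
    target:  (x, y) goal position, e.g. a tracked color-blob centroid
    """
    def forward(angles):
        # Forward kinematics: accumulate joint positions along the chain.
        positions = [(0.0, 0.0)]
        total = 0.0
        for a, l in zip(angles, lengths):
            total += a
            x, y = positions[-1]
            positions.append((x + l * math.cos(total), y + l * math.sin(total)))
        return positions

    angles = list(joints)
    for _ in range(iterations):
        end = forward(angles)[-1]
        if math.hypot(end[0] - target[0], end[1] - target[1]) < tolerance:
            break
        # Sweep from the last joint to the first, rotating each joint so the
        # end effector swings toward the target.
        for i in reversed(range(len(angles))):
            positions = forward(angles)
            pivot, end = positions[i], positions[-1]
            a_end = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
            a_tgt = math.atan2(target[1] - pivot[1], target[0] - pivot[0])
            angles[i] += a_tgt - a_end
    return angles


# Example: a three-bone "arm" reaching for a tracked point in one frame.
solved = ccd_ik([0.1, 0.1, 0.1], [1.0, 1.0, 0.5], target=(1.5, 1.2))
print([round(a, 3) for a in solved])
```

In a real-time capture loop, a solve like this would be repeated every frame with the latest tracked target, with motion filtering smoothing the result when tracking is lost or occluded.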

Original language: English
Title of host publication: Proceedings - 5th IEEE Workshop on Applications of Computer Vision, WACV 2000
Publisher: IEEE Computer Society
Pages: 194-200
Number of pages: 7
ISBN (Electronic): 0769508138
DOIs
Publication status: Published - 2000
Event: 5th IEEE Workshop on Applications of Computer Vision, WACV 2000 - Palm Springs, United States
Duration: Dec 4 2000 - Dec 6 2000

Publication series

Name: Proceedings of IEEE Workshop on Applications of Computer Vision
Volume: 2000-January
ISSN (Print): 2158-3978
ISSN (Electronic): 2158-3986

Other

Other: 5th IEEE Workshop on Applications of Computer Vision, WACV 2000
Country: United States
City: Palm Springs
Period: 12/4/00 - 12/6/00

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Computer Science Applications

