Avatar motion control by user body postures

Satoshi Yonemoto, Hiroshi Nakano, Rin-ichiro Taniguchi

Research output: Contribution to conference › Paper › peer-review

1 Citation (Scopus)

Abstract

This paper describes avatar motion control driven by the user's body postures. Our goal is to map human motion in the real world seamlessly into virtual environments, and we expect direct human motion sensing to become part of future interfaces. To make computing systems better suited to their users, we have developed a computer-vision-based avatar motion control method. Human motion sensing is based on skin-color blob tracking, and our method generates realistic avatar motion from the sensing data. Our framework uses virtual scene context as a priori knowledge: we assume that virtual objects can afford the avatar's actions, that is, the virtual environment provides action information for the avatar. The avatar's motion is then controlled by simulating this idea of affordance, extended into the virtual environment.
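The record does not include the paper's implementation details, but the skin-color blob tracking mentioned in the abstract can be sketched minimally: classify pixels with a simple RGB skin rule, then track the centroid of the resulting blob. All thresholds, the synthetic frame, and the function names below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def skin_mask(rgb):
    """Mark pixels as skin with a simple RGB rule (illustrative thresholds)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) \
        & ((r - np.minimum(g, b)) > 15)

def blob_centroid(mask):
    """Centroid (row, col) of the skin pixels; None if no skin was found."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())

# Synthetic frame: a roughly skin-toned patch on a dark background.
frame = np.zeros((120, 160, 3), dtype=np.uint8)
frame[30:60, 50:90] = (210, 150, 120)

cy, cx = blob_centroid(skin_mask(frame))
print(cy, cx)  # centroid of the patch: 44.5 69.5
```

In a real tracker the per-frame centroids of several such blobs (head, hands) would drive the avatar's posture estimate; this sketch only covers the segmentation and localization step.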

Original language: English
Pages: 347-350
Number of pages: 4
DOIs
Publication status: Published - 2003
Event: 2003 Multimedia Conference - Proceedings of the 11th ACM International Conference on Multimedia, MM'03 - Berkeley, CA, United States
Duration: Nov 4, 2003 - Nov 6, 2003


All Science Journal Classification (ASJC) codes

  • Computer Science (all)

