A real-time motion capture system with multiple camera fusion

Satoshi Yonemoto, Asuka Matsumoto, Daisaku Arita, Rin-ichiro Taniguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

32 Citations (Scopus)

Abstract

This paper presents a real-time motion capture system for 3D multi-part objects, whose purpose is to map objects in the real world seamlessly and easily into virtual environments. In general, virtual environment applications such as seamless man-machine interaction require the system to estimate accurate motion parameters in real time for natural objects such as human bodies. To meet this requirement, we have been developing a vision-based motion capture system which reconstructs time-varying motion parameters of 3D multi-part objects. The advantage of such a vision-based system is that it can acquire other scene parameters, such as shape and surface properties, at the same time and with the same equipment used to measure motion. In this paper, as our first system, we have implemented a color-marker-based motion capture system that realizes multi-view fusion, and we have demonstrated that our motion capture and reconstruction system works in real time on a PC cluster.
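The multi-view fusion described in the abstract combines 2D color-marker detections from several calibrated cameras into 3D marker positions. As an illustration only, the following is a minimal sketch of linear (DLT) triangulation from multiple calibrated views, assuming pinhole cameras with known 3x4 projection matrices; it is not the authors' implementation and omits marker detection, labeling, multi-part body-model fitting, and the PC-cluster distribution described in the paper.

# A minimal sketch of multi-view marker fusion (illustrative only), assuming each
# calibrated camera provides a 3x4 projection matrix P and a 2D detection (u, v)
# of the same color marker. Linear (DLT) triangulation; not the authors' code.
import numpy as np

def triangulate_marker(projections, detections):
    """Recover a 3D marker position from two or more camera views.

    projections : list of 3x4 camera projection matrices
    detections  : list of (u, v) image coordinates, one per camera
    Returns the 3D point minimizing the algebraic reprojection error.
    """
    rows = []
    for P, (u, v) in zip(projections, detections):
        # Each view contributes two linear constraints on the homogeneous point X:
        #   u * (P[2] @ X) = P[0] @ X,   v * (P[2] @ X) = P[1] @ X
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # Solve A X = 0 in the least-squares sense via SVD, then dehomogenize.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

if __name__ == "__main__":
    # Two hypothetical cameras observing a marker at (0.1, 0.2, 3.0).
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                      # reference camera
    P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])      # shifted baseline
    X_true = np.array([0.1, 0.2, 3.0, 1.0])
    uv = []
    for P in (P1, P2):
        x = P @ X_true
        uv.append((x[0] / x[2], x[1] / x[2]))
    print(triangulate_marker([P1, P2], uv))   # approximately [0.1, 0.2, 3.0]

Running the demo reconstructs the synthetic marker position from its two noise-free projections; in a real multi-camera setup the same least-squares formulation simply gains two rows per additional view.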

Original language: English
Title of host publication: Proceedings - International Conference on Image Analysis and Processing, ICIAP 1999
Pages: 600-605
Number of pages: 6
DOIs
Publication status: Published - 1999
Event: 10th International Conference on Image Analysis and Processing, ICIAP 1999 - Venice, Italy
Duration: Sept 27, 1999 - Sept 29, 1999

Publication series

Name: Proceedings - International Conference on Image Analysis and Processing, ICIAP 1999

Other

Other: 10th International Conference on Image Analysis and Processing, ICIAP 1999
Country/Territory: Italy
City: Venice
Period: 9/27/99 - 9/29/99

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
