Representing a Partially Observed Non-Rigid 3D Human Using Eigen-Texture and Eigen-Deformation

Ryosuke Kimura, Akihiko Sayo, Fabian Lorenzo Dayrit, Yuta Nakashima, Hiroshi Kawasaki, Ambrosio Blanco, Katsushi Ikeuchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Reconstructing the shape and motion of humans from RGB-D data is a challenging problem that has received much attention in recent years. Recent approaches to full-body reconstruction use a statistical shape model, built from accurate full-body scans of people in skin-tight clothes, to complete parts that are invisible due to occlusion. Such a statistical model can still be fitted to an RGB-D measurement of a person in loose clothes, but it cannot describe the resulting deformations, such as clothing wrinkles. Observed surfaces can be reconstructed precisely from the actual measurements, whereas there are no cues for unobserved surfaces. For full-body reconstruction with loose clothes, we propose lower-dimensional embeddings of texture and deformation, referred to as eigen-texture and eigen-deformation, to reproduce views of even unobserved surfaces. Given a full-body reconstruction from a sequence of partial measurements represented as 3D meshes, the texture and deformation of each triangle are embedded using eigen-decomposition. Combined with neural-network-based coefficient regression, our method synthesizes the texture and deformation from arbitrary viewpoints. We evaluate our method on simulated data and visually demonstrate how it works on real data.
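The abstract outlines two components: a per-triangle eigen-decomposition that embeds texture and deformation in a low-dimensional space, and a neural-network regressor that predicts the resulting coefficients so unobserved surfaces can still be synthesized. The following is a minimal sketch of the first component only, not the authors' implementation; the patch size, the number of kept components, and the use of plain PCA via SVD are assumptions made purely for illustration.

```python
# Minimal sketch of an "eigen-texture"-style embedding, assuming plain PCA.
# Patch size and component count are illustrative choices, not from the paper.
import numpy as np

def build_eigen_basis(observations: np.ndarray, n_components: int = 8):
    """Fit a low-dimensional linear (eigen) basis to a set of observations.

    observations : (n_frames, d) array, e.g. the flattened texture of one
                   triangle over the frames in which it was visible.
    Returns the mean, the top eigenvectors, and the per-frame coefficients.
    """
    mean = observations.mean(axis=0)
    centered = observations - mean
    # SVD of the centered data yields the principal (eigen) directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]        # (n_components, d)
    coeffs = centered @ basis.T      # (n_frames, n_components)
    return mean, basis, coeffs

def reconstruct(mean: np.ndarray, basis: np.ndarray, coeff: np.ndarray):
    """Synthesize an observation (texture or deformation) from coefficients."""
    return mean + coeff @ basis

# Toy usage: 30 noisy 16x16 texture patches of a single triangle.
rng = np.random.default_rng(0)
patches = rng.normal(size=(30, 16 * 16))
mean, basis, coeffs = build_eigen_basis(patches, n_components=8)
patch_hat = reconstruct(mean, basis, coeffs[0])
print(patch_hat.shape)  # (256,)
```

Per the abstract, the same kind of low-dimensional coefficients are then regressed by a neural network, so texture and deformation can be synthesized even for triangles that were never directly observed; the exact inputs to that regressor are not detailed here.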

Original language: English
Title of host publication: 2018 24th International Conference on Pattern Recognition, ICPR 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1043-1048
Number of pages: 6
ISBN (Electronic): 9781538637883
DOI: 10.1109/ICPR.2018.8545658
Publication status: Published - Nov 26, 2018
Event: 24th International Conference on Pattern Recognition, ICPR 2018 - Beijing, China
Duration: Aug 20, 2018 - Aug 24, 2018

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 2018-August
ISSN (Print): 1051-4651

Other

Other: 24th International Conference on Pattern Recognition, ICPR 2018
Country: China
City: Beijing
Period: 8/20/18 - 8/24/18

Fingerprint

  • Textures
  • Statistics
  • Texturing
  • Skin
  • Neural networks
  • Decomposition

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition

Cite this

Kimura, R., Sayo, A., Dayrit, F. L., Nakashima, Y., Kawasaki, H., Blanco, A., & Ikeuchi, K. (2018). Representing a Partially Observed Non-Rigid 3D Human Using Eigen-Texture and Eigen-Deformation. In 2018 24th International Conference on Pattern Recognition, ICPR 2018 (pp. 1043-1048). [8545658] (Proceedings - International Conference on Pattern Recognition; Vol. 2018-August). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICPR.2018.8545658
