Realtime novel view synthesis with eigen-texture regression

Yuta Nakashima, Fumio Okura, Norihiko Kawai, Hiroshi Kawasaki, Ambrosio Blanco, Katsushi Ikeuchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Realtime novel view synthesis, which generates a novel view of a real object or scene in realtime, enjoys a wide range of applications including augmented reality, telepresence, and immersive telecommunication. Image-based rendering (IBR) with rough geometry requires only an off-the-shelf camera and is therefore accessible to many users. However, IBR from images in the wild (e.g., when the lighting condition changes or the scene contains objects with specular surfaces) has been a challenging problem due to color discontinuity: IBR with rough geometry selects appropriate images for a given viewpoint, but the image used for a rendering unit (a face or pixel) switches as the viewpoint moves, which can cause noticeable changes in color. We use the eigen-texture technique, which represents the images for a certain face as points in an eigenspace. We propose to regress a new point in this space, which moves smoothly with the viewpoint, so that we can generate an image whose color changes smoothly accordingly. Our regressor is a neural network with a single hidden layer and hyperbolic tangent nonlinearity. We demonstrate the advantages of our IBR approach using our own datasets as well as publicly available datasets for comparison.
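The eigen-texture idea referenced in the abstract can be sketched as follows: the textures observed for one mesh face across input views are stacked as vectors, and PCA yields a low-dimensional eigenspace in which each observed texture becomes a point. A minimal numpy sketch (all array sizes and data are illustrative assumptions, not the paper's):

```python
import numpy as np

# Hypothetical data: 20 observed textures of one mesh face, each a 16x16
# RGB patch flattened to a vector (sizes are illustrative only).
rng = np.random.default_rng(0)
textures = rng.random((20, 16 * 16 * 3))

# Eigen-texture decomposition: center the textures and take the top-k
# principal components via SVD; each texture is then a k-dim point.
mean = textures.mean(axis=0)
U, S, Vt = np.linalg.svd(textures - mean, full_matrices=False)
k = 5
basis = Vt[:k]                         # k eigen-textures (rows)
coords = (textures - mean) @ basis.T   # each row: a point in eigenspace

# A texture is reconstructed from its eigenspace point by reversing
# the projection and adding back the mean.
recon = coords[0] @ basis + mean
```

Because the projection is orthogonal, the rank-k reconstruction is never farther from the original texture than the mean texture alone, which is why a small number of eigen-textures can summarize a face's appearance across views.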
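The abstract describes the regressor as a neural network with a single hidden layer and hyperbolic tangent nonlinearity, mapping a viewpoint to a point in the eigenspace. A hedged sketch of such a regressor trained with plain batch gradient descent (dimensions, learning rate, and training data are made up for illustration):

```python
import numpy as np

# Single-hidden-layer tanh regressor: viewpoint -> eigenspace point.
# All sizes, data, and hyperparameters below are illustrative assumptions.
rng = np.random.default_rng(1)
d_in, d_hid, d_out = 3, 32, 5      # viewpoint dim, hidden units, eigenspace dim
W1 = rng.normal(0.0, 0.5, (d_in, d_hid)); b1 = np.zeros(d_hid)
W2 = rng.normal(0.0, 0.5, (d_hid, d_out)); b2 = np.zeros(d_out)

def predict(v):
    """Map viewpoint(s) v to eigenspace coordinates; smooth in v."""
    return np.tanh(v @ W1 + b1) @ W2 + b2

# Toy supervision: viewpoints paired with known eigenspace points.
views = rng.random((50, d_in))
targets = rng.random((50, d_out))

def mse():
    return float(np.mean((predict(views) - targets) ** 2))

loss_before = mse()
lr = 0.05
for _ in range(500):                     # plain batch gradient descent
    h = np.tanh(views @ W1 + b1)
    err = (h @ W2 + b2) - targets
    dh = (err @ W2.T) * (1.0 - h ** 2)   # backprop through tanh
    W2 -= lr * (h.T @ err) / len(views); b2 -= lr * err.mean(axis=0)
    W1 -= lr * (views.T @ dh) / len(views); b1 -= lr * dh.mean(axis=0)
loss_after = mse()
```

Because tanh is smooth, the regressed point moves continuously as the viewpoint moves; at render time that point would be mapped back through the eigen-texture basis to obtain the face's texture, avoiding the hard switches between source images that cause color discontinuity.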

Original language: English
Title of host publication: British Machine Vision Conference 2017, BMVC 2017
Publisher: BMVA Press
ISBN (Electronic): 190172560X, 9781901725605
Publication status: Published - Jan 1 2017
Event: 28th British Machine Vision Conference, BMVC 2017 - London, United Kingdom
Duration: Sep 4 2017 - Sep 7 2017

Publication series

Name: British Machine Vision Conference 2017, BMVC 2017

Conference

Conference: 28th British Machine Vision Conference, BMVC 2017
Country: United Kingdom
City: London
Period: 9/4/17 - 9/7/17


All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition

Cite this

Nakashima, Y., Okura, F., Kawai, N., Kawasaki, H., Blanco, A., & Ikeuchi, K. (2017). Realtime novel view synthesis with eigen-texture regression. In British Machine Vision Conference 2017, BMVC 2017 (British Machine Vision Conference 2017, BMVC 2017). BMVA Press.

