Texture synthesis for stable planar tracking

Clément Glédel, Hideaki Uchiyama, Yuji Oyamada, Rin-ichiro Taniguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We propose a texture synthesis method that enhances the trackability of a target planar object by embedding natural features into the object during the design process. To transform an input object into an easy-to-track object, we extend an inpainting method so that the features are embedded naturally into the texture. First, a feature-less region of the input object is extracted by feature-distribution-based segmentation. Then, the region is filled with an inpainting method using a feature-rich region retrieved from an object database. Owing to the context-based region search, the inpainted region remains consistent with the object context while improving the feature distribution.
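The first stage of the pipeline described above, extracting a feature-less region by segmenting the image according to the local density of detected features, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Harris-style corner response, the block size, and the thresholds are all hypothetical choices, and the resulting mask is what a subsequent inpainting step would fill with a feature-rich texture.

```python
import numpy as np

def corner_response(gray, k=0.05):
    """Harris-style corner response computed from image gradients
    (illustrative stand-in for a real feature detector)."""
    gy, gx = np.gradient(gray.astype(float))

    def box(a, r=2):
        # Crude box filter to smooth the structure-tensor entries.
        pad = np.pad(a, r, mode="edge")
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += pad[r + dy : r + dy + a.shape[0],
                           r + dx : r + dx + a.shape[1]]
        return out / (2 * r + 1) ** 2

    sxx, syy, sxy = box(gx * gx), box(gy * gy), box(gx * gy)
    det = sxx * syy - sxy ** 2
    trace = sxx + syy
    return det - k * trace ** 2

def featureless_mask(gray, block=16, resp_thresh=10.0, min_corners=3):
    """Mark blocks whose feature (corner) density is low.
    The True region is the candidate area for feature-embedding
    inpainting; thresholds are illustrative, not from the paper."""
    corners = corner_response(gray) > resp_thresh
    h, w = gray.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            cell = corners[y : y + block, x : x + block]
            if cell.sum() < min_corners:  # too few corners -> feature-poor
                mask[y : y + block, x : x + block] = True
    return mask
```

For example, on an image whose left half is a checkerboard (feature-rich) and whose right half is uniform (feature-less), the mask covers the uniform half, i.e. exactly the region the inpainting stage would repopulate with texture from the object database.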

Original language: English
Title of host publication: Proceedings - VRST 2018
Subtitle of host publication: 24th ACM Symposium on Virtual Reality Software and Technology
Editors: Stephen N. Spencer
Publisher: Association for Computing Machinery
ISBN (Electronic): 9781450360869
DOIs
Publication status: Published - Nov 28, 2018
Event: 24th ACM Symposium on Virtual Reality Software and Technology, VRST 2018 - Tokyo, Japan
Duration: Nov 28, 2018 - Dec 1, 2018

Publication series

Name: Proceedings of the ACM Symposium on Virtual Reality Software and Technology, VRST

Other

Other: 24th ACM Symposium on Virtual Reality Software and Technology, VRST 2018
Country/Territory: Japan
City: Tokyo
Period: 11/28/18 - 12/1/18

All Science Journal Classification (ASJC) codes

  • Software
