Simultaneous estimation of projector and camera poses for multiple oneshot scan using pixel-wise correspondences estimated by U-Nets and GCN

Ryo Furukawa, Michihiro Mikamo, Hiroshi Kawasaki, Ryusuke Sagawa, Shiro Oka, Takahiro Kotachi, Yuki Okamoto, Shinji Tanaka

Research output: Contribution to journal › Article › peer-review

Abstract

Dense and accurate 3D shape acquisition of objects by active-stereo techniques has long been an important and intensively studied research topic. One promising field for active-stereo techniques is medical applications, such as 3D endoscope systems. In such systems, since the sensor is moved dynamically during operation, single-frame shape reconstruction, a.k.a. oneshot scan, is necessary. Oneshot scan still has several open problems, such as low resolution caused by spatial coding, and unstable correspondence estimation between the detected and projected patterns caused by irregular reflection. In this paper, we propose solutions to these problems. To increase the resolution, an accurate and stable interpolation method based on deep neural networks (DNNs) is proposed. Since most patterns used for oneshot scan are periodic, pixel-wise phase estimation can be achieved by detecting repetition in the pattern. A graph convolutional network (GCN), a deep neural network for graphs, is used to solve the correspondence problem. In the experiments, pixel-wise shape reconstruction results, as well as robust correspondence estimation using DNNs and a GCN, are shown. In addition, the effectiveness of the techniques is confirmed by comparing the proposed method with existing methods.
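To make the idea of pixel-wise phase estimation from a periodic pattern concrete, the following is a minimal classical sketch (quadrature demodulation via the Hilbert transform), not the authors' U-Net-based method: for a fringe pattern that repeats horizontally, each pixel's position within the repeating period can be recovered as a wrapped phase value. The function name `rowwise_phase` and the synthetic fringe image are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert


def rowwise_phase(image):
    """Estimate a per-pixel phase for a horizontally periodic pattern.

    Classical quadrature (Hilbert-transform) demodulation, shown only to
    illustrate pixel-wise phase recovery from a periodic signal; the paper
    itself performs this step with U-Nets. Returns phase in (-pi, pi].
    """
    # Remove the per-row DC offset so the analytic signal is well defined.
    img = image - image.mean(axis=1, keepdims=True)
    # Analytic signal along each row: img + i * Hilbert(img).
    analytic = hilbert(img, axis=1)
    return np.angle(analytic)


# Synthetic fringe pattern: 64 rows, 128 columns, period of 16 pixels.
x = np.arange(128)
img = np.tile(0.5 + 0.5 * np.cos(2 * np.pi * x / 16.0), (64, 1))
phase = rowwise_phase(img)  # each pixel's wrapped position in the period
```

In a real structured-light system, this wrapped phase must still be unwrapped and matched against the projected pattern to obtain absolute correspondences, which is the part the abstract addresses with a GCN.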

All Science Journal Classification (ASJC) codes

  • Computational Mechanics
  • Biomedical Engineering
  • Radiology, Nuclear Medicine and Imaging
  • Computer Science Applications
