TY - JOUR
T1 - TIDE
T2 - Temporally Incremental Disparity Estimation via Pattern Flow in Structured Light System
AU - Qiao, Rukun
AU - Kawasaki, Hiroshi
AU - Zha, Hongbin
Publisher Copyright:
© 2022 IEEE.
PY - 2022/4/1
Y1 - 2022/4/1
N2 - We introduce the Temporally Incremental Disparity Estimation Network (TIDE-Net), a learning-based technique for disparity computation in mono-camera structured light systems. In our hardware setting, a static pattern is projected onto a dynamic scene and captured by a monocular camera. Unlike most previous disparity estimation methods, which operate frame-wise, our network acquires disparity maps in a temporally incremental way. Specifically, we exploit the deformation of projected patterns (named pattern flow) across captured image sequences to model temporal information. Notably, this newly proposed pattern flow formulation reflects the disparity changes along the epipolar line and is thus a special form of optical flow. Tailored for pattern flow, TIDE-Net, a recurrent architecture, is proposed and implemented. For each incoming frame, our model fuses correlation volumes (from the current frame) with the disparity of the former frame warped by pattern flow. From the fused features, the final stage of TIDE-Net estimates the residual disparity rather than the full disparity, as done by many previous methods. This design brings clear empirical advantages in efficiency and generalization ability. Trained only on synthetic data, our model shows superior performance, in both accuracy and efficiency metrics, over several state-of-the-art models on unseen real data.
AB - We introduce the Temporally Incremental Disparity Estimation Network (TIDE-Net), a learning-based technique for disparity computation in mono-camera structured light systems. In our hardware setting, a static pattern is projected onto a dynamic scene and captured by a monocular camera. Unlike most previous disparity estimation methods, which operate frame-wise, our network acquires disparity maps in a temporally incremental way. Specifically, we exploit the deformation of projected patterns (named pattern flow) across captured image sequences to model temporal information. Notably, this newly proposed pattern flow formulation reflects the disparity changes along the epipolar line and is thus a special form of optical flow. Tailored for pattern flow, TIDE-Net, a recurrent architecture, is proposed and implemented. For each incoming frame, our model fuses correlation volumes (from the current frame) with the disparity of the former frame warped by pattern flow. From the fused features, the final stage of TIDE-Net estimates the residual disparity rather than the full disparity, as done by many previous methods. This design brings clear empirical advantages in efficiency and generalization ability. Trained only on synthetic data, our model shows superior performance, in both accuracy and efficiency metrics, over several state-of-the-art models on unseen real data.
UR - http://www.scopus.com/inward/record.url?scp=85124712791&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85124712791&partnerID=8YFLogxK
U2 - 10.1109/LRA.2022.3150029
DO - 10.1109/LRA.2022.3150029
M3 - Article
AN - SCOPUS:85124712791
VL - 7
SP - 5111
EP - 5118
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
SN - 2377-3766
IS - 2
ER -