MU-net: Deep learning-based thermal IR image estimation from RGB image

Yumi Iwashita, Kazuto Nakashima, Sir Rafol, Adrian Stoica, Ryo Kurazume

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

Terrain imagery collected by satellite remote sensing or by rover on-board sensors is the primary source for terrain classification, which in turn determines terrain traversability and mission plans for planetary rovers. Mapping models between RGB and IR for terrain classes are learned from real RGB and IR examples of the same or similar terrain. This paper introduces a new class of deep learning architectures called MU-Net (Multiple U-Net) and shows its effectiveness in deriving better RGB-to-IR mapping models, improving over past work on estimating thermal IR images from incoming RGB images and learned RGB-IR mappings.
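The abstract names the architecture but does not reproduce its details here, so the following is only a minimal sketch of the general idea, assuming "Multiple U-Net" means a cascade of U-Net modules in which each stage refines the current thermal IR estimate. The classes MUNetSketch and TinyUNet, the layer widths, and the refinement-by-concatenation scheme are all illustrative assumptions, not the authors' published configuration.

```python
# Illustrative sketch only: assumes MU-Net chains several small U-Nets,
# each refining the RGB-to-thermal-IR estimate. Not the paper's exact model.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    """Two 3x3 convolutions with ReLU, the usual U-Net building block."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    """A single shallow U-Net: one downsampling step, one skip connection."""

    def __init__(self, in_ch, out_ch, base=16):
        super().__init__()
        self.enc = conv_block(in_ch, base)
        self.down = nn.MaxPool2d(2)
        self.mid = conv_block(base, base * 2)
        self.up = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec = conv_block(base * 2, base)
        self.out = nn.Conv2d(base, out_ch, 1)

    def forward(self, x):
        e = self.enc(x)                      # encoder features (kept as skip)
        m = self.mid(self.down(e))           # bottleneck features
        d = self.dec(torch.cat([self.up(m), e], dim=1))
        return self.out(d)


class MUNetSketch(nn.Module):
    """Hypothetical MU-Net: a cascade of U-Nets refining the IR estimate."""

    def __init__(self, num_unets=2):
        super().__init__()
        # First U-Net sees RGB only; later ones see RGB plus the current IR estimate.
        self.unets = nn.ModuleList(
            [TinyUNet(3, 1)] + [TinyUNet(4, 1) for _ in range(num_unets - 1)]
        )

    def forward(self, rgb):
        ir = self.unets[0](rgb)
        for unet in self.unets[1:]:
            ir = unet(torch.cat([rgb, ir], dim=1))  # refine the IR estimate
        return ir


if __name__ == "__main__":
    model = MUNetSketch(num_unets=2)
    rgb = torch.randn(1, 3, 64, 64)   # dummy RGB terrain patch
    ir = model(rgb)                   # predicted single-channel thermal IR image
    print(ir.shape)                   # torch.Size([1, 1, 64, 64])
```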

Original language: English
Title of host publication: Proceedings - 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019
Publisher: IEEE Computer Society
Pages: 1022-1028
Number of pages: 7
ISBN (Electronic): 9781728125060
DOIs
Publication status: Published - Jun 2019
Event: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019 - Long Beach, United States
Duration: Jun 16 2019 - Jun 20 2019

Publication series

Name: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops
Volume: 2019-June
ISSN (Print): 2160-7508
ISSN (Electronic): 2160-7516

Conference

Conference: 32nd IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, CVPRW 2019
Country/Territory: United States
City: Long Beach
Period: 6/16/19 - 6/20/19

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
