TY - JOUR
T1 - Learning geometric and photometric features from panoramic LiDAR scans for outdoor place categorization
AU - Nakashima, Kazuto
AU - Jung, Hojung
AU - Oto, Yuki
AU - Iwashita, Yumi
AU - Kurazume, Ryo
AU - Mozos, Oscar Martinez
N1 - Funding Information:
This work was supported by the Japan Society for the Promotion of Science (JSPS KAKENHI) [grant number JP26249029].
Author Biography:
Yumi Iwashita received her M.S. and Ph.D. degrees from the Graduate School of Information Science and Electrical Engineering, Kyushu University, in 2004 and 2007, respectively. She was a Research Fellow of the Japan Society for the Promotion of Science (JSPS) from 2006 to 2007. In 2007, she was a postdoctoral researcher at Imperial College London under Professor Maria Petrou. From 2007 to 2014, she was an assistant professor at Kyushu University. Between 2011 and 2013, she was a visiting researcher at NASA's Jet Propulsion Laboratory, sponsored by a grant from the JSPS. From 2014 to 2016, she was an associate professor at Kyushu University. Since 2016, she has been a research technologist at the Jet Propulsion Laboratory and a visiting associate professor at Kyushu University. Her current research interests include computer vision for robotics and Intelligence, Surveillance, and Reconnaissance (ISR) applications, as well as biometrics and pattern recognition for security systems.
Publisher Copyright:
© 2018, © 2018 Taylor & Francis and The Robotics Society of Japan.
PY - 2018/1/1
Y1 - 2018/1/1
N2 - Semantic place categorization, one of the essential tasks for autonomous robots and vehicles, enables them to make decisions and navigate on their own in unfamiliar environments. Outdoor places are particularly difficult targets compared with indoor ones because of perceptual variations such as illumination changes over 24 hours and occlusions by cars and pedestrians. This paper presents a novel method for categorizing outdoor places using convolutional neural networks (CNNs) that take omnidirectional depth/reflectance images obtained by 3D LiDARs as inputs. First, we construct a large-scale outdoor place dataset named Multi-modal Panoramic 3D Outdoor (MPO), which comprises two types of point clouds captured by two different LiDARs and labeled with six outdoor place categories: coast, forest, indoor parking, outdoor parking, residential area, and urban area. Second, we present CNNs for LiDAR-based outdoor place categorization and evaluate our approach on the MPO dataset. Our results outperform traditional approaches and demonstrate the effectiveness of using both the depth and reflectance modalities. Finally, we visualize the learned features to analyze the trained deep networks.
UR - http://www.scopus.com/inward/record.url?scp=85051145962&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85051145962&partnerID=8YFLogxK
DO - 10.1080/01691864.2018.1501279
M3 - Article
AN - SCOPUS:85051145962
SN - 0169-1864
VL - 32
SP - 750
EP - 765
JO - Advanced Robotics
JF - Advanced Robotics
IS - 14
ER -
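
The abstract above describes feeding omnidirectional depth/reflectance images, rendered from 3D LiDAR point clouds, into CNNs. The listing below is a minimal sketch of that preprocessing step: a spherical projection of a point cloud with per-point reflectance into a two-channel panoramic image. The image resolution, the vertical field of view, and the function name pointcloud_to_panorama are illustrative assumptions, not details taken from the paper.

# Minimal sketch: project a 3D LiDAR point cloud (x, y, z) with per-point
# reflectance onto a panoramic depth/reflectance image, the kind of 2D input
# the abstract describes. Image size and vertical field of view are assumed
# values for illustration only.
import numpy as np

def pointcloud_to_panorama(points, reflectance, height=64, width=1024,
                           v_fov=(-30.0, 10.0)):
    """Spherical projection of an (N, 3) point cloud into an HxWx2 depth/reflectance image."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    depth = np.linalg.norm(points, axis=1)

    # Azimuth (horizontal) and elevation (vertical) angle of each point.
    azimuth = np.arctan2(y, x)                          # range [-pi, pi]
    elevation = np.arcsin(z / np.maximum(depth, 1e-6))

    # Map angles to pixel coordinates; points outside the vertical FOV are dropped.
    v_min, v_max = np.radians(v_fov)
    cols = ((azimuth + np.pi) / (2.0 * np.pi) * width).astype(int) % width
    rows = ((v_max - elevation) / (v_max - v_min) * height).astype(int)
    valid = (rows >= 0) & (rows < height)

    # Two-channel panoramic image: channel 0 = depth, channel 1 = reflectance.
    image = np.zeros((height, width, 2), dtype=np.float32)
    image[rows[valid], cols[valid], 0] = depth[valid]
    image[rows[valid], cols[valid], 1] = reflectance[valid]
    return image

# Usage example with random points standing in for a real LiDAR scan.
pts = np.random.uniform(-50.0, 50.0, size=(100000, 3))
refl = np.random.uniform(0.0, 1.0, size=100000)
pano = pointcloud_to_panorama(pts, refl)
print(pano.shape)  # (64, 1024, 2)

The two-channel output can then be passed to an ordinary image CNN; the projection resolution and field of view would in practice be chosen to match the specific LiDAR sensor.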