TY - GEN
T1 - Multimodal recording system for collecting facial and postural data in a group meeting
AU - Soneda, Yusuke
AU - Matsuda, Yuki
AU - Arakawa, Yutaka
AU - Yasumoto, Keiichi
PY - 2019/11/19
Y1 - 2019/11/19
N2 - With the spread of active learning and group work, the ability of participants to collaborate and discuss has become more important than before. Although several studies have reported that micro facial expressions and body movements have psychological effects on others during conversation, most of them lack quantitative evaluation, and there are few datasets on group discussion. In this research, we propose a highly reproducible system that helps build datasets of group discussions using multiple devices, such as an omnidirectional (360-degree) camera, an eye tracker, and a motion sensor. Our system operates these devices in a one-stop manner to realize synchronized recording. To confirm its feasibility, we built the proposed system with an omnidirectional camera, four eye trackers, and four motion sensors. Finally, we successfully created a dataset by easily recording eight group meetings with the developed system.
AB - With the spread of active learning and group work, the ability of participants to collaborate and discuss has become more important than before. Although several studies have reported that micro facial expressions and body movements have psychological effects on others during conversation, most of them lack quantitative evaluation, and there are few datasets on group discussion. In this research, we propose a highly reproducible system that helps build datasets of group discussions using multiple devices, such as an omnidirectional (360-degree) camera, an eye tracker, and a motion sensor. Our system operates these devices in a one-stop manner to realize synchronized recording. To confirm its feasibility, we built the proposed system with an omnidirectional camera, four eye trackers, and four motion sensors. Finally, we successfully created a dataset by easily recording eight group meetings with the developed system.
UR - http://www.scopus.com/inward/record.url?scp=85077701698&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077701698&partnerID=8YFLogxK
M3 - Conference contribution
T3 - ICCE 2019 - 27th International Conference on Computers in Education, Proceedings
SP - 466
EP - 471
BT - ICCE 2019 - 27th International Conference on Computers in Education, Proceedings
A2 - Chang, Maiga
A2 - So, Hyo-Jeong
A2 - Wong, Lung-Hsiang
A2 - Yu, Fu-Yun
A2 - Shih, Ju-Ling
A2 - Boticki, Ivica
A2 - Chen, Ming-Puu
A2 - Dewan, Ali
A2 - Haklev, Stian
A2 - Koh, Elizabeth
A2 - Kojiri, Tomoko
A2 - Li, Kuo-Chen
A2 - Sun, Daner
A2 - Wen, Yun
PB - Asia-Pacific Society for Computers in Education
T2 - 27th International Conference on Computers in Education, ICCE 2019
Y2 - 2 December 2019 through 6 December 2019
ER -