Automatic route video summarization based on image analysis for intuitive touristic experience

Yuki Kanaya, Shogo Kawanaka, Hirohiko Suwa, Yutaka Arakawa, Keiichi Yasumoto

Research output: Contribution to journal › Article › peer-review

Abstract

Currently, many tourists search for and watch tourism videos on the Internet when planning a sightseeing tour. To plan a sightseeing route quickly, a shorter playback time of tourism videos is desirable. For this purpose, time-lapse playback would be effective. However, the faster the playback, the lower the viewers' comprehension. In this paper, we propose a novel time-lapse-based video summarization method that avoids substantial loss of the information viewers need to plan a tour route. The proposed method focuses on scene changes in the video: scenes exhibiting a certain level of change compared with preceding scenes are treated as important and played back slowly in the summarized video, while other scenes are fast-forwarded. We investigated the appropriate playback speed for sightseeing videos. A questionnaire showed that a playback speed between ×4 and ×8 was the most effective for viewers to understand the sightseeing information needed for tour route planning. In addition, to evaluate the effectiveness of the proposed method, we conducted experiments with 20 participants using a sightseeing video taken in Kyoto. Comparing the video summarized by our method with one summarized manually (by voting for necessary/unnecessary scenes), our method identified the important scenes with an F-measure of 62.22%.
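The core idea above, marking a scene as important when it differs sufficiently from the preceding one, can be sketched with simple frame differencing. This is a minimal illustration, not the paper's exact algorithm: the difference metric (mean absolute pixel difference) and the threshold value are assumptions chosen for the example.

```python
import numpy as np

def classify_frames(frames, threshold=10.0):
    """Label each frame 'slow' (scene change, kept at normal speed)
    or 'fast' (fast-forwarded), using mean absolute difference to the
    previous frame. Metric and threshold are hypothetical choices."""
    labels = ["fast"]  # the first frame has no predecessor
    for prev, cur in zip(frames, frames[1:]):
        diff = np.abs(cur.astype(float) - prev.astype(float)).mean()
        labels.append("slow" if diff > threshold else "fast")
    return labels

# Synthetic 8x8 grayscale frames: a static shot, then an abrupt change.
static = np.zeros((8, 8), dtype=np.uint8)
changed = np.full((8, 8), 200, dtype=np.uint8)
frames = [static, static, changed, changed]
print(classify_frames(frames))  # only the change frame is labeled 'slow'
```

In a real pipeline the per-frame labels would then drive the playback speed of each segment (e.g. ×1 for "slow" spans, ×4 to ×8 for "fast" spans, matching the range the questionnaire found effective).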

Original language: English
Pages (from-to): 599-610
Number of pages: 12
Journal: Sensors and Materials
Volume: 32
Issue number: 2
DOIs
Publication status: Published - 2020

All Science Journal Classification (ASJC) codes

  • Materials Science(all)
  • Instrumentation

