Real-time foreground segmentation from moving camera based on case-based trajectory classification

Yosuke Nonaka, Atsushi Shimada, Hajime Nagahara, Rin-ichiro Taniguchi

Research output: Contribution to conference › Paper › peer-review

5 Citations (Scopus)

Abstract

Recently, several methods for foreground segmentation from a moving camera have been proposed. A trajectory-based method is one of the typical approaches to segmenting video frames into foreground and background regions. Such a method obtains long-term trajectories from the entire video frame and segments them by learning pixel- or motion-based object features. However, it often requires a large amount of computation and memory to maintain the trajectories. We present a trajectory-based method that aims at real-time foreground segmentation from a moving camera. Unlike conventional methods, we use trajectories that are sparsely obtained from two successive video frames. In addition, our method exploits the spatio-temporal features of trajectories by introducing a case-based approach to improve detection results. We compare our method with previous approaches and show results on challenging video sequences.
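The case-based classification idea described above can be illustrated with a minimal sketch: store labeled spatio-temporal trajectory features as "cases" and classify a new trajectory by nearest-neighbour lookup. This is a hypothetical simplification, not the authors' exact implementation; the `CaseBase` class, the `(x, y, dx, dy)` feature layout, and the labels are assumptions for illustration only.

```python
import math

# Hypothetical sketch of case-based trajectory classification.
# Each case is a spatio-temporal feature (x, y, dx, dy): a point's
# position plus its displacement between two successive frames.

class CaseBase:
    def __init__(self):
        self.cases = []  # list of (feature, label) pairs

    def add_case(self, feature, label):
        """Store a labeled trajectory feature as a case."""
        self.cases.append((feature, label))

    def classify(self, feature):
        """Return the label of the nearest stored case."""
        best_label, best_dist = None, float("inf")
        for f, label in self.cases:
            d = math.dist(f, feature)  # Euclidean distance in feature space
            if d < best_dist:
                best_dist, best_label = d, label
        return best_label

cb = CaseBase()
cb.add_case((10, 10, 0.1, 0.0), "background")  # near-static point
cb.add_case((50, 40, 4.0, 3.0), "foreground")  # fast-moving point
print(cb.classify((48, 42, 3.5, 2.8)))  # → foreground
```

In practice the sparse trajectories could come from a pyramidal Lucas-Kanade tracker over two consecutive frames, and the case database would be updated online as detections are confirmed.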

Original language: English
Pages: 808-812
Number of pages: 5
DOI
Publication status: Published - 2013
Event: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 - Naha, Okinawa, Japan
Duration: Nov 5 2013 → Nov 8 2013

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
