Real-time foreground segmentation from moving camera based on case-based trajectory classification

Yosuke Nonaka, Atsushi Shimada, Hajime Nagahara, Rin-Ichiro Taniguchi

Research output: Contribution to conference › Paper › peer-review

5 Citations (Scopus)

Abstract

Recently, several methods for foreground segmentation from a moving camera have been proposed. Trajectory-based methods are a typical approach to segmenting video frames into foreground and background regions: they obtain long-term trajectories over the entire video frame and segment them by learning pixel- or motion-based object features. However, maintaining such trajectories often requires a large amount of computation and memory. We present a trajectory-based method that aims at real-time foreground segmentation from a moving camera. Unlike conventional methods, we use trajectories obtained sparsely from two successive video frames. In addition, our method exploits the spatio-temporal features of trajectories by introducing a case-based approach, which improves detection results. We compare our method with previous approaches and show results on challenging video sequences.
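The abstract's core idea, classifying sparse two-frame trajectories as foreground or background using a case base of spatio-temporal features, can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: the feature layout (normalized position plus inter-frame displacement) and the nearest-neighbor matching rule are assumptions for the sketch.

```python
import numpy as np

def make_feature(pos, disp, frame_size):
    """Spatio-temporal feature of a short trajectory: normalized image
    position (x, y) plus the displacement (dx, dy) between two successive
    frames. (Assumed feature layout, not taken from the paper.)"""
    h, w = frame_size
    x, y = pos
    dx, dy = disp
    return np.array([x / w, y / h, dx, dy], dtype=np.float64)

class CaseBase:
    """Tiny case-based classifier: store labeled trajectory features and
    label a new trajectory by its nearest stored case (Euclidean distance)."""
    def __init__(self):
        self.features = []   # list of 4-d feature vectors
        self.labels = []     # 1 = foreground, 0 = background

    def add_case(self, feature, label):
        self.features.append(feature)
        self.labels.append(label)

    def classify(self, feature):
        cases = np.stack(self.features)
        dists = np.linalg.norm(cases - feature, axis=1)
        return self.labels[int(np.argmin(dists))]

# Hypothetical usage: background points roughly share the camera's
# ego-motion, while a foreground point moves differently.
cb = CaseBase()
size = (480, 640)  # (height, width)
cb.add_case(make_feature((100, 100), (2.0, 0.0), size), 0)   # background
cb.add_case(make_feature((500, 120), (2.1, 0.1), size), 0)   # background
cb.add_case(make_feature((320, 240), (-5.0, 3.0), size), 1)  # foreground

query = make_feature((310, 250), (-4.5, 2.8), size)
print(cb.classify(query))  # nearest case is the foreground one -> 1
```

In a real pipeline the trajectory endpoints would come from a sparse tracker (e.g., KLT-style optical flow between the two frames), and the case base would be updated online as frames are processed.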

Original language: English
Pages: 808-812
Number of pages: 5
DOIs
Publication status: Published - Jan 1 2013
Event: 2013 2nd IAPR Asian Conference on Pattern Recognition, ACPR 2013 - Naha, Okinawa, Japan
Duration: Nov 5 2013 - Nov 8 2013


All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
