Kitchen scene context based gesture recognition: A contest in ICPR2012

Atsushi Shimada, Kazuaki Kondo, Daisuke Deguchi, Géraldine Morin, Helman Stern

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Citations (Scopus)

Abstract

This paper introduces a new open dataset, the "Actions for Cooking Eggs (ACE) Dataset", and summarizes the results of the contest "Kitchen Scene Context based Gesture Recognition", held in conjunction with ICPR2012. The dataset consists of naturally performed actions in a kitchen environment. Five cooking menus were each performed by five different actors, and the cooking actions were recorded with a Kinect sensor. Both color image sequences and depth image sequences are available, and an action label is given for each frame. To estimate the action label, a recognition method must analyze not only the actor's motion but also scene context such as ingredients and cooking utensils. We compare the submitted algorithms and their results in this paper.
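Since the dataset assigns an action label to every frame, a natural way to compare the submitted methods is frame-wise accuracy over the labeled sequence. The sketch below is illustrative only: the function name, label IDs, and sequences are invented for this example and are not taken from the contest's actual evaluation protocol.

```python
# Hypothetical sketch: frame-wise accuracy for per-frame action labels,
# one plausible way to score recognition results on a dataset like ACE.
# All identifiers and label values here are invented for illustration.

def frame_accuracy(predicted, ground_truth):
    """Fraction of frames whose predicted action label matches the ground truth."""
    if len(predicted) != len(ground_truth):
        raise ValueError("label sequences must cover the same frames")
    correct = sum(p == g for p, g in zip(predicted, ground_truth))
    return correct / len(ground_truth)

# Example: 0 = "none", 1 = "breaking", 2 = "mixing" (illustrative IDs only)
gt   = [1, 1, 1, 2, 2, 0, 0, 0]
pred = [1, 1, 2, 2, 2, 0, 0, 1]
print(frame_accuracy(pred, gt))  # 6 of 8 frames correct -> 0.75
```

A per-class or per-menu breakdown would follow the same pattern, restricting the comparison to frames whose ground-truth label belongs to the class of interest.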

Original language: English
Title of host publication: Advances in Depth Image Analysis and Applications - International Workshop, WDIA 2012, Selected and Invited Papers
Pages: 168-185
Number of pages: 18
Publication status: Published - Sep 5 2013
Event: International Workshop on Advances in Depth Image Analysis and Applications, WDIA 2012 - Tsukuba, Japan
Duration: Nov 11 2012 - Nov 11 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 7854 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: International Workshop on Advances in Depth Image Analysis and Applications, WDIA 2012
Country: Japan
City: Tsukuba
Period: 11/11/12 - 11/11/12

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • Computer Science (all)

