Similar gait action recognition using an inertial sensor

Trung Thanh Ngo, Yasushi Makihara, Hajime Nagahara, Yasuhiro Mukaigawa, Yasushi Yagi

Research output: Contribution to journal › Article

53 Citations (Scopus)

Abstract

This paper tackles the challenging problem of inertial-sensor-based recognition of similar gait action classes (such as walking on flat ground, up/down stairs, and up/down a slope). We address three drawbacks of existing methods for gait actions: action signal segmentation, sensor orientation inconsistency, and the recognition of similar action classes. First, to robustly segment the walking action under drastic changes in factors such as speed, intensity, style, and sensor orientation across participants, we rely on the likelihood of heel strike computed using a scale-space technique. Second, to handle 3D sensor orientation inconsistency when matching signals captured at different sensor orientations, we correct the sensor's tilt before applying an orientation-compensative matching algorithm to resolve the remaining rotation angle. Third, to accurately classify similar actions, we incorporate the interclass relationship into the feature vector used for recognition. In experiments, the proposed algorithms were validated with 460 participants (the largest number in the research field) and five similar gait action classes (namely walking on flat ground, up/down stairs, and up/down a slope), captured by three inertial sensors at different positions (center, left, and right) and orientations on the participant's waist.
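The abstract's second step, correcting the sensor's tilt before orientation-compensative matching, can be illustrated with a generic sketch (the paper's exact formulation is not given here): estimate the gravity direction from the mean accelerometer reading and build the rotation that aligns it with the vertical axis via Rodrigues' formula. The function name and the use of the mean reading as the gravity estimate are illustrative assumptions, not the authors' implementation.

```python
import math

def tilt_rotation(accel_mean):
    """Rotation matrix aligning the gravity estimate (mean accelerometer
    reading) with the vertical axis [0, 0, 1], via Rodrigues' formula.
    Illustrative sketch only; not the paper's exact tilt-correction step."""
    gx, gy, gz = accel_mean
    n = math.sqrt(gx * gx + gy * gy + gz * gz)
    gx, gy, gz = gx / n, gy / n, gz / n
    # Rotation axis = g x z = (gy, -gx, 0); cos(angle) = gz, sin(angle) = |axis|.
    c = gz
    s = math.hypot(gx, gy)
    if s < 1e-12:  # already (anti-)aligned with the vertical axis
        return [[1, 0, 0], [0, 1, 0], [0, 0, math.copysign(1.0, c)]]
    ux, uy = gy / s, -gx / s  # unit rotation axis (z component is 0)
    # Rodrigues: R = c*I + s*[u]x + (1-c)*u*u^T, specialized to uz = 0.
    return [
        [c + ux * ux * (1 - c), ux * uy * (1 - c),      uy * s],
        [ux * uy * (1 - c),     c + uy * uy * (1 - c), -ux * s],
        [-uy * s,               ux * s,                 c],
    ]
```

Applying the returned matrix to the gravity estimate maps it onto the vertical axis, leaving only a rotation about that axis, the "remaining angle" that the matching algorithm then compensates.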

Original language: English
Pages (from-to): 1289-1301
Number of pages: 13
Journal: Pattern Recognition
Volume: 48
Issue number: 4
DOIs
Publication status: Published - Apr 1 2015

All Science Journal Classification (ASJC) codes

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
