EEG-controlled meal assistance robot with camera-based automatic mouth position tracking and mouth open detection

Chamika Janith Perera, Thilina Dulantha Lalitharatne, Kazuo Kiguchi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

9 Citations (Scopus)

Abstract

A meal assistance robot is an assistive device that helps individuals who cannot independently bring food to their mouths. For individuals who have lost upper-limb function due to amputation, spinal cord injury, or cerebral palsy, self-feeding can be impossible; meal assistance robots have been introduced to help such individuals regain their independence. In this paper we propose a meal assistance robot that is controlled by user intentions derived from electroencephalography (EEG) signals and that incorporates camera-based automatic mouth position tracking and mouth-open detection. In the proposed system, users select the solid food item they wish to eat from one of three containers by looking at the corresponding flickering LED matrix. User intentions are identified from the EEG signals using a steady-state visual evoked potential (SSVEP) based intention detection method, and this first stage generates the initial motion commands for scooping food from the selected container. In the second stage, a camera-based mouth position tracking method automatically detects the user's mouth position and moves the spoon, i.e., the end-effector of the meal assistance robot, towards the user's mouth; this method tracks the mouth position irrespective of individual body differences and seating positions. In the final stage, a mouth open/closed recognition method feeds the food when the user indicates the desire to eat by opening the mouth. A set of experiments was carried out with healthy subjects to validate the proposed system, and the results are presented.
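The paper's actual SSVEP classifier and mouth-state recognizer are not reproduced in this record. As a rough, hedged illustration of the two ideas the abstract describes, the sketch below (a) picks the candidate LED flicker frequency with the strongest spectral power in a signal, using the Goertzel algorithm, and (b) classifies mouth open/closed from the ratio of lip gap to mouth width, given four landmark points. The frequencies, the synthetic "EEG" trace, the landmark coordinates, and the 0.5 threshold are all illustrative assumptions, not values from the paper.

```python
import math
import random

def goertzel_power(samples, fs, freq):
    """Signal power at `freq` (Hz) via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * freq / fs)          # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_ssvep_target(eeg, fs, candidate_freqs):
    """Pick the flicker frequency with the strongest power in the signal."""
    return max(candidate_freqs, key=lambda f: goertzel_power(eeg, fs, f))

def mouth_is_open(top_lip, bottom_lip, left_corner, right_corner,
                  threshold=0.5):
    """Classify mouth state from four (x, y) landmark points, assumed to
    come from some face-landmark detector: open if the vertical lip gap
    exceeds `threshold` times the mouth width."""
    gap = math.dist(top_lip, bottom_lip)
    width = math.dist(left_corner, right_corner)
    return gap / width > threshold

# Synthetic 1-second "EEG" trace containing an 8 Hz SSVEP component
random.seed(0)
fs = 250  # sampling rate in Hz
eeg = [math.sin(2 * math.pi * 8 * i / fs) + 0.5 * random.gauss(0, 1)
       for i in range(fs)]

print(detect_ssvep_target(eeg, fs, [6.0, 8.0, 10.0]))           # → 8.0
print(mouth_is_open((0, 0), (0, 30), (-25, 15), (25, 15)))      # → True
print(mouth_is_open((0, 12), (0, 17), (-25, 15), (25, 15)))     # → False
```

In a real SSVEP pipeline the comparison would typically use several harmonics and a method such as canonical correlation analysis across multiple EEG channels; the single-bin power comparison above only conveys the selection-by-flicker-frequency idea.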

Original language: English
Title of host publication: ICRA 2017 - IEEE International Conference on Robotics and Automation
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1760-1765
Number of pages: 6
ISBN (Electronic): 9781509046331
DOIs: https://doi.org/10.1109/ICRA.2017.7989208
Publication status: Published - Jul 21, 2017
Event: 2017 IEEE International Conference on Robotics and Automation, ICRA 2017 - Singapore, Singapore
Duration: May 29, 2017 - Jun 3, 2017

Publication series

Name: Proceedings - IEEE International Conference on Robotics and Automation
ISSN (Print): 1050-4729

Other

Other: 2017 IEEE International Conference on Robotics and Automation, ICRA 2017
Country: Singapore
City: Singapore
Period: 5/29/17 - 6/3/17

Fingerprint

  • Electroencephalography
  • Cameras
  • Robots
  • Containers
  • Flickering
  • Bioelectric potentials
  • End effectors
  • Light emitting diodes
  • Experiments

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

Cite this

Perera, C. J., Lalitharatne, T. D., & Kiguchi, K. (2017). EEG-controlled meal assistance robot with camera-based automatic mouth position tracking and mouth open detection. In ICRA 2017 - IEEE International Conference on Robotics and Automation (pp. 1760-1765). [7989208] (Proceedings - IEEE International Conference on Robotics and Automation). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICRA.2017.7989208
