Hand Gesture Classification for Human-Robot Interaction in Rock-Paper-Scissors Game

Description/Abstract/Artist Statement

Human-Robot Interaction (HRI) is a field dedicated to understanding, designing, and evaluating robotic systems for use by or with humans. The NAO robot, a humanoid robot developed by SoftBank Robotics, has emerged as a powerful platform for HRI and has become a standard in education and research. NAO has 25 degrees of freedom that enable it to move and perform actions, and it is equipped with multiple sensors for perceiving its environment, including four directional microphones, as well as speakers for interacting with humans. Among these sensors, two 2D video cameras enable an important component of an HRI system: the robot's perception of human actions so that it can respond appropriately. In this project, we demonstrate shared perception and action in HRI between a human actor and the NAO humanoid robot via a rock-paper-scissors game. Python is used to program the robot to perform animations and speak to the human actor. An in-house database of rock-paper-scissors hand gestures is collected, cleaned, and curated to serve as ground truth for the gesture recognition task. Transfer learning is used to fine-tune a deep neural network model, pre-trained on hand gesture classification, to recognize rock-paper-scissors hand gestures. The platform is aimed at K-12 outreach, allowing students to grow their interest in STEM and their understanding of engineering principles. This project has the potential for broader impacts in facilitating human-robot collaboration and in developing HRI systems for children.
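As an illustration of the robot-control step described above, the sketch below uses the NAOqi Python SDK's ALAnimatedSpeech proxy to have NAO speak and gesture during a round of the game. The robot IP address, the animation tag, and the dialogue are placeholders; the exact modules and phrases used in this project may differ.

```python
# Hypothetical sketch of driving NAO's speech and animation with the NAOqi Python SDK.
# "<robot-ip>" and the animation path are placeholders, not the project's actual values.
from naoqi import ALProxy

speech = ALProxy("ALAnimatedSpeech", "<robot-ip>", 9559)

# Annotated text: the ^start(...) tag launches a built-in animation while NAO speaks.
speech.say("Let's play rock, paper, scissors! "
           "^start(animations/Stand/Gestures/Enthusiastic_4) "
           "Rock... paper... scissors... shoot!")
```

The gesture-recognition step is sketched below under an assumed Keras workflow. The abstract does not name the pre-trained network, framework, or training settings, so the ImageNet-pretrained MobileNetV2 backbone, the directory layout, and all hyperparameters here are illustrative assumptions rather than the project's actual configuration.

```python
# Hypothetical transfer-learning sketch: fine-tune an assumed MobileNetV2 backbone
# on a 3-class rock-paper-scissors image dataset (directory layout is assumed).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 3            # rock, paper, scissors
IMG_SIZE = (224, 224)

# Curated in-house gesture images, assumed to be stored as
# data/train/{rock,paper,scissors} and data/val/{rock,paper,scissors}.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "data/val", image_size=IMG_SIZE, batch_size=32)

# Pre-trained backbone, frozen so that only the new classification head trains first.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False

inputs = layers.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
x = layers.Dropout(0.2)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
model = models.Model(inputs, outputs)

model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)

# Fine-tuning pass: unfreeze the top of the backbone at a much lower learning rate.
base.trainable = True
for layer in base.layers[:-30]:
    layer.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=5)
```

At game time, a frame captured from one of NAO's cameras would be resized to the model's input size and passed through model.predict so the robot can choose its response to the human player's gesture.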

Presenting Author Name/s

Elija Bullock

Faculty Advisor/Mentor

Khan M. Iftekharuddin

College Affiliation

College of Engineering & Technology (Batten)

Presentation Type

Poster

Disciplines

Computer Engineering | Robotics

Session Title

Poster Session

Location

Learning Commons @ Perry Library

Start Date

3-19-2022 9:00 AM

End Date

3-19-2022 11:00 AM
