On field gesture-based human-robot interface for emergency responders

Oliva G.; Setola R.
2013-01-01

Abstract

Coordination and control of rescue robots is a hard task, especially in harsh and hazardous environments that potentially limit both the mobility and endurance of the robots and the safety of human interaction. In this paper we present a fast, flexible, and cost-effective framework for human-robot interaction, specifically designed for on-field interaction between human operators and robots, using a low-cost Microsoft Kinect camera as input. The proposed architecture is based on a quick and cost-effective gesture recognition algorithm developed in LabVIEW and integrated into a ROS-based communication framework that decouples the mobile robot from the user terminal. The proposed architecture has been tested under simulated harsh conditions, including darkness, smoke, crowds, and users wearing firefighter uniforms.
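As a concrete illustration of the decoupling described in the abstract, the sketch below shows a hypothetical robot-side ROS node in Python. This is an assumption-laden stand-in, not the paper's implementation: the actual recognizer runs in LabVIEW on the user terminal, and the topic names (/gesture, /cmd_vel), the gesture vocabulary, and the velocity values are all illustrative. What it demonstrates is the architectural point: as long as the terminal publishes gesture labels on an agreed topic, the robot and the terminal can be developed, restarted, or replaced independently.

    #!/usr/bin/env python
    # Hypothetical robot-side node (sketch). We only assume the user-terminal
    # recognizer publishes gesture labels as std_msgs/String on /gesture;
    # the topic name and gesture set are illustrative, not from the paper.
    import rospy
    from std_msgs.msg import String
    from geometry_msgs.msg import Twist

    # Illustrative mapping from gesture label to (linear, angular) velocity.
    GESTURE_TO_CMD = {
        "forward":    (0.3, 0.0),
        "backward":   (-0.3, 0.0),
        "turn_left":  (0.0, 0.5),
        "turn_right": (0.0, -0.5),
        "stop":       (0.0, 0.0),
    }

    def on_gesture(msg):
        # Unknown labels default to a stop command for safety.
        linear, angular = GESTURE_TO_CMD.get(msg.data, (0.0, 0.0))
        cmd = Twist()
        cmd.linear.x = linear
        cmd.angular.z = angular
        cmd_pub.publish(cmd)

    rospy.init_node("gesture_teleop")
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rospy.Subscriber("/gesture", String, on_gesture)
    rospy.spin()

In a scheme like this, swapping the Kinect-based recognizer for any other input device requires no change on the robot side, only a publisher of the same message type on the same topic.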
Year: 2013
ISBN: 978-1-4799-0879-0
Keywords: Depth Sensor; Harsh Environment; Human-Robot Interface

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.12610/16574
Citations
  • PMC: n/a
  • Scopus: 4
  • Web of Science (ISI): 0