Cyrillic manual alphabet recognition in RGB and RGB-D data for sign language interpreting robotic system (SLIRS)

Link:
Author:
Year of publication:
2017
Media type:
Text
Keywords:
  • Gesture recognition
  • Human computer interaction
  • Dynamic hand
  • Algorithms
  • Computer Vision
  • Models
Description:
  • Deaf-mute communities around the world need an effective human-robot interaction system that can act as an interpreter in public places such as banks, hospitals, or police stations. This work addresses the challenges faced by hearing-impaired people by developing an interpreting robotic system for effective communication in public places. To this end, we utilize a previously developed neural network-based learning architecture to recognize the Cyrillic manual alphabet, which is used for fingerspelling in Kazakhstan. To train and test the recognition system, we collected four datasets comprising static and motion RGB and RGB-D data of 33 manual gestures. After applying them to standard machine learning algorithms as well as to our previously developed learning-based method, we achieved an average accuracy of 93% for complete alphabet recognition by modeling motion depth data.
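
The record does not specify which standard machine learning algorithms served as baselines. As a minimal sketch of such a baseline, the Python snippet below trains an RBF-kernel SVM to classify static depth frames into the 33 manual gestures; the frame resolution, the SVM choice, and the synthetic data standing in for the collected RGB-D datasets are all assumptions for illustration, not the paper's method.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

N_GESTURES = 33          # one class per Cyrillic manual-alphabet letter
FRAME_SHAPE = (64, 64)   # assumed downsampled depth-frame resolution

rng = np.random.default_rng(0)
# Stand-in for preprocessed depth frames: (samples, H*W) feature matrix
# with 30 synthetic samples per gesture class.
X = rng.random((N_GESTURES * 30, FRAME_SHAPE[0] * FRAME_SHAPE[1]))
y = np.repeat(np.arange(N_GESTURES), 30)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# RBF-kernel SVM with feature standardization, a common
# static-gesture classification baseline.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)
print(f"test accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2%}")

On real data, the feature matrix would come from segmented, depth-normalized hand crops rather than random values; the motion data mentioned in the abstract would additionally require a temporal model rather than this per-frame classifier.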
License:
  • info:eu-repo/semantics/restrictedAccess
Source system:
Research information system of the UHH

Internal metadata
Source record
oai:www.edit.fis.uni-hamburg.de:publications/19916788-9977-4f69-a025-deaa9a06e4de