Robot helps Deaf-Blind People Feel Sign Language

Northeastern University. Photo by Alyssa Stone/Northeastern University

American Sign Language (ASL) is a powerful tool for people who are deaf, but what if you can't see the hands doing the signing? Now there is a tactile version of ASL: Protactile ASL communicates entirely through touch.

Helen Keller learned to communicate when her teacher, Anne Sullivan, spelled words into her hand using finger spelling, in which each letter has its own sign. Finger spelling remains one of the only ways deaf-blind people can hold a conversation, but it requires an interpreter, leaving the deaf-blind person dependent on someone who knows how to communicate this way.

Samantha Johnson, a bioengineering graduate student at Northeastern University, has created a robotic arm that can produce tactile sign language, helping people who are both deaf and blind become more independent. The goal of the device is to translate text into signs that the user can feel.

Read the complete article here.