Tactile aids have been around for decades, but MIT's Sensory Communication Group is looking to improve them by creating software that will offer cues to lip-readers. These tactile devices are designed to work with smartphones by translating sound waves into vibrations that can be felt by the skin. Users can learn to distinguish between the vibration patterns associated with different sound frequencies, such as the subtle difference between a spoken "p" and "b".
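To give a rough sense of the kind of mapping such software might perform, here is a minimal sketch in Python. It is purely illustrative: the source does not describe MIT's actual processing pipeline, and the frequency bands, intensity scale, and function names below are all assumptions. The idea is simply to split a short audio frame into frequency bands and drive a vibration motor's intensity from each band's energy.

```python
import numpy as np


def band_energies(samples, sample_rate, bands):
    """Return the signal energy in each frequency band of one audio frame.

    samples: 1-D numpy array holding a short frame of audio.
    bands:   list of (low_hz, high_hz) tuples, one per vibration channel.
    """
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]


def to_vibration_levels(energies, max_energy):
    """Map band energies to 0-255 motor intensities (a hypothetical scale)."""
    return [min(255, int(255 * e / max_energy)) for e in energies]


if __name__ == "__main__":
    rate = 16000
    t = np.arange(0, 0.02, 1.0 / rate)       # one 20 ms audio frame
    frame = np.sin(2 * np.pi * 300 * t)      # low-frequency test tone
    # Two illustrative channels: a low band (vocal-fold voicing, as in "b")
    # and a high band (burst noise, as in "p"). The real device's bands
    # are not described in the article.
    bands = [(50, 500), (2000, 6000)]
    energies = band_energies(frame, rate, bands)
    print(to_vibration_levels(energies, max_energy=max(energies) or 1.0))
```

Run against a voiced tone like the one above, the low band lights up while the high band stays near zero; an unvoiced burst would do the reverse, which is the kind of contrast a wearer might learn to feel.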
The idea for this project came from a communication method used by deaf-blind people, known as Tadoma, in which the deaf-blind person places a hand on a speaker's face while the speaker is talking, feeling the speech vibrations through the face and neck.
The National Institute on Deafness and Other Communication Disorders is funding the MIT research.