Glove-TalkII - An Adaptive Interface that Maps Hand Gestures to Speech

Glove-TalkII is a system that translates hand gestures to speech through an
adaptive interface. Hand gestures are mapped continuously to ten control
parameters of a parallel formant speech synthesizer. The mapping allows the
hand to act as an artificial vocal tract that produces speech in real time.
This gives an unlimited vocabulary in addition to direct control of
fundamental frequency and volume. Currently, the best version of Glove-TalkII
uses several input devices (including a Cyberglove, a ContactGlove, a
three-space tracker, and a foot pedal), a parallel formant speech synthesizer,
and three neural networks. One subject has trained to speak intelligibly with
Glove-TalkII. He speaks slowly but with far more natural-sounding pitch
variations than a text-to-speech synthesizer.
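The core idea, mapping a continuous stream of glove sensor readings to
synthesizer control parameters with a neural network, can be sketched as
follows. This is a minimal illustration, not the actual Glove-TalkII
implementation: the sensor count, network sizes, and weights are all assumed
(the real system uses three separately trained networks and additional input
devices).

```python
import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 18   # assumed number of glove flex/position sensors
N_HIDDEN = 12    # illustrative hidden-layer size
N_PARAMS = 10    # the ten formant-synthesizer control parameters

# Randomly initialized weights stand in for trained ones.
W1 = rng.normal(scale=0.1, size=(N_HIDDEN, N_SENSORS))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.1, size=(N_PARAMS, N_HIDDEN))
b2 = np.zeros(N_PARAMS)

def gesture_to_params(sensors: np.ndarray) -> np.ndarray:
    """Map one frame of sensor readings to synthesizer parameters in (0, 1)."""
    h = np.tanh(W1 @ sensors + b1)                # hidden layer
    return 1.0 / (1.0 + np.exp(-(W2 @ h + b2)))  # sigmoid bounds the outputs

# One frame of made-up sensor data; a real-time system would run this
# mapping many times per second as the hand moves.
frame = rng.uniform(0.0, 1.0, N_SENSORS)
params = gesture_to_params(frame)
print(params.shape)  # (10,)
```

Because the mapping is a smooth function of the sensor values, small hand
movements produce small, continuous changes in the synthesizer parameters,
which is what lets the hand behave like an artificial vocal tract.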

Links:
Glove-TalkII Website
http://hct.ece.ubc.ca/research/glovetalk2/index.html

Machine Gesture and Sign Language Recognition
http://www.cse.unsw.edu.au/~waleed/gsl-rec/

Pausch and Davidson's CANDY system
http://www.cse.unsw.edu.au/~waleed/thesis/node48.html

Glove-Talk: A neural network interface between a data-glove and a speech
synthesizer (1993)
http://citeseer.ist.psu.edu/fels93glovetalk.html
