Sign language animatronics still in its infancy
Published 16 March 2017 by Annick Rivoire
Turning speech into sign language with 3D-printed hands: that is the mission of Signbot, a prototype designed during a hackathon at McGill University in Quebec.
Signbot, with its puppet-theater setup, is a prototype built in only 24 hours during the McHacks hackathon by four computer science students: Alex Foley, Clive Chan, Wilson Wu and Colin Daly, all from the University of Waterloo in Ontario.
Signbot, final demo:
From gesture to speech
As a first step, Signbot relies on hand telepresence, a robotics classic in which a machine imitates the operator's gestures, with the help of a Kinect or, in this case, a Leap Motion sensor paired with two mechanized hands. But at that stage the operator still had to know sign language. So the students built a small library of ASL (American Sign Language) signs and added voice recognition. In front of the jury, one of Signbot's hands signed the “hello” it had just been asked for: a proof of concept (POC), and a victory to boot.
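To make the idea concrete, here is a minimal sketch of that speech-to-sign pipeline in Node.js; it is an illustration, not the students' code. A recognized word is looked up in a tiny sign library and played back as a sequence of servo poses. The johnny-five library, the pin numbers, the pose values and the recognizeSpeech() stub (standing in for the Nuance devkit discussed below) are all assumptions.

```js
// Hedged sketch: map a recognized word to a sequence of servo poses.
const five = require("johnny-five");

// Tiny "sign library": each entry is a sequence of poses
// (servo angle in degrees per finger), held briefly in turn.
// The angles here are illustrative placeholders.
const SIGNS = {
  hello: [
    { thumb: 10, index: 10, middle: 10, ring: 10, pinky: 10 }, // open hand
    { thumb: 10, index: 40, middle: 40, ring: 40, pinky: 40 }, // slight wave
  ],
};

const board = new five.Board(); // Arduino running Firmata, auto-detected port

board.on("ready", () => {
  // One servo per finger; the pin assignments are hypothetical.
  const servos = {
    thumb: new five.Servo(3),
    index: new five.Servo(5),
    middle: new five.Servo(6),
    ring: new five.Servo(9),
    pinky: new five.Servo(10),
  };

  // Play back the pose sequence for one recognized word.
  function playSign(word) {
    const poses = SIGNS[word];
    if (!poses) return; // word not in the sign library
    poses.forEach((pose, i) => {
      setTimeout(() => {
        for (const joint of Object.keys(pose)) servos[joint].to(pose[joint]);
      }, i * 500); // hold each pose for 500 ms
    });
  }

  // Hypothetical stub: in the prototype, this role was filled by
  // Nuance's voice recognition devkit.
  function recognizeSpeech(onWord) {
    onWord("hello");
  }

  recognizeSpeech((word) => playSign(word.toLowerCase()));
});
```

The students' actual sign data and playback logic live in the Signbot GitHub repository linked at the end of the article.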
Two left hands
To achieve their objectives, the four students had to juggle many disciplines. The project, submitted on GitHub, was written up on Devpost and Hackaday, but it was on Medium that Alex Foley told the story of their challenges. First came the hands, with their fourteen servomotors and nylon lines…and a mistake that nearly cost them the victory: Clive Chan, in charge of printing the phalanges, ended up with two left hands! The right hand saw the light of day at the very last moment, a little more fragile than its twin. Alex Foley then had to wrestle with the Node.js software platform to relay the movements captured through the Leap Motion API to the servomotors, and realized near the end that the two Arduino Megas did not support the required number of servomotors.
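As an illustration of that plumbing, here is a minimal telepresence sketch under stated assumptions: the leapjs package delivers tracking frames from the Leap Motion, johnny-five drives the servos over Firmata, and a single board with placeholder pins stands in for the project's two Arduino Megas. The finger-curl estimate shows the kind of trigonometry such a mapping calls for.

```js
// Hedged sketch: mirror tracked finger curl onto hobby servos.
const Leap = require("leapjs");
const five = require("johnny-five");

// Angle between two 3-vectors, in radians.
function angleBetween(a, b) {
  const dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
  const norm = (v) => Math.hypot(v[0], v[1], v[2]);
  return Math.acos(Math.min(1, Math.max(-1, dot / (norm(a) * norm(b)))));
}

const board = new five.Board(); // Arduino running Firmata

board.on("ready", () => {
  // One servo per finger; the pins are hypothetical placeholders.
  const servos = [3, 5, 6, 9, 10].map((pin) => new five.Servo(pin));

  // leapjs calls this for every tracking frame from the sensor.
  Leap.loop((frame) => {
    const hand = frame.hands[0];
    if (!hand) return; // no hand over the sensor
    hand.fingers.forEach((finger, i) => {
      // Curl estimate: angle between the finger's direction and the
      // hand's direction, mapped from [0, pi/2] radians to [0, 180] degrees.
      const curl = angleBetween(finger.direction, hand.direction);
      const degrees = Math.min(180, (curl / (Math.PI / 2)) * 180);
      servos[i].to(Math.round(degrees));
    });
  });
});
```

In the actual prototype each finger joint had its own servo, fourteen in all, so the mapping would run per joint rather than per finger and be split across two boards.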
Signbot and the Leap Motion API, demo:
The designers are rather proud of having overcome the difficulties of voice recognition thanks to Nuance's voice recognition development kit (devkit), which they discovered on that occasion and wrangled with plenty of trigonometric calculations for sound processing. During the presentation, an introduction borrowed from the World Federation of the Deaf helped convince the jury of Signbot's merits: “Sign language is the first language for 70 million people.” But what would those most directly concerned make of it?
The sign library they were missing
We contacted Sj Rideaf, a tireless French advocate of bringing deaf and hearing people closer together and a campaigner for accessibility in fablabs. He refrained from giving his opinion on the prototype's impact, but recalled the merits of his own sign library, winner of a 2016 social innovation prize, which compiles sign language in open source in the form of digital vectors. The University of Waterloo students confirmed that this library would have been a great help had they known about it. They also acknowledge that the concept would need to be miniaturized for home use and, of course, that the animatronic's vocabulary would have to grow.
In the golden age of sensors, Signbot is not the only prototype in the sign language niche. In April 2016, two students from the University of Washington also won a prize with the opposite concept, SignAloud, which goes from signs to speech thanks to sensor-equipped gloves. In 2014, MotionSavvy designed Uni, an application that steps on Signbot's toes with its bidirectional translation on a tablet.
The MotionSavvy Uni application, demo (2014):
The maker students behind Signbot have not yet been contacted by deaf associations. Their prototype remains an excellent hackathon performance. Could it have a future if it were built into a robot able to talk with the 70 million deaf and hard-of-hearing people? According to the students, “it would require a lot of work but it would be really cool to see that.” A feature that would most certainly appeal to C-3PO, the not-so-resourceful translator droid from Star Wars.
The Signbot GitHub