Ever wondered if robotics can help in our everyday lives the way it has in industrial applications? It already does, in a whole lot of ways, especially for people who need special assistance, such as those who are blind or otherwise disabled. Now a team of engineers from the University of Antwerp in Belgium has designed a 3D printed arm that addresses the needs of the deaf and hard of hearing: it serves as a sign language translator.

As we know, sign language interpreters are not always available when deaf people need them, whether in court, in a classroom or at home. Since interpreters are in short supply, this research team has taken it upon itself to build a low-cost automated system that can translate text into sign language. Erwin Smet, a robotics teacher, says, “A deaf person who needs to appear in court, a deaf person following a lesson in a classroom somewhere. These are all circumstances where a deaf person needs a sign language interpreter, but where often such an interpreter is not readily available. This is where a low-cost option can offer a solution.”

Stijn Huys, another engineer on the project, added, “I was talking to friends about the shortage of sign language interpreters in Belgium, especially in Flanders for the Flemish sign language. We wanted to do something about it. I also wanted to work on robotics for my master’s, so we combined the two.”

It’s important to note that there have been several technological attempts to bridge the gap between the hearing and deaf communities, including tablet-like devices that translate hand gestures into audio or text, smart gloves and more.

The team’s solution, Project ASLAN (Antwerp’s Sign Language Actuating Node), is designed to perform sign language letters and numbers. In other words, it translates spoken words or text into sign language. In its present form, the robotic arm is connected to a computer, which is in turn linked to a network. Users simply connect to the local network and send text messages to ASLAN; once a message arrives, the hand renders it as signs. At the moment, the robotic arm works with an alphabet system known as fingerspelling, which communicates each letter of a word through a separate gesture.
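The article doesn’t document how messages are actually delivered to the arm, but the workflow is easy to picture. Below is a minimal sketch of a client, assuming (purely for illustration) that ASLAN’s host computer listens on a plain TCP socket at a known address on the local network and accepts UTF-8 text; the host, port and protocol here are hypothetical, not the project’s real interface.

```python
import socket

# Purely illustrative: the article does not specify ASLAN's network
# protocol. Assume the arm's host computer listens on a plain TCP
# socket and fingerspells whatever UTF-8 text it receives.
ASLAN_HOST = "192.168.1.50"  # hypothetical address on the local network
ASLAN_PORT = 5005            # hypothetical port

def send_to_aslan(message: str) -> None:
    """Send a text message for the robotic hand to fingerspell."""
    with socket.create_connection((ASLAN_HOST, ASLAN_PORT)) as conn:
        conn.sendall(message.upper().encode("utf-8"))

send_to_aslan("HELLO")
```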

For the most part, the robotic hand is made up of 25 3D-printed parts coupled with 16 servo motors, an Arduino Due microcontroller board, three motor controllers and a few other electronic components. ASLAN is taught gestures with the help of a special glove, and the team is working towards recognizing them via webcam as well.
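To make the fingerspelling idea concrete, a gesture on hardware like this boils down to one target angle per servo. The sketch below shows one way a host computer might stream such poses to the Arduino Due over USB serial; the one-byte-per-servo frame format and the pose values are invented for illustration, and the project’s real firmware, command format and glove-derived angles are not described in the article.

```python
import time
import serial  # pyserial; the board enumerates as a USB serial port

# Illustrative only: a fingerspelling pose modeled as 16 servo angles
# (degrees, 0-180), one per motor. The byte-per-servo frame is an
# assumption, not ASLAN's actual protocol.
FINGERSPELL = {
    # Hypothetical poses; real ones would come from the teaching glove.
    "A": [170, 160, 160, 160, 90] + [0] * 11,
    "B": [10, 10, 10, 10, 120] + [0] * 11,
}

def send_pose(port: serial.Serial, angles: list[int]) -> None:
    """Write one frame of 16 angle bytes for the firmware to execute."""
    assert len(angles) == 16 and all(0 <= a <= 180 for a in angles)
    port.write(bytes(angles))

def fingerspell(port: serial.Serial, word: str) -> None:
    for letter in word.upper():
        send_pose(port, FINGERSPELL[letter])
        time.sleep(1.0)  # hold each pose long enough to be readable

with serial.Serial("/dev/ttyACM0", 115200, timeout=1) as due:
    fingerspell(due, "AB")
```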

At this point, the team has developed just one hand, which means that two-handed gestures, as well as the facial expressions that add nuance to sign language, are not possible yet. The good news is that the team is looking to create a second, coordinating arm and an emotive robotic face to enrich the entire process.

To be clear, this robotic arm wasn’t created to make human interpreters irrelevant; it can’t fully replicate their capabilities. It’s designed to ensure that there’s always an alternative when sign language interpreters are unavailable to offer their services. It can also come in handy for teaching sign language, since a robot is very unlikely to get tired of repeating a hand gesture until you’ve learned it.

Project ASLAN pursues a commendable goal, and a physical arm could well work better than a virtual hand on a computer screen. It takes an assistive technology off the screen and places it in the real world: the robot can be viewed from a variety of angles and, through its physical presence, emulates a real hand. Moreover, the team is planning to make the design open source so it can be used by anyone and everyone.