Last week I looked into how American Sign Language can be used in coding. This, in combination with our in-class discussions, got me wondering about how artificial intelligence has incorporated ASL. I will admit that I had big dreams but low hopes when I set out to learn more. I was hoping there was an AI application out there that lets people practice communicating in ASL. I would really like more conversational practice before I attempt to sign with someone who uses ASL as their first language. Based on the videos I have seen, I am not quite skilled enough to understand someone signing in ASL at a fast pace. However, I did not really believe that an application like this existed yet.
I started by simply typing “AI ASL” into Google, and wow was I surprised by what I found. The first website that appeared is called Signapse, and it is one of the coolest things that I have seen on my journey so far. It is an AI sign language interpreter for both American Sign Language and British Sign Language! It can interpret webpages and videos. It also seems that transportation companies can partner with Signapse to make their webpages and public spaces more accessible by translating important information and announcements. I really love the idea of this technology! First and foremost, as I have said in many of my previous posts, sign language is a language separate from English, so writing something down does not automatically make it accessible to people who are deaf or hard of hearing. This technology removes the need to rely on captions or written webpages in order to engage with the online community. I can also see it being incredibly helpful in a classroom environment. There is often a lot of reading to be done over the course of the year, so it could be life-changing for students who use sign language to be able to access class material in their own language. I can imagine that many of these students would feel that they are truly being included in the classroom.
So far it does not appear to be able to interpret conversations between a person who is deaf or hard of hearing and a person who is hearing. That said, I really think this technology could develop that capability in the next couple of years. It would make the world so much more accessible for people who are part of the deaf or hard of hearing community. I could also see it, further in the future, allowing for greater inclusion of students who use sign to communicate in the classroom. I know there are schools that use sign language as the dominant form of communication, and there are students and families who would prefer to attend them. However, there are families that cannot make the move to one of these schools. With this technology, these students could more easily be enrolled in an English-language school.
If you or someone you know is part of the deaf or hard of hearing community, please ask them about this site. I would love to know what someone who knows more about sign language than I do thinks about this technology!