
Prototype AR App Translates Sign Language In Real Time

AR continues to break down the language barrier as these NYU students introduce a new form of augmented sign language translation.

 

If the last decade of sci-fi movies has taught us anything, it’s that the near future is going to be full of language-translating earbuds, implants, and other forms of high-tech speech conversion.

 

And while we may still be a couple of years off from hands-free, zero-effort translation, three NYU students have used current smartphone technology to bring real-time translation to one of the most overlooked forms of communication: sign language.

 

Video: Verizon Connected Futures III – SOCIAL VR_AR – ARSL Presentation Final (YouTube)

 

Heng Li, Jacky Chen, and Mingfei Huang, a trio of computer science students in the NYU Tandon School of Engineering’s “Connected Futures Prototyping and Talent Development” program, developed the ASLR app. The app combines computer vision with AR to capture specific sign language hand gestures performed in front of the camera and provide a real-time translation in the user’s native language.
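The team hasn’t published implementation details, but a common way to prototype this kind of camera-to-text pipeline is to detect hand landmarks in each frame and feed them to a gesture classifier. Below is a minimal sketch of that general approach, assuming MediaPipe Hands for landmark detection; the predict_sign classifier is a hypothetical stand-in, not the ASLR team’s actual model.

```python
# Hypothetical gesture-recognition loop: MediaPipe Hands extracts 21
# hand landmarks per frame, and a placeholder classifier maps them to
# a sign label. Illustrative only -- not the ASLR app's implementation.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

def predict_sign(landmarks):
    """Placeholder: map 21 (x, y, z) hand landmarks to a sign label."""
    ...  # e.g., a small neural network trained on labeled sign gestures

cap = cv2.VideoCapture(0)  # camera feed (webcam standing in for a phone)
with mp_hands.Hands(max_num_hands=2, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                coords = [(lm.x, lm.y, lm.z) for lm in hand.landmark]
                # In the AR app, this label would be rendered as
                # translated text overlaid on the live camera view
                print(predict_sign(coords))
cap.release()
```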

 

The prototype can also convert spoken words into sign language, recording audio through a smartphone’s microphone and displaying a detailed animated image of the corresponding hand gestures on screen.
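The article doesn’t describe how this reverse direction works internally, but one plausible sketch is speech-to-text followed by a lookup from words to pre-built sign animations. The example below assumes the SpeechRecognition package for transcription; the sign_animations table and play_animation renderer are hypothetical placeholders.

```python
# Hypothetical speech-to-sign sketch: transcribe microphone audio, then
# map recognized words to (illustrative) hand-gesture animation clips.
import speech_recognition as sr

# Hypothetical lookup from words to sign-animation clips
sign_animations = {"hello": "hello.anim", "thanks": "thanks.anim"}

def play_animation(clip):
    print(f"rendering {clip}")  # stand-in for the AR animation overlay

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    audio = recognizer.listen(source)  # record from the microphone

try:
    text = recognizer.recognize_google(audio)  # speech-to-text
    for word in text.lower().split():
        clip = sign_animations.get(word)
        if clip:
            play_animation(clip)  # display the animated hand gesture
except sr.UnknownValueError:
    print("Speech was not recognized")
```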

“Although we know that we just explored the tip of iceberg for this long time problem for the global sign community, we would like to continue to interview our end users for their insights,” Li told Next Reality. “[We would also like to] interview experts in the field to discover what other emerging technologies and techniques can help on top of computer vision.”

 

Image by Hideaki Heng Lee/YouTube

 

The program behind Li, Chen, and Huang’s project, developed in partnership with NYC Media Lab and Verizon, will invest in over a dozen virtual reality, augmented reality, and artificial intelligence projects this year alone. These range from an AR app that assists with hardware issues and provides technical references to VR training exercises for people with social anxiety disorder.


“We make magic when we pair leading students with outstanding mentors in the Envrmnt team at our AR/VR lab,” said Christian Egeler, Director of XR Product Development for Envrmnt, Verizon’s platform for Extended Reality solutions, in a statement. “We discover the next generation of talent when we engage them in leading edge projects in real time, building the technologies of tomorrow.”

 

“NYC Media Lab is grateful for the opportunity to connect Verizon with technically and creatively talented faculty and students across NYC’s universities,” stated Justin Hendrix, Executive Director of the NYC Media Lab. “We are thrilled to continue to advance prototyping in virtual and augmented reality and artificial intelligence. These themes continue to be key areas of focus for NYC Media Lab, especially with the development of the first publicly funded VR/AR Center, which the Lab is developing in conjunction with NYU Tandon School of Engineering.”

 

No word yet on whether we’ll be seeing ASLR on Google Play or the App Store anytime soon, but the team has confirmed plans to pursue a commercial release at some point.

 

*This article was originally written and published on VRScout.

Erick Tran