
Opening Planetarium Learning to the Deaf

BYU Professor Mike Jones and his students have developed a system to project the sign language narration of planetarium shows onto several types of glasses — including Google Glass.

Ordinarily, deaf students are left in the dark when they visit a planetarium.

With the lights off, they can’t see the ASL interpreter who narrates their tour of outer space. With the lights on, they can’t see the constellations of stars projected overhead.

That’s why a group at Brigham Young University launched the “Signglasses” project. Professor Mike Jones and his students have developed a system to project the sign language narration onto several types of glasses — including Google Glass.

The project is personal for Tyler Foulger and a few other student researchers because they were born deaf.

“My favorite part of the project is conducting experiments with deaf children in the planetarium,” Tyler wrote. “They get to try on the glasses and watch a movie with an interpreter on the screen of the glasses. They’re always thrilled and intrigued with what they’ve experienced. It makes me feel like what we are doing is worthwhile.”

By sheer coincidence, the only two deaf students ever to take Professor Jones' computer science class — Kei Ikeda and David Hampton — signed up just as the National Science Foundation funded Jones' signglasses research. Soon after, the Sorenson Impact Foundation provided funding to expand the scope of the project.

“Having a group of students who are fluent in sign language here at the university has been huge,” Jones said. “We got connected into that community of fluent sign language students and that opened a lot of doors for us.”

The BYU team tests the system during field trips by high school students from the Jean Massieu School for the Deaf. One finding from the tests is that the signer should be displayed in the center of one lens. That surprised the researchers, who had assumed participants would prefer video displayed at the top of the lens, the way Google Glass normally presents it. Instead, deaf participants preferred to look straight through the signer when returning their focus to the planetarium show.

The potential for this technology goes beyond planetarium shows. The team is also working with researchers at Georgia Tech to explore signglasses as a literacy tool.

“One idea is when you’re reading a book and come across a word that you don’t understand, you point at it, push a button to take a picture, some software figures out what word you’re pointing at and then sends the word to a dictionary and the dictionary sends a video definition back,” Jones said.
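The lookup idea Jones describes (photograph a word, recognize it, query a dictionary, play back a signed definition) could be sketched roughly as below. Every name here, including ASL_DICTIONARY, recognize_word, and the example URLs, is a hypothetical placeholder for illustration, not part of the BYU system:

```python
# Minimal sketch of the word-lookup pipeline, assuming a hypothetical
# ASL video dictionary service. A real system would use the glasses'
# camera, an OCR engine, and a hosted dictionary of signed definitions.
from typing import Optional

# Hypothetical mapping from English words to signed video definitions.
ASL_DICTIONARY = {
    "constellation": "https://example.org/asl/constellation.mp4",
    "planetarium": "https://example.org/asl/planetarium.mp4",
}

def recognize_word(photo: bytes) -> str:
    """Placeholder for the OCR step: find the word the reader points at."""
    raise NotImplementedError("would run OCR on the camera photo")

def lookup_sign_video(word: str) -> Optional[str]:
    """Send the recognized word to the dictionary; get a video URL back."""
    return ASL_DICTIONARY.get(word.strip().lower())

def on_button_press(word: str) -> str:
    """Simulate the pipeline after OCR has identified the word."""
    return lookup_sign_video(word) or "no definition available"
```

The interesting engineering questions are in the two placeholder steps: locating the word under the reader's fingertip in the photo, and building a dictionary large enough to cover what students actually read.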

For the rest of the story, visit BYU News.