
When kids see planetarium shows for the first time, they're often overwhelmed by the vastness of space and dazzled by shimmering stars and whimsical constellations.

For Tyler Foulger, however, who is deaf, planetarium visits haven't exactly been magical.

"Even with an interpreter in the room, it is still difficult to have a good experience since I have to continually switch my attention back and forth between the interpreter and the planetarium show, causing me to miss parts of the narration-show," Foulger, now a senior at Brigham Young University, said in an email.

To help deaf children get more out of similar experiences, a team at BYU has developed a system to project sign-language narration onto some types of glasses, including Google Glass. They're calling it "signglasses."

Foulger was excited to work on the project with other students and BYU computer science professor Michael Jones. With the technology, students can watch planetarium shows and a sign-language interpreter simultaneously.

"The idea is that when a deaf child is in a classroom or on a field trip, they can struggle to see the sign language and what's being talked about," Jones said. "The deaf child can either look at what's being talked about or what's being said, but if they're not in the same place, it's difficult to look at both."

Jones said the idea for the technology came from BYU's planetarium director, Jeannette Lawler. When deaf children visited, she had to intermittently raise and lower the planetarium lights so an interpreter could sign, he explained. Putting written captions on the screen wasn't an ideal alternative for children either, he said.

Jones and his team started testing the technology about a year and a half ago with children from Jean Massieu School of the Deaf in Millcreek.

The team turned up some surprising findings, including that the children preferred the interpreter be displayed at the center of the glasses rather than off to the side. The kids preferred to look through the interpreter so they didn't have to move their eyes back and forth or refocus.
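For readers curious what that finding might look like in practice, here is a minimal sketch, not the BYU team's actual implementation, of compositing a semi-transparent interpreter video at the center of a wearer's view so the scene remains visible "through" the interpreter. It assumes the OpenCV library is available, and the file names "show.mp4" and "interpreter.mp4" are hypothetical stand-ins for the planetarium footage and the interpreter feed.

```python
# Hypothetical sketch: center a semi-transparent sign-language interpreter
# over the viewer's scene, echoing the children's preference to "look through"
# the interpreter rather than glance to the side.
import cv2

ALPHA = 0.6  # interpreter opacity; the scene stays visible underneath

show = cv2.VideoCapture("show.mp4")          # stand-in for the planetarium view
interp = cv2.VideoCapture("interpreter.mp4")  # stand-in for the interpreter feed

while True:
    ok_bg, background = show.read()
    ok_fg, interpreter = interp.read()
    if not (ok_bg and ok_fg):
        break

    # Scale the interpreter to roughly a third of the frame height.
    h, w = background.shape[:2]
    target_h = h // 3
    target_w = int(interpreter.shape[1] * target_h / interpreter.shape[0])
    interpreter = cv2.resize(interpreter, (target_w, target_h))

    # Blend the interpreter into the center of the frame, not a corner.
    y0 = (h - target_h) // 2
    x0 = (w - target_w) // 2
    roi = background[y0:y0 + target_h, x0:x0 + target_w]
    background[y0:y0 + target_h, x0:x0 + target_w] = cv2.addWeighted(
        interpreter, ALPHA, roi, 1 - ALPHA, 0)

    cv2.imshow("signglasses mockup", background)
    if cv2.waitKey(30) & 0xFF == ord("q"):
        break

show.release()
interp.release()
cv2.destroyAllWindows()
```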

David Oyler, a science teacher at the school, said his students enjoyed contributing to the technology's development by testing it out and providing feedback. Plus, it enhanced their experience at the planetarium.

"It meant the world," Oyler said. "A lot of them don't have any opportunities like this. As a science teacher, it excites me to no end to be able to provide these opportunities."

Jones plans to present a paper with the team's findings at the Interaction Design and Children conference in Denmark later this month.

He said he also plans to continue working on and studying the technology with a focus on how it affects learning and comprehension.

Jones said he believes the technology could also help deaf students when on field trips to museums or historical sites. It could even help in the classroom when they are trying to perform hands-on activities such as dissecting a frog or an owl pellet, he said.

Without such technology, Foulger said, it can be tough for deaf students to keep up with their hearing peers because they must constantly switch their attention back and forth to follow what's happening.

"It will allow them to save valuable time that is usually lost when continually switching his or her attention between a real interpreter and the teacher," Foulger said of signglasses. "It probably still won't provide completely equal ground, but I believe it will come pretty close to bridging the gap between deaf and hearing students."

Study co-authors include Lawler; Eric Hintz, a BYU physics and astronomy professor; Nathan Bench, a BYU post-doctoral fellow in computer science; Fred Mangrubang, of Gallaudet University; and Mallory Trullender of Mantua Elementary School in Fairfax, Va.