A team of researchers has developed a smartphone application that projects a magnified view of the smartphone screen onto Google Glass; users navigate with head movements to bring the corresponding portion of the magnified screen into view.
They have shown that the technology can potentially benefit low-vision users, many of whom find the smartphone's built-in zoom feature difficult to use because of the loss of context.
“When people with low visual acuity zoom in on their smartphones, they see only a small portion of the screen, and it’s difficult for them to navigate around; they don’t know whether the current position is in the center of the screen or in the corner of the screen,” said senior author Gang Luo.
Luo added that the application transfers the image of the smartphone screen to Google Glass and lets users control which portion of the screen they see by moving their heads to scan, giving them a strong sense of orientation.
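The article does not publish the app's internals, but the head-motion navigation it describes can be sketched as a mapping from head orientation to the position of a screen-sized window panning over the magnified image. The function name, angle limits, and linear mapping below are all assumptions for illustration, not the researchers' actual implementation:

```python
def viewport_origin(yaw_deg, pitch_deg, screen_w, screen_h, zoom,
                    max_yaw=30.0, max_pitch=20.0):
    """Map head orientation (degrees) to the top-left corner of the
    visible window over the zoomed image.

    The magnified image is (zoom * screen_w) x (zoom * screen_h) pixels,
    and a screen-sized window pans over it, so the pan range along each
    axis is (zoom - 1) times the screen dimension. Angles beyond the
    (hypothetical) comfort limits are clamped to the image edge.
    """
    pan_w = (zoom - 1.0) * screen_w   # horizontal pan range in pixels
    pan_h = (zoom - 1.0) * screen_h   # vertical pan range in pixels
    # Normalize each angle to [0, 1]; 0.5 means head centered.
    tx = min(max((yaw_deg + max_yaw) / (2 * max_yaw), 0.0), 1.0)
    ty = min(max((pitch_deg + max_pitch) / (2 * max_pitch), 0.0), 1.0)
    return (tx * pan_w, ty * pan_h)


# Head centered: the window sits over the middle of the magnified image.
print(viewport_origin(0, 0, 1080, 1920, 2.0))    # (540.0, 960.0)
# Head turned fully right and down: window at the far corner.
print(viewport_origin(30, 20, 1080, 1920, 2.0))  # (1080.0, 1920.0)
```

Because the mapping covers the whole magnified image within a fixed head-rotation range, the user's head pose always corresponds to an absolute screen location, which is one plausible reading of the "sense of orientation" Luo describes.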
In an evaluation of the new technology, the researchers compared two groups of research subjects, one using the head-motion Google Glass application and the other using the built-in zoom feature on a smartphone, and measured the time each took to complete a set of tasks. The head-based navigation method reduced the average trial time by about 28 percent compared with conventional manual scrolling.
As next steps for the project, the researchers would like to incorporate more gestures on the Google Glass for interacting with smartphones. They would also like to compare the effectiveness of head-motion-based navigation with other commonly used smartphone accessibility features, such as voice-based navigation.