Google recently unveiled its augmented reality glasses under a project codenamed "Project Glass". This futuristic example of wearable computing is believed to be powered by Android software and, according to a report in the Telegraph, is said to include a 3G or 4G data connection, motion sensors and GPS navigation.
The product has reportedly been developed by a group of researchers from the Google X Special Projects division, the same team that developed the self-driving car and is now working on a space elevator.
To begin with, check out this short video, released by Google, which shows an individual wearing the glasses and taking photographs, checking the weather forecast, getting directions and placing a video call, all using voice-activated icons that appear over the user's field of vision. Google also posted photographs of people wearing the device - except these are not really glasses, at least not in the sense we know them. The device lacks full lenses and has only a small, rectangular piece of glass (a head-up display) over the right eye.
Experts Speak on Project Glass
A report by Wired quoted two researchers - Pranav Mistry, an MIT Media Lab researcher (and one of the investors in the SixthSense wearable computing system) and Blair MacIntyre, the Director of Augmented Environments Lab at Georgia Tech - as criticising the video and Google's presentation of Project Glass.
According to Mistry, the small screen seen in the photographs could not possibly provide the kind of experience the product appears to promise in the above video.
"You could not do AR with a display like this. The small field of view, and placement off to the side, would result in an experience where the content is rarely on the display and hard to discover and interact with. But it is a fine size and structure for a small head-up display," argued MacIntyre. He also believes Google may have set the bar too high for itself.
"In one simple fake video, Google has created a level of over-hype and over-expectation that their hardware cannot possibly live up to," he added, continuing, "Some of what I find a little annoying about the video is that they staged all these things so that when these notifications come to the middle of the screen, the person is looking at the thing it's referring to. Is it augmented reality, or is it location-based notifications? It is going to generate ideas in people and expectations that just might not match."
Meanwhile, Mistry believes the glasses will not reach the market for at least another two years, even if Google is able to deliver the goods.
"Current head-up displays (HUDs) utilise a fixed lens distance of two feet. For true augmented reality, the display would have to dynamically focus, which would require additional hardware on the glasses to read your eye," explained Mistry.
Finally, MacIntyre brings another issue to light, one he has experienced during his research.
"It is difficult to create a transparent display that renders viewable overlays both indoors and outdoors. The brightness difference between inside your bright office and outside on a bright day is multiple orders of magnitude," he explained.
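To put MacIntyre's "multiple orders of magnitude" claim in rough numbers (these are typical, assumed illuminance figures, not values from his research): a well-lit indoor office is often around 500 lux, while outdoors on a sunny day can exceed 100,000 lux. A quick calculation sketches the gap a transparent display would have to span:

```python
import math

# Typical illuminance values in lux (assumed, order-of-magnitude figures)
office_lux = 500          # well-lit indoor office
sunny_day_lux = 100_000   # outdoors in direct sunlight

ratio = sunny_day_lux / office_lux
orders_of_magnitude = math.log10(ratio)

print(f"Brightness ratio: {ratio:.0f}x "
      f"({orders_of_magnitude:.1f} orders of magnitude)")
```

Even with these conservative figures, the display must remain legible across a brightness range of a couple of hundred times, which is the engineering difficulty MacIntyre describes.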