Google X lab announces augmented reality glasses

Google has announced Project Glass.

NY Times – The Google-made augmented reality glasses will be able to stream information to the wearer’s eyeballs in real time. They are expected “to cost around the price of current smartphones,” or $250 to $600. Google said the glasses will be Android-based and will include a small screen that sits a few inches from the wearer’s eye. They will also have a 3G or 4G data connection and a number of sensors, including motion sensors and GPS.

These glasses have a front-facing camera that gathers information for augmented reality apps and can also take pictures. The spied prototype includes a flash, perhaps for help at night, or maybe just as a way to take better photos. The camera is extremely small and likely only a few megapixels.
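To make the AR angle concrete, here is a minimal sketch, assuming a standard Android camera stack of this era, of how an app might pull preview frames from such a camera for on-device analysis. The ArCameraFeed class and the analyzeFrame hook are hypothetical names for illustration, not anything confirmed about Google’s software.

```java
import java.io.IOException;

import android.graphics.SurfaceTexture;
import android.hardware.Camera;

// Sketch: grab preview frames from a head-mounted camera and hand
// them to an AR pipeline. "analyzeFrame" is a hypothetical hook.
// Requires the CAMERA permission in the app manifest.
public class ArCameraFeed {
    private Camera camera;

    public void start() throws IOException {
        camera = Camera.open(); // default outward-facing camera
        // Dummy texture so the preview pipeline runs without an on-screen view.
        camera.setPreviewTexture(new SurfaceTexture(0));
        camera.setPreviewCallback(new Camera.PreviewCallback() {
            @Override
            public void onPreviewFrame(byte[] data, Camera cam) {
                analyzeFrame(data); // e.g. feature detection for overlays
            }
        });
        camera.startPreview();
    }

    public void stop() {
        camera.setPreviewCallback(null);
        camera.stopPreview();
        camera.release();
    }

    private void analyzeFrame(byte[] nv21Data) {
        // Placeholder: preview frames arrive in NV21 format by default;
        // a real AR app would convert and run vision processing here.
    }
}
```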

The navigation system currently used is head tilting to scroll and click. We are told it is very quick to learn, and once the user is adept, it becomes second nature and is almost unnoticeable to onlookers.
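As an illustration of how tilt-to-scroll could work on Android hardware, below is a minimal sketch using the platform’s standard gyroscope APIs. The HeadTiltNavigator class, the thresholds, and the onScroll/onClick hooks are illustrative assumptions, not details Google has disclosed.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch: interpret pitch-axis rotation from the gyroscope as scroll
// gestures, and a sharp nod as a click. Thresholds are illustrative
// guesses, not Google's values.
public class HeadTiltNavigator implements SensorEventListener {
    private static final float SCROLL_RATE_RAD_S = 0.5f; // gentle tilt
    private static final float CLICK_RATE_RAD_S  = 3.0f; // sharp nod

    private final SensorManager sensorManager;

    public HeadTiltNavigator(SensorManager sm) {
        this.sensorManager = sm;
    }

    public void start() {
        Sensor gyro = sensorManager.getDefaultSensor(Sensor.TYPE_GYROSCOPE);
        sensorManager.registerListener(this, gyro, SensorManager.SENSOR_DELAY_GAME);
    }

    public void stop() {
        sensorManager.unregisterListener(this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        float pitchRate = event.values[0]; // angular speed (rad/s) around x axis
        if (Math.abs(pitchRate) > CLICK_RATE_RAD_S) {
            onClick();
        } else if (Math.abs(pitchRate) > SCROLL_RATE_RAD_S) {
            onScroll(pitchRate > 0 ? +1 : -1);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private void onScroll(int direction) { /* move selection up or down */ }
    private void onClick() { /* activate the selected item */ }
}
```

A real implementation would presumably smooth and calibrate the signal per user, since raw gyroscope readings are noisy and head movement varies from person to person.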

I/O on the glasses will also include voice input and output, and we are told the CPU/RAM/storage hardware is roughly equivalent to a generation-old Android smartphone. As a guess, we would speculate something like a 1GHz ARM Cortex-A8, 256MB of RAM and 8GB of storage. In any case, it will also function as a smartphone.
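Since the device is said to be Android-based, voice I/O would presumably ride on the platform’s standard speech services. Here is a minimal sketch wiring speech recognition to synthesized speech with the generic Android APIs; the VoiceIo class is a hypothetical example, not a description of the Glass software.

```java
import java.util.ArrayList;
import java.util.Locale;

import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import android.speech.tts.TextToSpeech;

// Sketch: voice in via SpeechRecognizer, voice out via TextToSpeech.
// Generic Android APIs of the period; requires the RECORD_AUDIO permission.
public class VoiceIo {
    private final SpeechRecognizer recognizer;
    private final TextToSpeech tts;

    public VoiceIo(Context context) {
        recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        tts = new TextToSpeech(context, new TextToSpeech.OnInitListener() {
            @Override
            public void onInit(int status) {
                if (status == TextToSpeech.SUCCESS) {
                    tts.setLanguage(Locale.US);
                }
            }
        });
    }

    public void listen() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.setRecognitionListener(new RecognitionListener() {
            @Override
            public void onResults(Bundle results) {
                ArrayList<String> matches = results
                        .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
                if (matches != null && !matches.isEmpty()) {
                    speak("You said: " + matches.get(0)); // echo best match
                }
            }
            // Unused callbacks left empty for this sketch.
            @Override public void onReadyForSpeech(Bundle params) { }
            @Override public void onBeginningOfSpeech() { }
            @Override public void onRmsChanged(float rmsdB) { }
            @Override public void onBufferReceived(byte[] buffer) { }
            @Override public void onEndOfSpeech() { }
            @Override public void onError(int error) { }
            @Override public void onPartialResults(Bundle partialResults) { }
            @Override public void onEvent(int eventType, Bundle params) { }
        });
        recognizer.startListening(intent);
    }

    public void speak(String text) {
        tts.speak(text, TextToSpeech.QUEUE_FLUSH, null);
    }
}
```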

Technology Review – Mark Changizi, an evolutionary neurobiologist and author of The Vision Revolution, is optimistic about Project Glass.

“The graphics are not going to look like they’re floating out in front of you, because it’s only being displayed to one eye,” Changizi explains. Instead, the experience would be similar to “seeing through” the image of your own nose, which hovers semi-transparently in the periphery of your visual field at all times (even though you rarely pay attention to it). “Having non-corresponding images coming from each eye is actually something we are very much used to already,” Changizi says. “It’s not uncomfortable.” So Google’s one-eyed screen design seems biologically savvy.

Then again, Changizi continues, “they’re presenting text to you, and in order to discern that kind of detail, you need to have it in front of your fovea”—the tiny, central part of your visual field. “That’s typically *not* where we’re used to ‘seeing through’ parts of our own bodies, like our noses.” Which means that those crisp, instant-message-like alerts won’t be as simple to render as the video makes them seem.

“The more natural place to put [these interface elements], especially if it’s not text, is in the parts of your visual field where your face-parts already are,” Changizi says. This could be in the left and right periphery, where the ghost-image of your nose resides, or in the upper or bottom edges of your visual field, where you can see your cheeks when you smile or your brow when you frown. “There could be very broad geometrical or textural patterns that you could perceive vividly without having to literally ‘look at’ them,” he says. This would also make the digital overlays “feel like part of your own body,” rather than “pasted on” over the real world in an artificial or disorienting way. That experience might feel more like “sensing” the digital interface semi-subconsciously, rather than looking at it directly as if it were an iPhone screen.
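As a toy illustration of that layout idea, the sketch below maps Changizi’s “face-part” regions to anchor positions at the edges of a display. The regions and margins are purely hypothetical assumptions, chosen only to mirror where the nose, cheeks, and brow sit in the visual field.

```java
// Toy layout helper: anchor ambient overlay glyphs at the screen
// regions Changizi associates with "face parts" (nose in the left
// and right periphery, brow at the top, cheeks at the bottom).
// Regions and margins are illustrative assumptions only.
public class PeripheralLayout {

    public enum Region { NOSE_LEFT, NOSE_RIGHT, BROW, CHEEKS }

    /** Returns {x, y} pixel coordinates for a glyph in the given region. */
    public static int[] anchor(Region region, int screenW, int screenH) {
        int margin = screenW / 20; // keep glyphs hugging the edge
        switch (region) {
            case NOSE_LEFT:  return new int[] { margin, screenH / 2 };
            case NOSE_RIGHT: return new int[] { screenW - margin, screenH / 2 };
            case BROW:       return new int[] { screenW / 2, margin };
            case CHEEKS:     return new int[] { screenW / 2, screenH - margin };
            default:         throw new IllegalArgumentException("unknown region");
        }
    }
}
```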

A Google employee (who preferred not to be identified) confirmed to Technology Review that “the team is involved in many kinds of experimentation, and some of that will involve outdoor testing,” but wouldn’t provide any details about what that testing has revealed about the perceptual aspects of the user experience. Clearly, the concept video is meant to convey the basic premise of Project Glass, rather than render the user experience in a biologically accurate way.

But if Google really does plan to bring this product to market before the end of 2012, as it has claimed, it is exactly these psychological and phenomenological details that will have to be examined closely.

For his part, Changizi is optimistic. “Right now we have everyone walking around focusing their vision on tiny four-inch screens held in their hands, bumping into each other,” he says. “Whatever Google does with Project Glass, it’ll surely be an improvement over that.”
