How Augmented Reality Will Work
What's wrong with this picture? Not much, actually. By 2010, most everybody who now carries a cell phone in their pocket will wear cell phone eyeglasses that will be their interface to augmented reality (AR). These glasses will overlay computer information on our field of view -- and it will look something like what we see in the picture above.
Notice that the overlaid image is monochrome green. If consumers accept monochrome for their AR experience, we are much closer to achieving this than my forecast of 2010. But for now, let's put this 4 1/2 years out, and give the network providers and software developers (and light source manufacturers) plenty of time to have the infrastructure primed and ready for tens of millions of simultaneous AR users.
So far no one has challenged me on the emergence of augmented reality as a major market force sometime in the next few years. And it's not just little old me; Gartner's forecast says that by 2014, more than 30 percent of mobile workers will be using augmented reality (from this pretty interesting article). I'm not sure where they get 2014, or 30 percent -- but they've got the right idea. This thing is big, and it's coming, and we've all just got to get used to it. Some of us may decide to get ahead of the trend entirely and take equity positions in those companies poised to benefit the most.
What are the types of AR services we are most likely to see? It's not too hard to forecast some of the initial applications. Just like the mock-up AR image above, it is all about location: knowing where you are, and knowing what you're looking at. So the overlay can tell you that you are heading north on Haven St., approaching Sharon's Bridal. Additional information about the shops will be included -- the example lists the date of establishment, but I don't think people are going to care about that so much. I think it will be about tying the inventory of a store to a personalized watch list of favorite products, a shopping list, a 'to do' list, and probably a computer-generated list of 'things you might like' a la Amazon.com. Other types of ancillary info might include how long the wait is to be seated at nearby restaurants, the menus and prices, reviews, and so on.
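To make that a little more concrete, here is a rough sketch of the kind of matching I have in mind. Everything in it -- the data structures, the function names, the crude proximity check -- is my own illustration of the idea, not any announced service or API:

```python
from dataclasses import dataclass, field

@dataclass
class Shop:
    name: str
    lat: float
    lon: float
    inventory: set = field(default_factory=set)

@dataclass
class Shopper:
    shopping_list: set
    watch_list: set

def nearby(shops, lat, lon, radius_deg=0.001):
    """Crude proximity filter; a real service would use proper geo-indexing."""
    return [s for s in shops
            if abs(s.lat - lat) < radius_deg and abs(s.lon - lon) < radius_deg]

def overlay_labels(shops, user, lat, lon):
    """Build the overlay text for shops near the user's current position."""
    labels = []
    for shop in nearby(shops, lat, lon):
        # Cross-reference the store's inventory with the user's personal lists.
        hits = (user.shopping_list | user.watch_list) & shop.inventory
        label = shop.name
        if hits:
            label += " -- carries: " + ", ".join(sorted(hits))
        labels.append(label)
    return labels

# Example: walking up Haven St., approaching Sharon's Bridal (coordinates made up).
shops = [Shop("Sharon's Bridal", 42.1230, -71.1780, {"veil", "tuxedo rental"})]
me = Shopper(shopping_list={"tuxedo rental"}, watch_list={"veil"})
print(overlay_labels(shops, me, 42.1231, -71.1781))
```

The point is simply that once the service knows your position and your lists, tying the two together is straightforward bookkeeping; the hard parts are the display hardware and the network infrastructure behind it.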
There will also be a news crawl like we're all used to on CNN -- but it will be customized to display only items of interest. I'll be sure to be informed of the latest free agent signings by the Patriots -- and of course any Microvision news as it happens! Probably the name of whatever song is playing on your terabyte iPod will be displayed as well. And there may very well be an Apple logo on the AR eyeglasses themselves.
The advantages are obvious. Wherever I find myself, I'll know where I am. I'll have a layer of information (in IT, we call this a 'metadata layer' -- seems like an appropriate term for an AR information overlay, too) tailored to my needs that tells me everything I could need to know about where I am, and helps me do whatever it is that I'm doing faster, with less time wasted. People will be more likely to explore new places and new parts of town if they have the same detailed information as lifelong locals. There will be massive efficiency gains in the execution of daily tasks -- and on the scale of tens of millions of users, those gains will add up to measurable economic impact. Not to mention the value of the subscription fees to companies like Verizon and Sprint, who will run the AR networks that handle your metadata layer requests. And the opportunity for video gaming to move from the living room, to the handheld, and into the metadata layer could be measured in the billions of dollars.
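For what it's worth, here is how I picture one of those metadata layer requests going over the wire -- again, purely a made-up sketch to illustrate the idea, not anything Verizon, Sprint, or anyone else has specified. The glasses periodically report where you are and what you care about, and the network answers with the overlay items to draw:

```python
import json

# Hypothetical request the glasses might send to the carrier's metadata-layer service.
# All field names here are invented for illustration.
request = {
    "subscriber": "example-user-123",
    "position": {"lat": 42.1231, "lon": -71.1781, "heading_deg": 0},  # 0 = due north
    "interests": ["Patriots free agency", "Microvision"],
    "now_playing": "track-id-8675309",  # so the layer can label the song, too
}

# Hypothetical response: each overlay item is a bit of text plus a hint
# about where in the field of view to anchor it.
response = {
    "overlay_items": [
        {"text": "Heading north on Haven St.", "anchor": "compass"},
        {"text": "Sharon's Bridal -- 2 items from your watch list in stock", "anchor": "storefront"},
        {"text": "Patriots sign free-agent cornerback", "anchor": "ticker"},
    ]
}

print(json.dumps(request, indent=2))
for item in response["overlay_items"]:
    print(item["anchor"], "->", item["text"])
```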
It's not hard to see all this coming, with Human Pac Man, GPS-enabled phones and software, and all the other rapidly emerging trends in wireless devices.
Personalized mobile experiences, based on who you are, where you are, and what you're doing. Information relevant to you, available wherever you go. It's coming. And Microvision will provide the display engine that powers these tens of millions of AR glasses, thanks to its unique combination of capabilities: see-through, high brightness and resolution, low power consumption, and no physical screen. Just data and information from your personalized metadata layer, hovering over the places and objects of interest you see in your daily life.