In The First, a new show from Hulu about our first mission to Mars, the actor Sean Penn wears very cool glasses. Looking like slim-rimmed Ray-Bans, his shades tint to provide him with an immersive high-definition screen. Penn's glasses create huge video walls, seemingly floating in the air before him, which can stream high-definition video, CAD plans of his Mars-bound vehicle, and footage that he can share at will with others. This rich, immersive AR experience is all driven by voice and gesture (of course).
Of all the tech on display in the show, these glasses seem the most far-fetched. It's set just a few years in the future, after all. Happy to suspend disbelief and accept all-electric Range Rovers, Martian biospheres, and holographic Skype, we recoil at the simplicity and slender dimensions of the VR Ray-Bans. Impossible! Packing an HD display into slim rims, along with all the processing power required to generate AR or VR, is constrained by basic physics. The laws of optics wouldn't allow it, no matter how radical the breakthroughs in contemporary materials science.
But it turns out Sean Penn may be right. (Is Sean Penn ever really wrong?) In a paper that just popped up in the journal Optica (a fairly classy peer-reviewed journal, not known for sci-fi fantasy), scientists at the CER centre in Grenoble, France (where the French military do a lot of their secret, hush-hush work) revealed a concept that could make Sean's fantasy a reality.
The VR, AR and MR systems currently available suffer from restrictions on field of view – there is only so much one can pack into a typical near-eye frame. Despite strides in algorithmic emulation, a headset has to be either a bulky smart helmet, in order to provide an image that covers your full field of view, or, if slim, confine the action to a tight 'window' of activity. Wear a helmet and see everything, or wear eyeglasses and get AR through a toilet roll.
The team in France have come up with a breakthrough idea to blow those restrictions away. It centres on projecting images directly onto your retina. By projecting light into your eyes, rather than out to a 'screen' that floats in front of you, they have provided a path to overlaying artificial reality across everything you are looking at. Without the helmet. They have ditched the idea of an intermediary lens that angles light into the eye (which other labs have been playing with) and instead used some clever properties of laser light itself, together with the eye's natural ability to focus, to create holographic images from phased multi-point laser arrays. Imagine a small grid of red and green laser diodes embedded within a translucent optic film (like a smart glass), firing lasers onto your retina with phase offsets tuned so the interference actually generates three dimensions. Yes – holograms. Yes – lasers directly into your eyes (worth it).
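The core trick of a phased array can be sketched in a few lines. This is a toy scalar-wave model, not the paper's actual method: the grid size, spacing, wavelength and focal point below are all illustrative assumptions. The idea is simply that if each emitter fires with a phase chosen to cancel its propagation delay to a target point, every wavefront arrives there in step and interferes constructively, forming a bright spot; paint enough such spots in 3D and you have a hologram.

```python
import numpy as np

# Toy sketch of phased-array focusing (illustrative, not the paper's design).
# Each laser emitter in a small grid gets a phase offset chosen so that all
# wavefronts arrive in phase at one chosen 3D focal point.

WAVELENGTH = 532e-9            # green laser light, in metres
k = 2 * np.pi / WAVELENGTH     # wavenumber

# An 8 x 8 grid of point emitters, 0.5 mm pitch, sitting in the z = 0 plane
xs = np.arange(8) * 0.5e-3
emitters = np.array([[x, y, 0.0] for x in xs for y in xs])

# A target point 25 mm in front of the grid, roughly over its centre
focus = np.array([1.75e-3, 1.75e-3, 25e-3])

def intensity_at(point, phases):
    """Scalar-wave intensity at `point` when the emitters fire with `phases`."""
    d = np.linalg.norm(emitters - point, axis=1)       # emitter-to-point distances
    field = np.sum(np.exp(1j * (k * d + phases)) / d)  # coherent sum of all waves
    return np.abs(field) ** 2

# Phases that exactly cancel each emitter's propagation delay to the focus
focusing_phases = -k * np.linalg.norm(emitters - focus, axis=1)

focused = intensity_at(focus, focusing_phases)          # all waves in phase
unfocused = intensity_at(focus, np.zeros(len(emitters)))  # no phase steering
print(focused > unfocused)  # steered beam is far brighter at the target
```

The same calculation, run per-point across a cloud of target points (and fast enough to refresh), is what lets a flat grid of diodes sculpt light into an apparently three-dimensional image.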
From an investment perspective, these sorts of innovations open interesting lines of research. If one accepts that near-eye AR/VR/MR is possible at a consumer level and will find general acceptance in the next five years, where should the smart money coalesce now?
When Steve Jobs stood up at that first iPhone launch and swiped to unlock, the crowd gasped. Imagine the roar if, with a wave, the back wall fell away and a full holographic, immersive reality ride kicked in. Knowing the revolution is coming allows us to invest early in the ancillary technologies and platforms that will empower, and profit from, the new interactive reality. One I like is Tagspace: smart antipodeans with a good handle on tagging technology for AR. Hypersurfaces allows that interaction with, well, anything really, and with those fancy VR glasses.
The best move may simply be to sit back, relax, and imagine how your world might be better if, on command, the real could drop away and be replaced with the virtual. While wearing nothing more cumbersome than a pair of Aviators.