- Google Glass left us wanting more, but there is now hope at the end of this tunnel, and the enterprise is the first to benefit
- Research into AR glasses is enhancing eye-tracking and positional awareness, bringing more precision within our field of view
- The path to progress: how the emerging technology is evolving and what the market currently has to offer
If you have read our article on tracking in VR, then you have a good grasp of the ins and outs of tracking with a Head Mounted Display (HMD), but when we shift over to Augmented Reality glasses things take on a new dimension. The concepts of 6DOF and 3DOF still apply, and the type of tracking used can likewise be broken down into inside-out and outside-in.
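To make the 3DOF/6DOF distinction concrete, here is a minimal sketch (the class names and fields are our own illustration, not taken from any headset SDK): a 3DOF tracker reports only head rotation, while a 6DOF tracker adds the translational position that inside-out or outside-in sensors must recover.

```python
from dataclasses import dataclass

@dataclass
class Pose3DOF:
    """Rotation-only tracking: the headset knows where you are looking,
    not where you have moved (yaw, pitch, roll in degrees)."""
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0

@dataclass
class Pose6DOF(Pose3DOF):
    """Adds positional tracking: translation in metres along x, y, z.
    Recovering these three extra values is what inside-out (on-device
    cameras/sensors) or outside-in (external sensors) tracking provides."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
```

Leaning forward changes `x`, `y`, `z` in a 6DOF pose but is invisible to a 3DOF tracker, which is why 3DOF-only devices feel static by comparison.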
What makes tracking in AR unique is that it presents a new level of interaction and, with it, new possibilities. Previous AR glasses implementations have been less than perfect, but recent developments show a lot of promise.
Our current take on AR glasses
Current AR glasses like Google Glass only present 2D images in a very limited field of view, which isn’t very engaging but was a good start and proof of concept. Companies like ODG are taking this further. ODG calls its new R8 and R9 “Consumer AR Smartglasses,” though with prices of $1,000 and $1,800 and fields of view of 40 and 50 degrees respectively, not everyone would agree with that characterization.
Regardless, the glasses do represent a firm step toward augmented reality and the consumer market, as both models are said to be equipped with optical-based inside-out positional tracking and are significantly cheaper than their predecessor, the R7, which was priced at $2,750. It seems the company plans to keep whittling prices down as it looks toward consumer adoption. Both smartglasses are based on Qualcomm’s powerful new Snapdragon 835 mobile processor, which is said to be well equipped for AR and VR use-cases.
The glasses will tap into Qualcomm’s Snapdragon VR SDK to achieve positional tracking capabilities; in the past, we’ve been impressed with the tracking of Qualcomm’s VR headset reference platform, and we hope to see the same positional tracking quality carry over to ODG’s new smartglasses. However, when asked how their positional tracking compared to that of HoloLens, ODG stated that the R8 and R9 weren’t built for the same level of tracking quality as HoloLens.
That leads us right into a quick look at Microsoft’s own offering in the AR glasses market. The new HoloLens 2 more than doubles the field of view of the original and reworks the way users interact with holograms. A new time-of-flight depth sensor lets you manipulate AR objects directly, the way you would their real-world counterparts. Microsoft has also added eye-tracking sensors to make interactions feel even more natural, and iris recognition will let users log in without passwords or other security checks.
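The principle behind a time-of-flight depth sensor is simple enough to sketch in a few lines (this is our own simplified illustration of the general technique, not Microsoft’s implementation): emit a light pulse, time how long the reflection takes to come back, and halve the round trip.

```python
# Time-of-flight depth: distance = speed of light * round-trip time / 2.
C = 299_792_458.0  # speed of light in a vacuum, metres per second

def tof_depth(round_trip_seconds: float) -> float:
    """Distance in metres to the surface that reflected the pulse."""
    return C * round_trip_seconds / 2.0
```

A hand half a metre from the sensor reflects the pulse in roughly 3.3 nanoseconds, which hints at why these sensors need very fast, specialized timing hardware to resolve hand-scale depth at all.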
The design of HoloLens itself has also been revamped, with a new mechanism for putting the device on that doesn’t require it to be adjusted first. Microsoft claims it has “accounted for the wide physiological variability in the size and shape of human heads by designing HoloLens 2 to comfortably adjust and fit almost anyone.”
The research being done
One obvious point to note about these two vendors is that both are concentrating on key functions such as eye-tracking, positional awareness, and gesture controls. So it’s no surprise that researchers at Dartmouth College have created battery-free eye-tracking glasses that promise a more realistic experience for augmented reality enthusiasts. The new technology improves player controls for gaming and allows for more accurate image displays; until now, high power consumption and cost have kept eye trackers out of augmented reality systems.
By using near-infrared lights and photodiodes, Dartmouth’s DartNets Lab has created an energy-efficient, wearable system that tracks rapid eye movements and allows hands-free input of system commands. The glasses, which can also help monitor human health, were introduced at MobiCom 2018 which took place in New Delhi, India.
According to the Dartmouth research team, existing wearable eye trackers fall short mainly because of the inability to match high tracking performance with low energy consumption. Most trackers use cameras to capture eye images, requiring intensive image processing and resulting in high costs and the need for clunky external battery packs. “We took a minimalist approach that really pays off in power use and form factor,” said Tianxing Li, a Ph.D. student at Dartmouth and author of the research paper. “The new system opens a wide range of uses for eye-tracking applications.”

To make the Dartmouth system work, researchers needed to detect the trajectory, velocity, and acceleration of the eye’s pupil without cameras. Near-infrared lights illuminate the eye from various directions while photodiodes sense the patterns of reflected light.
Those reflections are used to infer the pupil’s position and diameter in real time through a lightweight algorithm based on supervised learning.
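To get a feel for the general idea, here is a hedged sketch of how a supervised model could map photodiode readings to a pupil position. The diode count, the linear ridge-regression model, and all function names are our own illustrative assumptions, not Dartmouth’s actual algorithm; the point is that inference reduces to a single cheap matrix-vector product, which is why it can run on a low-power wearable.

```python
import numpy as np

# Hypothetical setup: N_DIODES photodiodes each report one reflected-light
# intensity; a supervised linear model maps that vector to a pupil (x, y).
N_DIODES = 8

def fit_pupil_model(intensities, pupil_xy, ridge=1e-3):
    """Learn weights mapping photodiode readings -> pupil position.

    intensities : (n_samples, N_DIODES) calibration readings
    pupil_xy    : (n_samples, 2) ground-truth pupil centres
    Solves ridge-regularised least squares in closed form.
    """
    X = np.hstack([intensities, np.ones((len(intensities), 1))])  # bias term
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ pupil_xy)

def predict_pupil(W, reading):
    """Infer pupil (x, y) from one photodiode reading vector."""
    return np.append(reading, 1.0) @ W
```

Calibration would come from having the user fixate known on-screen targets; after that, every frame costs only one small matrix multiply instead of camera capture plus image processing.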
But before we get caught up in the new research, let’s not forget that Apple still has a piece of this pie. At least that was the case until recent rumors claimed that Apple has killed its AR glasses due to alleged technical problems, a sign, perhaps, of product design troubles that have been brewing at Apple for a long time.
The report came on 12 July 2019 from Digitimes, which has a mixed track record with its sources in component and manufacturing companies. It contradicts analyst Ming-Chi Kuo, who in March claimed that supply sources confirmed Apple could start producing AR glasses as early as the end of 2019. Now Digitimes, also citing supply chain sources, says otherwise: the outlet claims Apple has disbanded the AR team and halted its fabled AR glasses project for the time being. It seems the technology is nowhere near ready just yet.
Still developing technology
This all leaves AR glasses in a unique position: right now it is a technology worth developing, as we can see from companies pushing forward, but that development seems mostly confined to the enterprise space, leaving a huge market open for any contender that can bring a viable, reasonably priced product to consumers.
This is not to say that there are currently no good options to choose from: the Vuzix Blade, produced by Vuzix, is considered the best pair of smart glasses available in the consumer market right now. It is the first pair of smart glasses to ship with Alexa, Amazon’s virtual assistant. The eyewear uses waveguide-based, see-through optics to project a full-color display in front of the right lens, utilizing Digital Light Processing (DLP) technology. At the heart of the Blade is a quad-core ARM CPU, which works in conjunction with the Android OS.
So should you rush out to get a pair of AR glasses? The short answer is no: the technology isn’t there yet, and major advancements seem restricted to the enterprise space for the time being. Unless you are an enterprise user working in an environment where AR can benefit your production plans, it’s best to just keep track of the technology for now.