The metaverse has taken over as the construct for defining our future connected lives. And though it’s still ill-defined, there’s consensus that the next era of computing will be more graphically dimensional. That maps to fully-immersive VR on one end and real-world AR interaction on the other.
In both cases, there will be purpose-built tech stacks that develop — just like any other computing paradigm. And though the spotlight tends to shine brightest on user-facing parts of that stack like hardware and apps, the heavy lifting is done behind the scenes by enabling tech.
This is where VividQ lives. And the portion of the tech stack where it focuses is AR display technology. Here, many moving parts and methodologies are involved in getting 3D content to render in line-of-sight and world-immersive ways — everything from depth sensors to waveguides.
Based on deep academic pedigree and its go-to-market thesis, VividQ has planted its flag in the elusive art of holography for AR glasses and other touchpoints. Specifically, computer-generated holography (CGH) involves projecting images so they appear at the correct distance.
“CGH allows us to directly engineer light to create 3D images with all the same visual information as a real object or scene,” said VividQ CEO Darran Milne. “This gives virtual objects the depth cues needed to place them correctly in the real world, which is essential for a complete AR experience.”
The Cadillac of Display
Going deeper on CGH, it’s been called “the Cadillac of display” by Google XR hardware director Bernard Kress. One of its benefits is resolving the vergence-accommodation conflict (VAC) — a core issue in AR glasses where a fixed focal length makes close-up images lose focus.
As background, when wearing AR glasses, your eyes tend to focus on real objects in view. Therefore, any virtual objects that are not rendered at the correct distance will appear fuzzy and low resolution. This “focus mismatch” makes it difficult to read virtual text or discern fine details.
Generally, this isn’t an issue for AR glasses that aim for everyday use cases and render digital objects 3+ meters away. But VAC and focus mismatch rule out other valuable use cases (think: collaborative design and tabletop gaming) that require examining digital objects up close.
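To make the focus mismatch concrete, optical defocus can be expressed in diopters (inverse meters). The sketch below is purely illustrative, with assumed distances rather than the specs of any particular headset: a display with a fixed 2 m focal plane leaves a tabletop object at 0.5 m badly out of focus, while a far-field object barely suffers.

```python
# Illustrative sketch of focus mismatch between a fixed focal plane
# (typical of non-holographic AR glasses) and the distance at which a
# virtual object is meant to appear. All distances are assumed values.

def focus_mismatch_diopters(focal_plane_m: float, object_m: float) -> float:
    """Defocus is the difference between the two distances in diopters (1/m)."""
    return abs(1.0 / object_m - 1.0 / focal_plane_m)

# Tabletop object at 0.5 m viewed through glasses focused at 2 m:
near = focus_mismatch_diopters(focal_plane_m=2.0, object_m=0.5)  # 1.5 D: noticeably blurry

# Object at 3 m: the mismatch shrinks toward zero, which is why
# far-field AR use cases largely avoid the problem.
far = focus_mismatch_diopters(focal_plane_m=2.0, object_m=3.0)   # ~0.17 D

print(f"near: {near:.2f} D, far: {far:.2f} D")
```

Mismatches above roughly a quarter diopter are generally treated as perceptible, which is why a fixed focal plane is tolerable for far-field AR but not for close-up work.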
CGH gets around these issues, but it isn’t a slam dunk for one simple reason: it’s really hard to do. VividQ — again due to focus and core competency — has cracked the code on challenges like reducing CGH’s computational overhead and rendering it through a replicating waveguide.
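To see why it’s hard, consider the textbook starting point for CGH (not VividQ’s proprietary pipeline; the wavelength, pixel pitch, and resolution below are assumed values): even reconstructing a single point of light at a chosen depth means computing a phase value at every pixel of a spatial light modulator, and a real scene multiplies that work across millions of points, which is where the computational overhead comes from.

```python
import numpy as np

# Textbook-style illustration of the core CGH idea: compute the phase a
# spatial light modulator (SLM) would display so that diffracted laser
# light converges to a point at depth z. All parameters are assumptions.

wavelength = 520e-9   # green laser, meters
pitch = 8e-6          # SLM pixel pitch, meters
n = 512               # SLM resolution (n x n pixels)
z = 0.5               # desired image depth, meters

# Pixel coordinates on the modulator plane, centered on the optical axis.
coords = (np.arange(n) - n / 2) * pitch
x, y = np.meshgrid(coords, coords)

# Spherical wavefront from a point source at (0, 0, z): each pixel's
# phase is proportional to its optical path length, wrapped to [0, 2*pi).
path = np.sqrt(x**2 + y**2 + z**2)
phase = ((2 * np.pi / wavelength) * path) % (2 * np.pi)

print(phase.shape)  # one phase value per SLM pixel
```

A brute-force hologram sums a term like this for every scene point at every pixel, so the cost scales with pixels times points; collapsing that cost is exactly the kind of challenge the article refers to.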
Additionally, when designing AR experiences that are location-specific versus the elusive “all-day wearable,” technical advantages can be gained. A great example of this is Tilt Five, which zeroes in on a tabletop gaming experience with a graphically-rich and lauded UX.
Synthesizing all these variables, VividQ’s go-to-market strategy became clear: gaming. For one, it’s a massive market. It’s also performed up close, conferring immediate advantage to VividQ’s 3D display. And it’s usually done at home, which sidesteps the style objections that come with wearing a headset in public.
“What we’re doing neatly sidesteps a lot of the technical and practical challenges that AR faces,” said Milne. “Gaming is a good market to start with, but most approaches don’t work because of VAC, the inability to accurately world-lock, and the lack of hands-reach interaction.”
AR Training Wheels
To go deeper on target use cases, we’re talking AR companion experiences. Picture an RPG where first-person perspective is displayed on your gaming monitor as usual, while additional real estate is gained in the airspace around your monitor, showing overworld maps or item inventory.
This use case carries a few key advantages. It doesn’t force behavioral changes on users, which is never a good idea. It integrates AR in additive ways rather than substitutive ways. For example, gamers don’t have to abandon the expensive gaming monitors that they love.
This concept also applies to developers. As Qualcomm XR lead Hugo Swart asserted at AWE USA 2022, “AR-as-a-feature” is a key component of the new Snapdragon Spaces platform. It lets game developers add AR companion experiences, like the above, in low-friction ways.
One key lesson is that AR is too early and unproven to incite large-scale behavioral shifts — for developers and users. So providing AR “training wheels” can acclimate individuals… with the longer-term goal of large-scale adoption. Long-run thinking will win, as usual.
The other lesson is that VividQ’s go-to-market approach is validated by a major tech giant. Given Qualcomm’s influence and expanding share of the AR stack, its penchant for AR-as-a-feature will accelerate the concept and bring AR elements to more digital experiences.
“We want to provide a gentle ramp into AR to enhance gaming rather than replace it,” said Milne. “In this sense, your AR headset is more of a high-end peripheral, like a joystick or steering wheel. For developers, it de-risks their AR work by letting them offer pro-level gameplay options.”
On the Horizon
Speaking of go-to-market strategy, all of the above takes form in a few ways. VividQ isn’t going to manufacture AR glasses anytime soon. Instead, it’s staying focused on its core competency: software and IP licensing for its CGH approach, including concept and reference designs.
In that capacity, VividQ works with manufacturers who in turn build hardware for consumer and enterprise hardware brands. Some of its existing partners (at least those that can be named) include Arm, iView Displays, Forth Dimension Displays, and Himax Technologies.
To fully demonstrate and test the technology, VividQ has launched its Alpha program which initially offers a non-miniaturized version of its optical engine. In that lab-bench form (pictured above), partners can evaluate image quality before they adopt and miniaturize the system.
This was introduced last month at SID Display Week 2022, and VividQ plans to double down in the coming months with a binocular system. In short, that involves a similar lab-grade system but one that has two optical engines and a waveguide design that emulates an AR glasses UX.
Other aspirations on the horizon include expanding the addressable market through additional AR use cases. Though gaming is the first target market for all the reasons above, VividQ’s technology is primed for other form factors, such as heads-up displays in automotive windshields.
Through all of the above, Milne believes time and execution will get the tech exposed and the go-to-market strategy realized. This involves pushing the virtues of CGH, and VividQ’s advantages in delivering it. Given all the factors at play, you might hear the name VividQ more often.