It doesn’t have a name or a price yet, but Sony is set to produce a new spatial computing headset aimed at business users. Siemens is the only partner announced so far; together with Sony, it is showing how organizations can benefit from “immersive engineering” within the “industrial metaverse.”
That’s a lot of marketing terms at once. Whereas we previously spoke of virtual reality, mixed reality and extended reality (XR), among others, these are now summarized as “spatial computing.” It’s a term Apple introduced last year when it unveiled the Vision Pro, and it’s as good a term as any to describe the initiatives in this sector collectively.
Today, Sony and Siemens are unveiling a product for “spatial content creation,” set to be released later in 2024. The headset resembles the consumer-oriented PlayStation VR, including an ergonomic headband for comfortable use. Its application, however, is entirely different from the PSVR’s: it lets designers and engineers work together “intuitively” on design concepts in 3D.
The headset itself uses the recently announced Qualcomm Snapdragon XR2+ Gen 2. We previously knew of four other OEMs that will deploy this SoC; with Sony’s addition, every manufacturer is now known. In any case, the Sony headset will use two 4K OLED microdisplays to achieve the sharpest possible image.
Industrial metaverse
Key to Sony and Siemens’ announcement is actually the software. Siemens NX Immersive Designer will serve as an integrated solution for product engineering. Within that application, engineers will create and edit 3D designs. Realistic rendering is handled partly by the Qualcomm SoC, but it also relies on cloud resources.
Red Bull Racing was among the organizations that took to the stage; the F1 team will use the new tools. The emphasis at the Siemens keynote during CES 2024 is on collaborative work. The “industrial metaverse,” a term Siemens used often during the presentation, becomes ever more concrete as a result. For Siemens, it is a catch-all term for the deployment of virtual tools in an industrial setting. Current applications of the Microsoft HoloLens, for example, would also fit that description. Notably, that headset is used to assist in the assembly of aircraft parts, visually guiding employees to place the right bolts in the right spots, among other things.
Specifically, “immersive engineering” means that employees can adjust a design together in real time in 3D. Users reportedly don’t all have to be in the same room to do so. Red Bull Racing says the new tool could let it make design changes within a few hours instead of a few weeks. Note that these are styling choices for the RB17, a very exclusive car intended for road use; they are not actual performance-targeted adjustments to its F1 car.
Applications are certainly there, but precision seems to be a pain point
It remains to be seen how precision-focused the industrial applications will actually be. Siemens in particular seems to be presenting the Sony headset as a solution for showing off concepts, without directly adjusting technical aspects. The aforementioned styling choices on a car could thus be shown to C-suite executives in quick succession, eliminating the need for a physical scale model, for example.
The bombastic-sounding “industrial metaverse” thus harkens back to an already proven application of spatial computing. In that light, the Siemens-Sony collaboration makes a lot of sense, given their respective expertise in industrial applications and virtual reality. The new headset could thus rival the Apple Vision Pro among business users, although it remains to be seen exactly which applications that device will excel at as well. For now, there has been conspicuous silence around app development for visionOS, Apple’s headset operating system.