
Apple is already trying to get developers excited about the Vision Pro: the company is releasing an SDK today so that they can build their own apps for the headset.

Apple’s mixed-reality headset was officially unveiled during WWDC 2023, finally putting an end to the enormous amount of speculation that preceded the announcement.

All that remains now is to wait for the Vision Pro to go on sale, which is scheduled for spring 2024. Interested parties will have to dig deep into their pockets: it will cost $3,499 to bring Apple’s headset into your home.

Read more: Apple Vision Pro: will it shake up the way we work?

Getting developers excited

In the meantime, Apple is already starting a charm offensive aimed at developers: it is launching development tools to create applications for visionOS, the operating system the headset will run on.

“By taking advantage of the space around the user, spatial computing unlocks new opportunities for our developers, and enables them to imagine new ways to help their users connect, be productive, and enjoy new types of entertainment,” said Susan Prescott, vice president of global developer relations at Apple.

Developers can unleash their creativity in familiar environments they already use for other Apple platforms, including Xcode, SwiftUI, RealityKit, ARKit and TestFlight. Additionally, the new Reality Composer Pro tool will be available alongside Xcode, allowing developers to preview their 3D models, animations, images and sounds.
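To give a rough sense of what that familiar tooling looks like in practice, a minimal visionOS app can use the same SwiftUI entry point developers already know from iOS and macOS. The sketch below is illustrative only; the app and view names are made up for the example:

```swift
import SwiftUI

// Hypothetical minimal visionOS app skeleton using the
// standard SwiftUI @main entry point.
@main
struct HelloVisionApp: App {
    var body: some Scene {
        // On visionOS, a WindowGroup is presented as a
        // floating window in the user's space.
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("Hello, spatial computing")
            .font(.largeTitle)
            .padding()
    }
}
```

Because the structure is identical to SwiftUI apps on Apple's other platforms, existing SwiftUI code is a natural starting point for a Vision Pro app.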

Of course, without actual Vision Pro headsets, it remains difficult to test apps properly. Apple is trying to solve that with a simulator that shows how an app comes to life on the mixed-reality headset. Starting in July, developers can apply for a developer kit, giving them direct access to the Vision Pro. Furthermore, Apple is opening two labs in Europe (Munich and London), where the mixed-reality headsets will be ready for testing.