Comment

If you have been watching social media since the launch of Xcode 15 beta 2 with the first public beta of the visionOS SDK, you’ll have seen a bit of a rollercoaster reaction from the community.

Spatial computing in visionOS is quite different from what everyone expected from the last few years of watching ARKit develop. While bringing iPad apps and all the lessons we have learned from building them straight into the device is a massive boost for the platform, I think many people were also expecting to be able to create more than “windows in 3D space”.

I know nothing of Apple’s plans, but I’m 100% confident that what they are showing us here is a small step towards what they want visionOS to mature into. This version of visionOS runs on a headset that you’ll probably never wear outside. It’s also not just a brand new platform but a whole new environment to run apps in, one unlike anything most people have developed for.

That said, it’s telling that even some features Apple seemed to be lining up for visionOS are not part of what the Vision Pro can do. In my last comment before the keynote, I suggested that first-party apps might get more freedom with the sensors and the screen than third-party ones would, but I think this is Apple being measured with everything they ship, letting people get used to this new type of product. I’m confident that as the hardware develops and becomes more practical to wear for long periods, and in environments beyond the home, the office, or an airline seat, we’ll also see massive changes in software, taking it beyond what is possible today. I’m also confident that many of those features already exist in some prototype form inside Apple Park.

Putting computer-generated visuals in front of “reality”, whether that reality is viewed through the glass of some spectacles or via cameras in a ski-mask-type device, hasn’t hit the mainstream¹ until now. I won’t speculate on why Apple decided that this feature set was the release feature set, but I’d imagine it relates to their ability to ship something stable, power-efficient, and safe².

I’d also argue that visionOS, as presented in the keynote, session videos, and now through the beta SDK, is a bold step. It’s just another case of Apple taking our expectations and doing something different.

Dave Verwer  

¹ Both Google Glass and HoloLens found other, more specialised uses in industry, but they probably won’t become consumer devices any time soon. You may also question whether a $3,500+ device can be called mainstream, and you’d have a point, but it’s clear from how Apple pitched it that they intend it to be.

² I’m sure Apple worries about rushing into largely unknown areas, like how distracting AR will be when navigating the real world, and the potential mental health issues caused by constantly seeing things that do not exist.


News

Tools

Code

Design

Jobs

iPad Software Engineer @ Liquid Instruments – Liquid Instruments is a startup creating a range of modern test and measurement devices using reconfigurable FPGA hardware. We're looking for someone to help develop the beautiful iPad user interface that drives it all. – On-site (Australia)

Senior iOS Developer @ komoot – You’ll team up with four world-class iOS engineers and take over full responsibility for our iOS app. You’ll develop diverse features for navigation, routing, social interaction, and content visualisation that will make your work challenging and fun. – Remote (within European timezones)


Hiring? iOS Dev Jobs will list open Swift positions for free, so please feel free to post yours.


And finally...

Fix the 2D tree sprites, and I’m all in!