Subscribe for weekly commentary and coverage of Swift and Apple platform development. Written by Dave Verwer and published every Friday. Free.


Issue 643

12th January 2024

Written by Dave Verwer

Comment

Get ready! 😍

The time for wondering when “early next year” might happen is over, as Apple announced availability for its newest platform earlier this week, and it’s only three weeks away! 🎉

Does that mean you can submit apps to the store already? Yes, it does, and if you can’t wait to see what the App Store looks like on launch day, then Steve Troughton-Smith has a good Mastodon thread where he invited people to talk about their in-development visionOS apps.

It’s easy to be sceptical about whether Vision Pro will be a success, and I can’t say I don’t have some of those feelings, too! The widespread and mainstream adoption of an AR platform is a huge task from where we are now, and there’s no guarantee it’ll succeed, even with two¹ of the biggest companies in the world putting themselves behind it.

But the product that will be available in stores in a few weeks isn’t the “widespread adoption” version of this type of device. The first few iterations over the next few years are here to lay the foundations for widespread adoption. Naturally, Apple isn’t going to pitch it like that. They’ll want everyone to know this is “the best Vision Pro we’ve ever made”, which is the truth, and which builds on the previous “best AR device we’ve ever made”, which was holding up an iPhone or iPad at arm’s length.

The other half of the adoption story is software, of course, and that’s where we developers come in. If you’re submitting an app in the next week or two to debut in the visionOS App Store on day one, and if spatial computing does go mainstream, it will be partly thanks to you! 😍


¹ I’ve not used a Meta Quest 3, but I understand its AR capabilities are significantly improved over previous versions.

Dave Verwer

Join the FREE iOS Architect Crash Course

If you’re a mid/senior iOS developer looking to improve both your skills and salary level, join this free online crash course. It’s available only for a limited time, so get it now.

News

Hello visionOS Developer

I like these relatively new round-up posts that Apple has been posting. This one is (predictably) chock full of great visionOS links, so I thought I’d include the whole thing here. Make sure you don’t miss the write-up of the most popular Q&A from Vision Pro lab sessions.

Code

Reading and Writing Spatial Video with AVFoundation

What does MV-HEVC mean to you? I hadn’t heard of it, either, until I read this great post from Finn Voorhees on the additions to AVFoundation that allow reading and writing of spatial video files.
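If you just want a quick way to tell whether a file contains MV-HEVC stereo video before diving into Finn’s full reading/writing pipeline, here’s a minimal sketch. The `isSpatialVideo` helper is hypothetical, but `loadTracks(withMediaCharacteristic:)` and the `.containsStereoMultiviewVideo` characteristic are the AVFoundation additions Apple shipped for multiview video:

```swift
import AVFoundation

// Hypothetical helper: returns true if the movie at `url` contains
// an MV-HEVC stereo (spatial) video track.
func isSpatialVideo(at url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    // `.containsStereoMultiviewVideo` matches tracks carrying
    // multiple video layers (left/right eye views).
    let stereoTracks = try await asset.loadTracks(
        withMediaCharacteristic: .containsStereoMultiviewVideo
    )
    return !stereoTracks.isEmpty
}
```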


Perception: A back-port of @Observable

If you’ve been unable to adopt Observable yet due to it needing Swift 5.9 and the latest runtime environment, then Brandon Williams and Stephen Celis have something for you. 🎉
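The library’s API deliberately mirrors Observation’s. A rough sketch of how it looks in use (names from the Perception package; `CounterModel` and `CounterView` are made up for illustration):

```swift
import SwiftUI
import Perception

// `@Perceptible` stands in for `@Observable` on older runtimes.
@Perceptible
final class CounterModel {
    var count = 0
}

struct CounterView: View {
    let model: CounterModel

    var body: some View {
        // On pre-iOS 17 runtimes, wrap the body in
        // `WithPerceptionTracking` so property reads are observed
        // and the view re-renders on changes.
        WithPerceptionTracking {
            Button("Count: \(model.count)") {
                model.count += 1
            }
        }
    }
}
```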


Exploring visionOS Accessibility Gestures

I’ve not read much about accessibility in visionOS yet, so I was happy to see this article covering some accessibility-specific gestures from Rudrank Riyam. I hope to see him (or someone else!) dive deeper into this topic, as I’d like to learn more.


Calling Swift from C++ code

This might not be immediately useful, but I’m always heartened to see better interoperability between Swift and any other language! Thanks to Uli Kusterer for posting about this!
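For a flavour of what this looks like, here’s a minimal sketch under the Swift 5.9 C++ interop mode. The module name, file names, and exact compiler flags are assumptions and may vary by toolchain version:

```swift
// Greeter.swift — compiled with C++ interop enabled, e.g.:
//   swiftc -module-name Greeter -emit-library \
//     -emit-clang-header-path Greeter-Swift.h \
//     -cxx-interoperability-mode=default Greeter.swift
//
// Public Swift declarations are exposed to C++ through the
// generated header, namespaced by module name:
//
//   // main.cpp (sketch)
//   #include "Greeter-Swift.h"
//   auto greeting = Greeter::makeGreeting("C++");

public func makeGreeting(_ name: String) -> String {
    "Hello from Swift, \(name)!"
}
```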

Design

Adapting your App Icon to visionOS

It only struck me how different visionOS icons were from other Apple platforms when I saw Flora Damiano showcase them all side-by-side in her latest post. If you’ve been putting off the “Design visionOS icon” item in your task list until now, start by reading this post! 🥽

And finally...

As with most questions, the answer is it depends