The two stories in this Apple Developer News post set me thinking about how visionOS brings new interaction and UI problems to solve with shared and collaborative user interfaces.

Most of the devices we use today are designed to be operated by one person. Yes, we sometimes gather physically around a screen or in a conference room with a projector, but one person is almost always in control.

That's the same for a headset, of course, but the Vision Pro is about to move user interfaces into the real world, which is very much a shared space! If we bring AR visualisations like the ones in the examples linked above into that space, part of their benefit will be collaborating around them, but questions quickly arise when several people get involved.

Imagine walking around that model of the Formula 1 car while another person watches you walk straight through their version of the same 3D model because no headset knows where other headsets have positioned everything. Shared UI state will be critically important as soon as you have a second AR device in the same room, and even more so in a future where everyone wears lightweight AR glasses. Add animation or interaction to the visualisation, and that state synchronisation becomes even more important.

But it goes even further than that. Do you also need to share the position of the UI window that contains the parameters to control the airflow simulation? Or will that mean people are bumping into each other to make adjustments? Then, how will you choose whose inputs “win” if two people change the same parameter simultaneously? Or do people get their own controls and simulations to tweak without interrupting others? Will voice control provide a better experience instead? There are so many questions to answer and assumptions we’ll make based on how we interact with physical objects in the real world.

In the meantime, all the technology we need to build the underlying architecture for this shared state already exists. As mentioned in the PTO article, SharePlay can help, and for devices close to each other, we also have Multipeer Connectivity.
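To make that concrete, here's a minimal sketch of what syncing a shared model's transform over Multipeer Connectivity might look like. The `SharedModelState` payload and `onStateReceived` callback are hypothetical names for illustration; a real app would more likely exchange ARKit collaboration data or world anchors, and would also need peer discovery via `MCNearbyServiceAdvertiser` and `MCNearbyServiceBrowser`, which is omitted here.

```swift
import Foundation
import MultipeerConnectivity

// Hypothetical payload: where a participant has positioned the shared model.
struct SharedModelState: Codable {
    var position: SIMD3<Float>
    var rotationY: Float
}

final class ModelSyncSession: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: ProcessInfo.processInfo.hostName)
    private(set) lazy var session = MCSession(peer: peerID,
                                              securityIdentity: nil,
                                              encryptionPreference: .required)

    // Called whenever another headset sends us its version of the model state.
    var onStateReceived: ((SharedModelState, MCPeerID) -> Void)?

    override init() {
        super.init()
        session.delegate = self
    }

    // Broadcast our latest model transform to every connected peer.
    func send(_ state: SharedModelState) throws {
        guard !session.connectedPeers.isEmpty else { return }
        let data = try JSONEncoder().encode(state)
        try session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }

    // MARK: MCSessionDelegate

    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let state = try? JSONDecoder().decode(SharedModelState.self, from: data) {
            onStateReceived?(state, peerID)
        }
    }

    // Remaining delegate requirements, unused in this sketch.
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, with error: Error?) {}
}
```

Even in a sketch this small, the hard questions above show through: `.reliable` delivery says nothing about whose update wins when two peers send conflicting state at the same moment.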

It might be time to break out the Creating a Multiuser AR Experience and SwiftShot sample code projects again!

Dave Verwer  





Founder/CTO @ XLIO – An opportunity to lead the development of a greenfield project requiring deep macOS integration (this is not "just another" Swift app) which will be installed on hundreds of thousands of devices worldwide. – Remote (within US timezones) with some on-site work (United States)

iOS Developer @ KURZ Digital Solutions GmbH – Join KURZ Digital Solutions! Take the lead in developing innovative apps as an iOS developer and explore modern technologies in a dynamic team. Experience a culture of learning and creativity that combines tradition and digital innovation. – Remote (within European timezones) with some on-site work (Germany)

Senior iOS Engineer @ Luma AI – We are a small AI research and product company working on new kinds of creative tools for 3D. Our mission is to democratize the 3D experience for all. iOS at Luma is at the center of the product universe. We are growing the iOS team from one to four, please reach out if you're interested! – On-site (United States in CA) with some remote work (within US timezones)


Is your company hiring? You can post your open positions for free over at iOS Dev Jobs.


And finally...

How about a trip back in time this weekend?

At the time of this writing, the Apple ][ is an antique and MS-DOS is obsolete. However, Appler still runs in DOSBox on Windows or on a Mac. So now you can play Apple ][ games running on Appler running in DOSBox running on macOS. :)

That’s quite the stack of software, but it’s easier than trying to find a real Apple ][!