Sponsored Link
Join the FREE iOS Architect Crash Course
If you’re a mid/senior iOS developer looking to improve both your skills and salary level, join this free online crash course. It’s available only for a limited time, so get it now.
News
Black Friday Deals for macOS / iOS Software & Books
I hope everyone on the other side of the Atlantic who celebrates Thanksgiving had a great day yesterday, and what better way to celebrate Black Friday than to give some money to your local artisanal app developer or creator of developer tools? As with every year, activity over on this repository has been frantic for the last few days, resulting in more than a hundred deals available today. 🚀
Code
Variablur
Last week, I linked to Paul Hudson’s new resources on using Metal shaders in SwiftUI, and this week, I noticed a new package in the index written by Dale Price that uses Metal to apply variable blurs to SwiftUI views. I’m not saying the two things are related, but there seems to be some interest in this new SwiftUI functionality.
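To give a flavour of the SwiftUI shader APIs that packages like Variablur build on, here's a minimal sketch. The shader function name (`variableBlur`) and its parameter are hypothetical stand-ins, not Variablur's actual API:

```swift
import SwiftUI

// A sketch of applying a Metal shader to a SwiftUI view using the
// iOS 17 / macOS 14 shader APIs. "variableBlur" is a hypothetical
// function assumed to exist in the app's default Metal shader library.
struct BlurredCard: View {
    var body: some View {
        Image(systemName: "photo")
            .font(.system(size: 120))
            // layerEffect hands the view's rendered layer to the shader,
            // so it can sample neighbouring pixels -- a prerequisite for
            // any blur effect.
            .layerEffect(
                ShaderLibrary.variableBlur(.float(8)), // hypothetical shader
                maxSampleOffset: CGSize(width: 8, height: 8)
            )
    }
}
```

The `maxSampleOffset` parameter tells SwiftUI how far outside the layer's bounds the shader may sample, which matters for blurs that reach beyond the view's edges.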
How to migrate to a new schema with SwiftData in iOS
SwiftData might be all fancy, shiny, and new, but that doesn’t mean you don’t need to deal with schema migrations! Luckily, Natascha Fadeeva is here to lead us through it.
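For context, the shape of a SwiftData migration looks roughly like this. The model, versions, and constraint below are illustrative, not taken from Natascha's article:

```swift
import SwiftData

// A sketch of a SwiftData migration plan between two versioned schemas.
// The Trip model and the unique-name constraint are invented examples.
enum SchemaV1: VersionedSchema {
    static var versionIdentifier: Schema.Version { .init(1, 0, 0) }
    static var models: [any PersistentModel.Type] { [Trip.self] }

    @Model
    final class Trip {
        var name: String
        init(name: String) { self.name = name }
    }
}

enum SchemaV2: VersionedSchema {
    static var versionIdentifier: Schema.Version { .init(2, 0, 0) }
    static var models: [any PersistentModel.Type] { [Trip.self] }

    @Model
    final class Trip {
        @Attribute(.unique) var name: String // new constraint in V2
        init(name: String) { self.name = name }
    }
}

enum TripMigrationPlan: SchemaMigrationPlan {
    static var schemas: [any VersionedSchema.Type] { [SchemaV1.self, SchemaV2.self] }

    static var stages: [MigrationStage] {
        [
            // A custom stage lets you clean up data (e.g. deduplicate names)
            // before the new unique constraint is enforced; a .lightweight
            // stage would fail if duplicates exist.
            .custom(
                fromVersion: SchemaV1.self,
                toVersion: SchemaV2.self,
                willMigrate: { context in
                    // Deduplicate Trip names here, then save.
                    try context.save()
                },
                didMigrate: nil
            )
        ]
    }
}

// Pass the plan when creating the container:
// let container = try ModelContainer(
//     for: SchemaV2.Trip.self,
//     migrationPlan: TripMigrationPlan.self
// )
```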
How to customize the macOS menu bar in SwiftUI
The first time I tried to change the menu in a SwiftUI app, the APIs felt a little restrictive, but the more I used them, the more I appreciated how they guide developers towards platform conventions. I’m a fan, and I’d presume that Daniel Saidi is, too, because he wrote this comprehensive look at how to do almost everything you might want with a SwiftUI app’s menus!
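As a taster of the commands API Daniel's article covers, here's a minimal sketch; the app, menu, and action names are placeholders:

```swift
import SwiftUI

// A sketch of customising the macOS menu bar with SwiftUI's commands API.
@main
struct NotesApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            // Add a whole new top-level menu.
            CommandMenu("Notes") {
                Button("New Note") { /* create a note */ }
                    .keyboardShortcut("n", modifiers: [.command, .shift])
            }
            // Replace one of the standard command groups in an existing menu.
            CommandGroup(replacing: .help) {
                Button("Notes Help") { /* show help */ }
            }
        }
    }
}

struct ContentView: View {
    var body: some View { Text("Hello") }
}
```

`CommandGroup` can also insert items before or after a standard group rather than replacing it, which is how the API nudges you towards the platform's menu conventions.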
Using Observation framework outside of SwiftUI
You won’t find many examples of using the new Observation framework with UIKit or outside any UI framework. That doesn’t mean it’s impossible, though, as Natalia Panferova demonstrates in this post.
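The core of the technique is `withObservationTracking`, which works with no UI framework at all. A minimal sketch, with an illustrative `Counter` type (not from Natalia's post):

```swift
import Observation

// Observing an @Observable object outside SwiftUI. Any property read
// inside the first closure is registered as a dependency; onChange fires
// (once) when one of those properties changes.
@Observable
final class Counter {
    var value = 0
}

// onChange is a @Sendable closure, so use a small reference box rather
// than capturing a mutable local variable.
final class Flag: @unchecked Sendable {
    var fired = false
}

let counter = Counter()
let flag = Flag()

withObservationTracking {
    _ = counter.value  // reading registers `value` as a tracked dependency
} onChange: {
    flag.fired = true
    // Tracking is one-shot: call withObservationTracking again here
    // if you want continuous observation.
}

counter.value = 1  // onChange runs synchronously, on willSet
```

Note the one-shot behaviour: unlike KVO or Combine, each call to `withObservationTracking` observes exactly one change, so continuous observation means re-registering from inside `onChange`.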
Videos
Videos from SwiftLeeds 2023
Start this weekend right by queuing up some of the great talks from this year’s SwiftLeeds conference. With 15 sessions to pick from, you’ll undoubtedly find something that appeals.
Jobs
Founder/CTO @ XLIO – An opportunity to lead the development of a greenfield project requiring deep macOS integration (this is not "just another" Swift app) which will be installed on hundreds of thousands of devices worldwide. – Remote (within US timezones) with some on-site work (United States)
iOS Developer @ KURZ Digital Solutions GmbH – Join KURZ Digital Solutions! Take the lead in developing innovative apps as an iOS developer and explore modern technologies in a dynamic team. Experience a culture of learning and creativity that combines tradition and digital innovation. – Remote (within European timezones) with some on-site work (Germany)
Senior iOS Engineer @ Luma AI – We are a small AI research and product company working on new kinds of creative tools for 3D. Our mission is to democratize the 3D experience for all. iOS at Luma is at the center of the product universe. We are growing the iOS team from 1 to 4, please reach out if you're interested! – On-site (California, United States) with some remote work (within US timezones)
Is your company hiring? You can post your open positions for free over at iOS Dev Jobs.
And finally...
How about a trip back in time this weekend?
At the time of this writing, the Apple ][ is an antique and MS-DOS is obsolete. However, Appler still runs on DOSBox on Windows or on a Mac. So now you can play Apple ][ games running on Appler running on DOSBox running on macOS. :)
That’s quite the stack of software, but it’s easier than trying to find a real Apple ][!
Comment
The two stories in this Apple Developer News post set me thinking about how visionOS brings new interaction and UI problems to solve with shared and collaborative user interfaces.
Most of the devices we use today are designed to be operated by one person. Yes, we sometimes gather physically around a screen or in a conference room with a projector, but one person is almost always in control.
That's the same for a headset, of course, but the Vision Pro is about to move user interfaces into the real world, which is very much a shared space! If we bring AR visualisations like the ones in the examples linked above into our physical spaces, much of their benefit will come from collaborating around them, and questions quickly arise as soon as several people get involved.
Imagine walking around that model of the Formula 1 car while another person watches you walk straight through their version of the same 3D model because no headset knows where other headsets have positioned everything. Shared UI state will be critically important as soon as you have a second AR device in the same room, and even more so in a future where everyone wears lightweight AR glasses. Add animation or interaction to the visualisation, and that state synchronisation becomes even more important.
But it goes even further than that. Do you also need to share the position of the UI window that contains the parameters to control the airflow simulation? Or will that mean people are bumping into each other to make adjustments? Then, how will you choose whose inputs “win” if two people change the same parameter simultaneously? Or do people get their own controls and simulations to tweak without interrupting others? Will voice control provide a better experience instead? There are so many questions to answer and assumptions we’ll make based on how we interact with physical objects in the real world.
In the meantime, all the technology we need to make the underlying architecture for this shared state already exists. As mentioned in the PTO article, SharePlay can help, and for devices close to each other, we also have Multipeer Connectivity.
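To make that concrete, here's a rough sketch (my own, not from either article) of how Multipeer Connectivity could carry shared anchor state between nearby devices. The message type and its fields are invented for illustration:

```swift
import Foundation
import MultipeerConnectivity

// An invented message type describing where one device has placed an anchor.
struct SharedAnchorUpdate: Codable {
    let anchorID: UUID
    let transform: [Float]  // a flattened 4x4 transform matrix
}

final class SharedStateSession: NSObject, MCSessionDelegate {
    private let peerID = MCPeerID(displayName: ProcessInfo.processInfo.hostName)
    private lazy var session: MCSession = {
        let s = MCSession(peer: peerID,
                          securityIdentity: nil,
                          encryptionPreference: .required)
        s.delegate = self
        return s
    }()

    // Broadcast a local anchor change to every connected peer.
    func broadcast(_ update: SharedAnchorUpdate) throws {
        let data = try JSONEncoder().encode(update)
        // .reliable delivery keeps everyone's scene state consistent.
        try session.send(data, toPeers: session.connectedPeers, with: .reliable)
    }

    // MARK: MCSessionDelegate
    func session(_ session: MCSession, didReceive data: Data, fromPeer peerID: MCPeerID) {
        if let update = try? JSONDecoder().decode(SharedAnchorUpdate.self, from: data) {
            // Apply the remote transform to the matching local anchor here.
            _ = update
        }
    }
    func session(_ session: MCSession, peer peerID: MCPeerID, didChange state: MCSessionState) {}
    func session(_ session: MCSession, didReceive stream: InputStream, withName streamName: String, fromPeer peerID: MCPeerID) {}
    func session(_ session: MCSession, didStartReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, with progress: Progress) {}
    func session(_ session: MCSession, didFinishReceivingResourceWithName resourceName: String, fromPeer peerID: MCPeerID, at localURL: URL?, withError error: Error?) {}
}
```

Peer discovery (via `MCNearbyServiceAdvertiser` and `MCNearbyServiceBrowser`) is omitted here, and a real shared AR session would also need conflict resolution when two peers move the same anchor, which is exactly the kind of question the paragraph above raises.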
It might be time to break out the Creating a Multiuser AR Experience and SwiftShot sample code projects again!
Dave Verwer