Comment

Did you see Apple announce door detection and other iOS 16 feature previews for Global Accessibility Awareness Day a few days ago? Don’t you love living in the future? I know I do. 🥰

Of course, Apple didn’t say it, but as impressive as door detection is on a phone or tablet, it becomes a life-changing feature inside AR glasses.

It got me thinking about a topic I haven’t written about before, or even seen get much public discussion: how will apps in an AR operating system share the field of view?

The first, somewhat obvious way is via a notification system. We’ve seen this with Google Glass and other projects, and it’s safe to say this will be a feature of any AR software platform.

But what about apps that want to composite information over the top of your field of view? That needs a completely different model from the one we have on iOS-based devices. It makes no sense for an app to be able to cover the entire field of view, and no sense for apps to be “brought to the front” in the way you interact with apps on iOS. Imagine if you had to launch a DoorDetector app before it would start monitoring, especially if you were following AR directions with the Maps app to find the building with the door.

So, background processing of sensor data will be critical. But how many apps will be allowed to process it simultaneously, and how on earth would you deal with the privacy implications of keeping users informed about which apps are watching which sensors?
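
To make that slightly more concrete, here’s a rough thought experiment in plain Swift. Every type and name below is made up (the sensors, the ledger, and DoorDetector itself are placeholders, not anything Apple has shipped or announced); it only sketches the bookkeeping an AR OS would need before it could honestly tell you which apps are watching which sensors.

```swift
import Foundation

// Entirely hypothetical types; nothing here is a real framework API.
enum ARSensor: String {
    case worldCamera, lidar, microphone, eyeTracking
}

struct SensorGrant {
    let appName: String
    let sensor: ARSensor
    let grantedAt: Date
}

// The minimum the OS would need to track so it can tell the user,
// at a glance, which apps are currently watching which sensors.
final class SensorLedger {
    private(set) var activeGrants: [SensorGrant] = []

    func begin(_ sensor: ARSensor, for appName: String) {
        activeGrants.append(SensorGrant(appName: appName, sensor: sensor, grantedAt: Date()))
    }

    func end(_ sensor: ARSensor, for appName: String) {
        activeGrants.removeAll { $0.appName == appName && $0.sensor == sensor }
    }

    // The awkward part: this summary has to be surfaced somewhere in the
    // same field of view that apps are also drawing into.
    func summary() -> String {
        Dictionary(grouping: activeGrants, by: \.sensor)
            .map { "\($0.key.rawValue): \($0.value.map(\.appName).joined(separator: ", "))" }
            .sorted()
            .joined(separator: "\n")
    }
}

let ledger = SensorLedger()
ledger.begin(.worldCamera, for: "DoorDetector")
ledger.begin(.worldCamera, for: "Maps")
ledger.begin(.lidar, for: "DoorDetector")
print(ledger.summary())
// lidar: DoorDetector
// worldCamera: DoorDetector, Maps
```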

Then, let’s think about how those apps get to draw into the field of view. Again, there is so much potential for chaos. How many apps can draw simultaneously? How much of the field of view is each app allowed to cover? If multiple apps are drawing, does the operating system need to monitor the total coverage? I could spend days listing the mind-blowingly complex problems Apple would need to overcome!
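
Purely as a sketch of that last question, and again with completely hypothetical types: imagine the OS handing each app a slice of a “coverage budget”, with a per-app cap and a system-wide cap on how much of the (normalised) field of view can be drawn over. Even this toy version, which ignores overlapping rectangles entirely, hints at how hairy the real compositing rules would get.

```swift
import Foundation

// Hypothetical model: the field of view as a unit rectangle, with limits
// on how much of it any one app, and all apps combined, may cover.
struct FieldOfViewBudget {
    let perAppLimit: CGFloat    // e.g. no single app may cover more than 15%
    let totalLimit: CGFloat     // e.g. all apps combined stay under 40%

    // Each app requests overlays as rects in unit field-of-view coordinates.
    // Returns the apps whose requests would blow the budget.
    func rejectedApps(for requests: [String: [CGRect]]) -> [String] {
        var rejected: [String] = []
        var total: CGFloat = 0

        for (app, rects) in requests {
            // Rough per-app coverage: sum of areas. A real compositor would
            // also have to deal with overlaps, depth, and transparency.
            let coverage = rects.reduce(CGFloat(0)) { $0 + ($1.width * $1.height) }
            if coverage > perAppLimit || total + coverage > totalLimit {
                rejected.append(app)
            } else {
                total += coverage
            }
        }
        return rejected
    }
}

let budget = FieldOfViewBudget(perAppLimit: 0.15, totalLimit: 0.40)
let overBudget = budget.rejectedApps(for: [
    "DoorDetector": [CGRect(x: 0.4, y: 0.3, width: 0.2, height: 0.2)], // 4% of the view
    "Maps":         [CGRect(x: 0.0, y: 0.8, width: 0.9, height: 0.2)], // 18%, over the per-app cap
])
print(overBudget) // ["Maps"]
```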

Also, everything I’ve talked about here is software-related, and it doesn’t even consider that every sensor being monitored drains power from a battery that can only be so big when it’s attached to your head!

On one hand, I love these problems, as I’ve never had to consider anything like this before, and that’s exciting. On the other, they are so challenging that the first several iterations of anything that ships to the public will likely approach third-party apps exceptionally cautiously. I expect plenty of Apple-only entitlements and slow progress for third-party apps in the first few years.

My gut feeling still says that this is not the year for a big AR announcement, and I still don’t see any signs of Apple being interested in VR, despite plenty of rumours. The most likely exception would be if the VR/mixed-reality product is a developer kit, but I still don’t buy it.

Maybe something in 2023?

Dave Verwer  

News


Tools


Code




Videos


And finally...

What platform should you write your next game for?