

Issue 723

22nd August 2025

Written by Dave Verwer

Comment

It’s always a good idea to listen carefully to what Apple doesn’t say when they introduce a major new software feature at WWDC.

With that thought in mind, this week Craig Hockenberry wrote a great post titled “Liquid Glass. Why?” and in it, he says:

I’m unaware of anyone outside of Apple who’s thinking “we really need to have more fluid glass in our designs”. Of particular note during the introduction is how much time they spend showing off glass blocks and talking about the physical effect itself. While not addressing the most important question: “why do we need this?”

I’ve been asking myself this question, too. I installed the betas on my primary iOS devices a few weeks ago, and while I don’t hate Liquid Glass, I’m struggling to love it. It’s fun to look at, but there’s no question in my mind that it’s a step backwards for readability.

But Craig’s point isn’t about readability. It’s about a few specifics of the new design system and who needs them:

And I’m pretty sure the answer is “we don’t”. The answer is “Apple does.”

and:

It’s like when safe area insets appeared in iOS 11: it wasn’t clear why you needed them until the iPhone X came along with a notch and a home indicator. And then it changed everything.

There has also been an emphasis on “concentricity”. It’s an impossible thing to achieve and an easy target for ridicule. But it’s another case where Apple wants to take control of the UI elements that intersect with the physical hardware.

All of this makes me think that Apple is close to introducing devices where the screen disappears seamlessly into the physical edge. Something where flexible OLED blurs the distinction between pixels and bezel. A new “wraparound” screen with safe area insets on the vertical edges of the device, just like we saw with the horizontal edges on iPhone X.
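
If that’s where things are heading, the reassuring part is that layouts which already respect the safe area should adapt on their own, just as they did when the notch arrived. Here’s a minimal SwiftUI sketch of the pattern, using nothing beyond today’s standard APIs and nothing specific to any unannounced hardware: keep content inside the safe area, and only let decorative layers extend underneath it.

```swift
import SwiftUI

// A layout that adapts on its own if future hardware adds safe area insets
// on the leading and trailing edges, the same way notched iPhones added
// them at the top and bottom.
struct WraparoundReadyView: View {
    var body: some View {
        ScrollView {
            Text("Content stays inside the safe area.")
                .padding()
        }
        .background {
            // Only decorative layers should extend under the insets.
            LinearGradient(colors: [.indigo, .purple],
                           startPoint: .top, endPoint: .bottom)
                .ignoresSafeArea()
        }
    }
}
```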

The user interface work of the past few months will all make a lot more sense, and developers who haven’t been paying attention will have their “holy shit” moment.

Auto Layout was similar. It arrived at WWDC 2012, just a few months before the first iPhone with a screen taller than 480 points shipped.

Is Liquid Glass the latest example of Apple carefully not saying something about this year’s devices? Will it all make sense when we see the full picture of hardware and software together, or will we still be asking if the trade-off in readability was worth it?

– Dave Verwer

RevenueCat’s Shipaton is underway!

The ultimate mobile developer hackathon is back - and it’s bigger than ever. With over $350,000 in prizes, a chance to see your app featured on a massive Times Square billboard, and the support of the global #Shipaton community, there’s never been a better time to get shipping. Learn more.

News

UICoder: Fine-tuning Large Language Models to Generate User Interface Code through Automated Feedback

I remember being amazed when I read Simon Willison’s 2024 LLM wrap-up post, where he reported that training LLMs on generated content works well:

The idea is seductive: as the internet floods with AI-generated slop the models themselves will degenerate, feeding on their own output in a way that leads to their inevitable demise!

That’s clearly not happening. Instead, we are seeing AI labs increasingly train on synthetic content—deliberately creating artificial data to help steer their models in the right way.

It looks like this technique is at the root of the UICoder research paper from Apple.

Code

Refractive and Laminated Glass effects in Metal

After talking so much about Liquid Glass this week, it seems fitting to highlight two of Victor Baro’s recent articles. First, in Implementing a Refractive Glass Shader in Metal, he explains why simulating the visual properties of glass is difficult, only to then go and explain how to do it. Then, in Recreating a Laminated Glass effect, he takes it a step further.


Corner Concentricity in SwiftUI on iOS 26

We might as well keep the references to my opening comment going with a link to Natalia Panferova’s latest post on concentricity. It sounds like it’ll be important to get this one right come September, which is rapidly approaching.
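
If you haven’t dug into what “concentric” actually means in practice, the core rule is simple: when one rounded shape sits inside another, the inner corner radius should be the outer radius minus the inset between them, so the two curves share a centre. Here’s a minimal sketch of that rule using only standard SwiftUI and illustrative numbers, rather than anything from Natalia’s post:

```swift
import SwiftUI

// Concentric corners: the inner radius is the outer radius minus the inset
// between the two shapes, so both arcs curve around the same centre point.
struct ConcentricCardView: View {
    private let outerRadius: CGFloat = 32
    private let inset: CGFloat = 12

    var body: some View {
        RoundedRectangle(cornerRadius: outerRadius - inset)
            .fill(.blue)
            .padding(inset)
            .background {
                RoundedRectangle(cornerRadius: outerRadius)
                    .fill(.gray.opacity(0.3))
            }
            .frame(width: 220, height: 120)
    }
}
```

Natalia’s post covers how the new iOS 26 APIs are designed to take care of this for you, so it’s worth a read before hard-coding radii like this.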


Swift Raw Identifiers

New in Swift 6.2, this is a language feature you’ll want to be a little bit careful with. As Keith Harrison explains, it’s going to be a great step forward for swift-testing test names, but I can see it being overused if you let things slip. 😬
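
Here’s a minimal sketch of the swift-testing use case Keith highlights; pluralised is a made-up function, purely so there’s something to assert against:

```swift
import Testing

// A hypothetical helper, purely for illustration.
func pluralised(_ count: Int, _ noun: String) -> String {
    count == 1 ? "\(count) \(noun)" : "\(count) \(noun)s"
}

// A Swift 6.2 raw identifier lets the test's name read like a sentence,
// and swift-testing shows it verbatim in the results.
@Test func `pluralises a count of zero`() {
    #expect(pluralised(0, "file") == "0 files")
}
```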

Business and Marketing

Discover TestFlight apps on Indie App Catalog

Are you looking for beta testers for your app? Miká Kruschel writes about The Indie App Catalog’s new feature: TestFlight support. You can publicise your invite, and the site will periodically check availability on your beta so visitors know if it’s full. It’s a nice addition to a well-designed site.

Jobs

Senior iOS Developer (Contract) @ Happy Scale – Contribute to an indie app that empowers people to make positive changes in their own lives! You will help the founder by writing excellent Swift/UIKit code that will add features and refine this user-beloved app. This opportunity is fully remote and almost completely async (occasional meetings). – Remote (within US timezones)

Founding Senior Mobile Engineer @ Neon – Lead mobile innovation at Neon, a fast-growing startup building a privacy-first app that lets users earn from their phone calls. Shape products with real-world impact in a creative, mission-driven environment. – On-site (United States in NY) with some remote work (within US timezones)

And finally...

What’s in that little box? Oh, nothing, just a Vapor web server. 🍓

(Oh, and I’ll +1 the recommendation of Cloudflare tunnels. 👍👍👍)