# 30,000 lines of SwiftUI in production later: We love it but you know there was going to be a “but”

It took a few hours to fall in love with SwiftUI. So much so that we instantly decided to abandon a cross-platform codebase and go fully native on iOS. timing.is, built entirely in SwiftUI, shipped this month on the App Store. Its development took several months. It would have taken less if SwiftUI just gave. Unfortunately, it would repeatedly take away.

Can confirm: SwiftUI is nice and easy when it works, but in general, it’s a giant pain. It’s embarrassing for Apple to release a framework in such a half-baked state.

It was fun at least 51% of the time. But let’s talk about the <= 49% that wasn’t.

Imagine writing lots of code that works well, and then getting stuck for a few days on a tiny thing that just doesn’t want to work. That’s the SwiftUI experience.

# Stable Diffusion with Core ML on Apple Silicon

Today, we are excited to release optimizations to Core ML for Stable Diffusion in macOS 13.1 and iOS 16.2, along with code to get started with deploying to Apple Silicon devices.

The repository contains Python and Swift packages. With the latter, you can add Stable Diffusion functionality to your iOS/Mac apps.
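As a rough illustration of what using the Swift package looks like, here is a minimal sketch of loading a pipeline and generating an image. This assumes the `StableDiffusion` module and the `StableDiffusionPipeline` type from Apple's ml-stable-diffusion repository; the exact initializer and parameter names (`resourcesAt:`, `stepCount:`, `seed:`) are based on the published package and may differ across versions, and the resource path is a placeholder.

```swift
import CoreML
import StableDiffusion  // Swift package from Apple's ml-stable-diffusion repo

// Placeholder path: a directory containing the compiled Core ML
// model resources produced by the repository's Python conversion tools.
let resourcesURL = URL(fileURLWithPath: "/path/to/CoreML/resources")

// Create the pipeline from the on-disk resources and load the models.
// (Initializer/parameter names are assumptions from the package README.)
let pipeline = try StableDiffusionPipeline(resourcesAt: resourcesURL)
try pipeline.loadResources()

// Generate one image; 50 denoising steps matches the benchmark
// mentioned in the announcement. A fixed seed makes runs repeatable.
let images = try pipeline.generateImages(
    prompt: "an astronaut riding a horse, photorealistic",
    stepCount: 50,
    seed: 42
)
// `images` is an array of optional CGImages, one per requested image.
```

Note that generation runs synchronously and is compute-heavy, so in a real app you would call it off the main thread and surface progress to the user.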

Users report that generating an image with 50 iterations, which previously took about 3 minutes, now takes around 30 seconds on M1 Macs.