
WWDC 2024 @ Mill One: Our Developer Community’s Take on What’s Next for iOS

WWDC has always been a big deal at WillowTree. Viewing parties, snacks, snarky threads on Slack, and a few lucky folks ending up at the event IRL — it’s one of the most exciting times of the year for our iOS app developers and designers.

Neon Trees playing at the 2012 WWDC Bash, installation of scrolling App Store icons at WWDC 2012, WWDC signage at the Moscone Center in San Francisco

WillowTree has grown a lot since I started in 2011. We now have 18 global studios, a Work From Anywhere cohort, and folks across multiple continents and time zones. This year, we decided to do something different and more intentional: bring our iOS developers together in one location to celebrate and learn together. Getting together “off-screen” really showed off the magic of strong platform communities.

Our iOS All-Stars gathered in Durham's Mill One location for WWDC 2024

In addition to watching the Keynote and Platform State of the Union together, WillowTree presenters shared exclusive sessions with tutorials and case studies from our own client work: how we built a visionOS application for a leading financial services firm, how to write beautiful, idiomatic SwiftUI code that feels right at home alongside Apple’s, and SwiftUI accessibility tips and tricks to make our clients’ applications just as inclusive as iOS itself. It’s a real power move to take over a WillowTree location and book four conference rooms round the clock for a whole week to watch WWDC sessions — but oh, you know we did!

Watching, discussing, and presenting is all very exciting, but my personal favorite part of WWDC is the race to download Apple betas and start exploring and hacking away with the new APIs to see how they fly. We’re jazzed about all the new nerd stuff we got, and some of the new user features might just convince us to install that Day 1 developer beta on our daily drivers.

After playing with this stuff for a week, and gathering input and feedback from the entire team, we have some thoughts (my colleague Ellie Giles recently shared her Executive Insights from WWDC, definitely check that out too). First, I’ll give you a quick take on the biggest updates and releases that our web, mobile, and Vision Pro clients need to know about. Below that, I’ll fully nerd out for my iOS developers out there, offering our collective review of some new Apple SDKs and developer tools. Let’s go!

Peter Kos schooling us on writing beautiful idiomatic SwiftUI, Matt Jones giving us a demo of our first official client AVP app, and Kat Marsh Griffin sharing tips and tricks on making accessible SwiftUI apps for everyone

Some Big Ways WWDC 2024 Updates Will Impact Your Applications Now

Apple Intelligence

Wow. It’s not in the developer beta yet but these Apple Intelligence features look incredible. AI tools have been a bit of a secret power for those willing to dive into the tech. Get ready for it to come to the masses in its most polished state yet.

Here’s where the friction has been to date: AI features can feel disjointed, constantly forcing you to shift between multiple tools and UIs to copy context and content back and forth between applications. Now, Apple’s vision of integrating all that AI functionality — combining it with whatever I’m working on and essentially baking it in system-wide — has me wishing it were already Fall.

I’m probably the #1 ChatGPT fanboy, and I spent a good portion of the last year adding seamless ChatGPT functionality to our augmentative and alternative communication app Vocable AAC, one of the first apps available for the Apple Vision Pro. Apple is taking a similar, obvious next step: leveraging your personal data and context and connecting it to your favorite mobile apps and their actions through App Intents (this is going to be huge; more on that soon), with the ability to go off-device and call out to ChatGPT when the on-device intelligence isn’t quite powerful enough for a given task. And making it all easily accessible through Siri, Type to Siri, and other multimodal UI means we’re about to see an even broader explosion in the adoption of AI functionality across our devices and existing mobile applications.

There’s something potentially a little scary about getting these tools in the hands of WAY more people, but the optimist in me thinks everyone getting more exposure and education on what’s possible today gives folks better radar for what’s “real” and what’s machine-generated.

Also, Apple’s focus on privacy both on-device and off-device makes me feel confident about where things are headed. Cupertino looks to be taking a measured approach, making sure AI isn’t “The Whole Feature,” but rather supports, augments, and improves already solid digital experiences. Apple’s partnership with OpenAI looks equally measured: during the keynote, whenever ChatGPT features were used there was a clear “Hey, you’re leaving AppleLand now, are you sure you want to do this?” vibe.

We’ll see how it materializes once these features hit the betas and GM releases, but WillowTree’s iOS developer community overwhelmingly agreed that this functionality could fundamentally change how users engage with all their mobile apps and devices, especially early adopters like the QSR, retail, travel/hospitality industries, and everyday weather/traffic apps.

Home Screen & Control Center Customization

I’m always a fan of software updates that make a device feel new again, and this week Apple gave us a bunch of design and functionality changes that do just that.

Control Center looks and feels better than ever, including a cool new UI for adding and arranging custom controls. That’s a big deal considering our apps can now provide controls that appear in Control Center, on the Lock Screen, and on the Action Button – lots more real estate and entry points into a digital experience, and our clients’ apps now have the opportunity to be far more integrated into the overall Apple ecosystem.
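Here’s roughly what that looks like in code. Below is a minimal sketch of an iOS 18 control, assuming a hypothetical “unlock the front door” action; the intent, kind string, and symbol are all made up for illustration.

```swift
import AppIntents
import SwiftUI
import WidgetKit

// Hypothetical example: a control that users can place in Control Center,
// on the Lock Screen, or on the Action Button.
struct UnlockDoorControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.unlock-door") {
            ControlWidgetButton(action: UnlockDoorIntent()) {
                Label("Unlock Door", systemImage: "lock.open")
            }
        }
    }
}

// The control is backed by an App Intent, so the same action is also
// available to Siri and Shortcuts.
struct UnlockDoorIntent: AppIntent {
    static let title: LocalizedStringResource = "Unlock Door"

    func perform() async throws -> some IntentResult {
        // A real app would call its unlocking logic here.
        return .result()
    }
}
```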

We’ll touch more on this when we cover App Intents in a forthcoming article, but we’re rapidly entering a digital world where applications aren’t used in isolation; instead, they’re intelligently strung together into a pipeline of actions and functionality from multiple apps to accomplish the task at hand.

Think the Unix Philosophy, but for applications, powered by natural language and LLMs. Or the Rabbit R1, but, you know, one that actually works. These new entry points into our applications could be the start of that future, and we want to make sure our clients’ apps go along for that ride instead of getting left behind.
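To make that concrete, here’s a minimal sketch of an App Intent (a hypothetical “order a coffee” action with a made-up parameter and no real ordering service behind it) that exposes a piece of app functionality to Siri and Shortcuts so the system can chain it with other apps’ actions.

```swift
import AppIntents

// Hypothetical example: expose "order a coffee" as an action the system can
// run from Siri, Shortcuts, and (eventually) Apple Intelligence.
struct OrderCoffeeIntent: AppIntent {
    static let title: LocalizedStringResource = "Order Coffee"

    @Parameter(title: "Size", default: "medium")
    var size: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real implementation would call into the app's ordering service here.
        return .result(dialog: "Ordered a \(size) coffee.")
    }
}
```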

New UI Patterns & Delight Features

You don’t get into Apple products without also getting into great design, and we have some really fun new stuff this year. The new Tab View on iPadOS looks and feels amazing — the more distance we put between ourselves and the hamburger menu the better. The new interactions when resizing controls and widgets are super slick. We’ve seen the Sports app and now Photos say goodbye to the tab bar altogether, bringing us new UX ideas we can draw from when designing our clients’ apps.

Also on the design side: larger app icons with no titles make the home screen look simply gorgeous. Dark Mode icons look AMAZING. Tinted icons look… well, somebody out there will be into that!

Oh, and my personal favorite: you want to know what Apple means when they talk about delight features? Get that iOS 18 beta on your device (🙈) and press the volume buttons! Pressing the side buttons makes the bezel “squeeze” inward. Is that my favorite new interaction? No, of course not; I’m just running around the office showing it to every single person for fun!

Don’t forget the Mac!

The iPhone is the biggie, but macOS wasn’t left out this year, seeing as we can now wirelessly use our iPhones right from the Mac with iPhone Mirroring 😏

Seriously though, little quality-of-life features that lower friction throughout the day are big for me, and we got quite a few. Window tiling will make it even easier to copy that code from Stack Overflow... er, I mean ChatGPT... er, I mean, to view documentation right beside my code! More of the system-level video conferencing features I’ve come to love using instead of 100 different versions across 100 different apps. And all my notifications in one spot, working harmoniously with StandBy Mode!

Calculator on iPadOS

We did it! But for real, whoa! Math Notes was one of the most impressive tech demos, showcasing many AI/ML capabilities packaged into an app we’ve been waiting on for some time. We can expect classroom usage where both teachers and students learn visually and interactively. And the ability to edit handwritten notes and have them reflow in your own handwriting bridges the gap between the ease of editing and searching electronic notes and the natural feel of writing acoustic notes with pen and paper.

What Else?

If you’re an Apple nerd and need to know about every little detail of the new features we’re looking forward to later this year, including what changes as the betas roll out, keep MacRumors in your daily rotation and you’ll be as up to date as us!

Ok, What About the Nerd Stuff™?

This year feels like one of the biggest WWDCs since SwiftUI itself. I’m going to share some of what our team has been playing with that we think will most impact our clients and development practices. If you’re looking to dig into topics most relevant to you and your applications, I’d highly recommend the Apple Developer application available across every Apple platform. Well, except for the Apple Watch (at this point, why not collect them all?). Maybe next year!

Swift 6

Data-race Safety

The new Swift 6 language mode enables us to ensure we don’t have any data races in our applications at compile time. This is HUGE. Data-race safety has been a primary goal of Swift Concurrency throughout its evolution, and we can now start to migrate our modules incrementally towards Swift 6 language mode to gain this benefit. We’re very happy that Apple has implemented this in a way that lets us migrate our applications’ modules one at a time. It’s ok if some dependencies or modules don’t yet support this feature; it shouldn’t be as bad as a Swift 2 to Swift 3 migration, and our initial experimentation with sample codebases was promising.
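As a rough sketch of what incremental adoption can look like in a package manifest (hypothetical package and target names; per-target language modes come with the Swift 6 toolchain), one module opts into the new mode while another stays on Swift 5 until it’s ready:

```swift
// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "MyApp",
    targets: [
        // Already migrated: full compile-time data-race checking.
        .target(
            name: "Networking",
            swiftSettings: [.swiftLanguageMode(.v6)]
        ),
        // Not yet migrated: stays in the Swift 5 language mode for now.
        .target(
            name: "LegacyFeature",
            swiftSettings: [.swiftLanguageMode(.v5)]
        ),
    ]
)
```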

Why are we so excited about this?

Data races occur when one thread accesses a mutable object while another thread is writing to it. Consider a bank account: if we have two threads both attempting to read the current balance and perform a transfer we may end up overdrawing. Swift 6 allows us to fix this without entering the cursed manual locking mechanisms zone.
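Here’s a minimal sketch of that bank account using Swift Concurrency (the type and amounts are hypothetical): the actor serializes access, and the Swift 6 language mode flags any remaining unsynchronized shared mutable state at compile time.

```swift
import Foundation

// An actor guarantees that only one task at a time touches its state.
actor BankAccount {
    private(set) var balance: Decimal

    init(balance: Decimal) {
        self.balance = balance
    }

    enum TransferError: Error { case insufficientFunds }

    func withdraw(_ amount: Decimal) throws {
        guard balance >= amount else { throw TransferError.insufficientFunds }
        balance -= amount
    }
}

// Two concurrent withdrawals are serialized by the actor, so the
// check-then-subtract sequence can never interleave and overdraw the account.
func makeConcurrentWithdrawals(from account: BankAccount) async {
    await withTaskGroup(of: Void.self) { group in
        group.addTask { _ = try? await account.withdraw(50) }
        group.addTask { _ = try? await account.withdraw(75) }
    }
}
```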

Just as null-dereference crashes became far less common once Swift gave us Optionals, we’ll now have compile-time data-race safety, eliminating an especially tricky class of bug.

Resistance is futile. You will be assimilated.

Swift is supported on Apple platforms, Linux (now including Fedora and Debian!), Windows, embedded systems, and WebAssembly, and the server-side ecosystem is going strong (including Vapor!). New this year is the ability to compile a Swift application as a static Linux binary, which can run on systems without a Swift runtime installed.

Why are we so excited about this?

Cross-platform development is becoming increasingly popular (especially Flutter, React Native, and Kotlin Multiplatform). Another growing industry trend is treating the quality and development of the full application stack as “everyone’s job.”

Depending on the makeup and knowledge of your team, choosing Swift for server-side or tooling applications might make a lot of sense — and it’s getting easier every year. Swift Package Manager now feels viable. Foundation has been rewritten in Swift and is available on non-Apple platforms (and if you’re not an Apple developer and are new to Foundation, you’re in for a treat!). You can keep working on the Mac you love, with the tooling you know, and deploy to Linux. You know you love Xcode, and it loves you too!
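If you’ve never touched server-side Swift, it can be as approachable as this minimal sketch of a Vapor 4 route (assuming the standard scaffolding from vapor new, with app configuration elided; the endpoints are hypothetical):

```swift
import Vapor

// Registered from configure(_:) in a standard Vapor project.
func routes(_ app: Application) throws {
    // A simple health-check endpoint.
    app.get("health") { req async -> String in
        "ok"
    }

    // A small JSON endpoint: Content types encode and decode automatically.
    app.get("greet", ":name") { req async throws -> Greeting in
        let name = try req.parameters.require("name")
        return Greeting(message: "Hello, \(name)!")
    }
}

struct Greeting: Content {
    let message: String
}
```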

Swift Testing

Automation is a key part of a high-performing software team. Our enjoyment of our tools has a big impact on how much we end up actually using them. XCTest is fine, but Swift Testing looks fun and powerful!

Why are we so excited about this?

Bringing the expressive, Swifty API style we enjoy in areas like SwiftUI into our testing code is a five-star quality-of-life enhancement for an often overlooked part of a codebase. Swift Testing includes features that reduce the friction of making our code testable and writing better tests: traits that control whether a test should run, better and more descriptive failure messages, and easy browsing and organization of tests and their results. Parameterized tests (YES!) make running a test over a sequence of values far more effortless and nicer-looking.
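Here’s a minimal sketch of what that looks like in practice; the PriceFormatter type under test is hypothetical.

```swift
import Foundation
import Testing

// A hypothetical type under test: formats a price in cents as a US-dollar string.
struct PriceFormatter {
    func string(fromCents cents: Int) -> String {
        let formatter = NumberFormatter()
        formatter.numberStyle = .currency
        formatter.locale = Locale(identifier: "en_US")
        return formatter.string(from: NSNumber(value: Double(cents) / 100)) ?? ""
    }
}

@Suite("PriceFormatter")
struct PriceFormatterTests {
    // Parameterized test: Swift Testing runs the body once per argument and
    // reports each case individually.
    @Test("Whole-dollar amounts end in .00", arguments: [0, 1, 99, 1_000])
    func wholeDollarAmounts(dollars: Int) {
        let formatter = PriceFormatter()
        #expect(formatter.string(fromCents: dollars * 100).hasSuffix(".00"))
    }

    // A trait controls whether the test runs at all; here it skips on CI machines.
    @Test("Grouping separators", .enabled(if: ProcessInfo.processInfo.environment["CI"] == nil))
    func groupingSeparators() {
        let formatter = PriceFormatter()
        #expect(formatter.string(fromCents: 123_456) == "$1,234.56")
    }
}
```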

And no need to wait until your app targets iOS 18 to use Swift Testing. This is available today on all platforms that Swift supports. Swift Testing can live right alongside XCTest. Add it to your project with Swift Package Manager and see what improvements you can make immediately.

Can you remove a series of tests and replace them with parameterized tests? Do you have any tests that yield hard-to-understand, hard-to-diagnose failure messages? Start migrating your tests incrementally now to get an idea of what’s possible for your future testing needs!

Swift Testing may be a great way to get your development team more excited about eating their vegetables, and it may lead to your application’s quality hitting the grade that it needs to. Anything that decreases the friction of writing tests and deriving value from the reporting of those tests encourages writing better, more maintainable code.

Swift Assist

Swift Assist is Apple’s answer to GitHub Copilot: a natural language assistant built into Xcode to help you write code. It’s not present in the current betas, so our impressions are based only on what we’ve seen in Apple’s WWDC content.

Why are we so excited about this?

I’m a massive fan of using AI assistants to write code. The other week I pasted a database schema (that ChatGPT also helped me write) into ChatGPT and asked it to generate a SwiftUI form view for creating one of the models represented in that schema. The output was about 95% “done.” I’m consistently blown away by what we’re able to do with these tools.

How may Swift Assist be better or different than using Copilot or pasting code back and forth between ChatGPT? Two things I’m hoping for here:

  • It appears that Apple’s assistant is able to operate on your code “in place,” while considering your entire codebase. Using Copilot or GPT often involves a fair amount of copying and pasting context back and forth, and leaving comments like “// The above view refactored to include XYZ.” If Swift Assist can take my direction and alter/amend my code in place as needed, that’s a huge win.
  • I’m hoping for a much better understanding of Apple APIs and idiomatic Swift usage. Since the other tools are largely trained on public code, we often see output that contains questionable coding patterns, outdated implementations, and even outright hallucinations. If Apple can train its model on its own up-to-date SDKs, its own idiomatic API usage, and presumably a large dataset of “good” code, we can expect much better results than the most average code from Stack Overflow.

SwiftUI

SwiftUI is the name of the game now. If your application is still written primarily in UIKit you’re missing out (so long as you can set a reasonable deployment target like iOS 16).

Most of the new features announced at WWDC won’t be available unless we’re targeting iOS 18 or implementing specific functionality for iOS 18. But we got some goodies we can use right when Xcode 16.0 comes out, and you know we’re exploring the iOS 18-only features right away to be ready for them when they’re released.

SwiftUI Containers

Probably the feature we’re most excited about (aside from all the fancy Apple Intelligence stuff) is SwiftUI containers. Something we’ve been approximating for a while with Variadic Views (by reading a couple of great write-ups and diving into the symbols) now has an official API and is actually documented.

Why are we so excited about this?

Clients often want a slightly custom experience for features like navigation or collections of content. It’s often a mistake to force Apple components to do things they aren’t meant to do, so we frequently end up writing custom controls to avoid that. But rolling your own means giving up a lot: the great accessibility features, the automatic animations and behaviors of the stock controls, and the well-designed APIs that power them.

While it was possible to make your own tab view with the SwiftUI of yesterday, you’d end up with an API that’s less flexible or doesn’t feel quite right alongside Apple’s containers like VStack, List, etc. With the new ForEach and Group initializers that accept a view and let us operate on, reason about, and extract container data from its subviews, we can build custom container views that feel right at home alongside Apple’s.
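For example, here’s a minimal sketch of a custom container built on the new ForEach(subviews:content:) initializer; the CardStack container and its styling are hypothetical.

```swift
import SwiftUI

// A hypothetical custom container: callers pass arbitrary child views, and the
// container decides how to present them, just like a stock SwiftUI container.
struct CardStack<Content: View>: View {
    @ViewBuilder var content: Content

    var body: some View {
        VStack(spacing: 12) {
            // New in iOS 18: iterate over the resolved subviews of `content`,
            // wrapping each one in card styling.
            ForEach(subviews: content) { subview in
                subview
                    .padding()
                    .background(.quaternary, in: RoundedRectangle(cornerRadius: 12))
            }
        }
    }
}

// Usage reads just like VStack or List.
struct ScoreboardView: View {
    var body: some View {
        CardStack {
            Text("Home 3")
            Text("Away 1")
            Text("Full time")
        }
    }
}
```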

Looking at what we’re seeing with the Sports app, Photos, and the new tab view on iPadOS, custom containers enable us to build these new types of UI interactions and organizations without having to write too much code or give up flexibility or simplicity in our APIs. Thanks Apple! This was probably our favorite session of the week!

The ability to make our code feel as natural as Apple’s doesn’t just give us that nice warm and fuzzy feeling. It also enables our APIs to more elegantly compose with and take advantage of Apple’s APIs alongside ours. Most importantly, it greatly reduces the friction and learning curve for new developers entering a codebase. If you use Apple Music, you should feel at home in Photos, Keynote, Pages, etc. If you’re familiar with Apple APIs, we want developers to automatically feel at home using our own APIs.

Navigation Transitions

Remember when iOS 7 came out and we all tried to add that cool zoom effect going from a collection or list into a detail view? Remember those terrible, clunky, confusing APIs? Neither do I, because after we tried it once, we sure didn’t try it again.

Now, similar to the matched geometry effect that animates/tweens between two Views, we can achieve this effect across navigation transitions in both UIKit and SwiftUI with a dead simple new API. The Photos app is a great example of how good these interactions feel when navigating an app: it feels alive, polished, and just plain FUN! If you’ve said no to adding these features to your app — due to timeline constraints or the difficulty of implementing — it might be time to reassess and add more delight to your app.
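Here’s a minimal sketch of the new zoom transition in SwiftUI, using hypothetical photo assets and assuming an iOS 18 deployment target (or the availability check mentioned below).

```swift
import SwiftUI

// Tapping a grid cell zooms it into the detail view; popping (or dragging to
// dismiss) shrinks it back into place.
struct PhotoGridView: View {
    @Namespace private var zoomNamespace
    let imageNames = ["sunset", "harbor", "trail"] // hypothetical asset names

    var body: some View {
        NavigationStack {
            ScrollView {
                LazyVGrid(columns: [GridItem(.adaptive(minimum: 120))]) {
                    ForEach(imageNames, id: \.self) { name in
                        NavigationLink {
                            Image(name)
                                .resizable()
                                .scaledToFit()
                                // The destination zooms out of the matched source below.
                                .navigationTransition(.zoom(sourceID: name, in: zoomNamespace))
                        } label: {
                            Image(name)
                                .resizable()
                                .scaledToFill()
                                .frame(height: 120)
                                .clipped()
                                .matchedTransitionSource(id: name, in: zoomNamespace)
                        }
                    }
                }
            }
            .navigationTitle("Photos")
        }
    }
}
```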

Even though these will only be available on iOS 18+, a simple OS version check lets your users on the latest and greatest have a little more fun with your application.

Why are we so excited about this?

Honestly, if I never have to read the words UIViewControllerInteractiveTransitioning again, I’ll consider this a win.

Jokes aside, these transitions are part of what makes Apple software feel so much better to use than their competitors. A lot of work goes into that. Or, at least, a lot of work used to go into that 😉.

Tab View

We’re huge fans of stock navigation here at WillowTree. It’s a core part of your application: it’s how your users move between the highest-level categories of app content. Using a stock control gets you a lot: accessibility support like VoiceOver, right-to-left language handling, and “freebies” whenever Apple adds new capabilities.

For such a core part of an application, making something foreign and custom may mean giving up more than you gain. But sometimes, you do want a more elevated experience.

Apple’s redesigned tab bar now floats above your content, reducing the horizontal and vertical space occupied by the element. Sidebars have been updated to support quick top-level access to features within the app. In keeping with the theme of customizing higher-level elements like the Home Screen and Control Center, these new elements are user-customizable as well.
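Here’s a minimal sketch of the new tab API with hypothetical views; with the sidebarAdaptable style, the same declaration renders as the new floating tab bar and can expand into a user-customizable sidebar on iPad.

```swift
import SwiftUI

struct RootView: View {
    var body: some View {
        TabView {
            Tab("Home", systemImage: "house") {
                HomeView()
            }
            Tab("Orders", systemImage: "bag") {
                OrdersView()
            }
            // The search role gives the tab its conventional placement and styling.
            Tab("Search", systemImage: "magnifyingglass", role: .search) {
                SearchView()
            }
        }
        // Floating tab bar that users can expand into a sidebar on iPad.
        .tabViewStyle(.sidebarAdaptable)
    }
}

// Placeholder screens so the sketch stands on its own.
struct HomeView: View { var body: some View { Text("Home") } }
struct OrdersView: View { var body: some View { Text("Orders") } }
struct SearchView: View { var body: some View { Text("Search") } }
```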

Why are we so excited about this?

These changes allow users to easily surface and navigate to the content they care about more quickly. Rarely does a user want to use an app just for fun. Content is king. Users want to perform their actions without having to learn new navigation paradigms. Differently-abled users expect higher accessibility support on Apple platforms. Adopting this new top-level navigation style empowers your users to get stuff done in your app as quickly as possible and on their terms. We see this as the obvious choice for applications requiring top-level destinations, now with a more personalized touch! And, with SwiftUI’s declarative syntax, we don’t have to worry about clunky APIs like the old master detail UIKit APIs.

UITabBarController has been around for a long time. There’s a reason for that. It’s a solid high-level navigation pattern, but it looks like it was time to evolve a little bit. We sometimes find ourselves reaching for custom navigation patterns like hamburger menus and the like, but that’s never felt quite right. Tab View helps iPadOS warrant its own name apart from iOS and gives us a solution to the growing number of app features while providing the user affordances to customize and focus on the use cases they use the most.

visionOS 2: Enterprise Boogaloo

Apple Vision Pro is one of the most impressive pieces of technology I’ve ever strapped to my face. We’ve had a few clients take the leap into Spatial Computing, but a common theme of ideation sessions is: “Sorry, developers don’t have access to that piece of the AVP technology.”

Now, for enterprise-distributed applications (i.e., “for use in a business setting only”), we don’t have that problem anymore. Apple breaks these new Enterprise APIs into two categories: enhanced sensor access, and access to advanced machine learning capabilities using the Apple Neural Engine. On top of that, we can tune application performance to get more compute for intensive workloads.

Enhanced Sensor Access

We can now access the AVP’s forward-facing main camera, passthrough in screen capture, and spatial barcode and QR code scanning. This means we can create applications that see what the user sees. Some obvious applications include taking warehouse inventory, training a QSR employee on how to combine ingredients, or building an app that tells you what kinds of LEGO builds you can create from the pieces in your field of view. There are lots of enterprise applications and opportunities here, and we’re excited to dig in.

Machine Learning / More Compute

Here, we get access to the Apple Neural Engine for machine learning tasks, object-tracking parameter adjustment to optimize known-object detection and tracking, and increased performance headroom. This means we can squeeze more power out of the device: if your visionOS application wasn’t performing as well as you’d hoped, that may no longer be a problem now that we can trade increased CPU and GPU power for greater thermal output and reduced battery life.

We’re excited to see what’s next with Spatial Computing. If you’ve investigated an internal business use case for Apple Vision Pro and the initial takeaway was, “We can’t do that,” it might be time to chat again!

Conclusion

WWDC24 Week is over but the fun is just getting started. I don’t know about you but I’m excited about apps again!

Here’s your homework as we progress through the betas into the GM release later this year:

  • SwiftUI is ready for prime time. It has reached the point in its evolution where UIKit started to feel truly “done.” SwiftUI enables you to build beautiful, dynamic, high-quality applications faster than ever. Especially if you’re able to target iOS 16 (when we got the improved Navigation APIs), start your new features with SwiftUI first. If you’re making a new app, aside from a few features like SFSafariViewController, you’re ready to dive in with a 100% SwiftUI application.
  • App Intents are the key to unlocking your app's integration with amazing Apple Intelligence features. Download the Shortcuts app and get a feel for what’s possible today, and start identifying and implementing the appropriate actions in your app now so you’re ready for the next big thing.
  • Apple Intelligence enables so many exciting new features… but it’s not available in any of the betas yet. Follow this one closely. Lots of companies make big promises and fail to deliver. I don’t see that happening with Apple, but I’m curious to see how the feature set evolves as it lands in our hands. We should expect to hear a little more when the public beta hits in July.

Get in touch with our Digital Product Development team to hear our up-to-the-minute thinking about all things iOS, and here’s to an exciting new era with Apple!

Andrew Carter
