The trains run on time at Apple’s Worldwide Developers Conference, which kicked off at 10 AM PDT on June 3. Consistent with Apple’s renewed fascination with all things cinema, the event began with a fairly elaborate video that could have been mistaken for the opening credits to a Pixar film. All showmanship aside, the presentation on the wide world of Apple products ran the gamut, from the shiny new Pro Display XDR to the release of iPadOS, which further differentiates the iPad with new multitasking, text editing, and file management functionality.
For the public at large, the biggest headline was the fact that iTunes—which single-handedly digitized the music industry business model—is going away, to be replaced by Apple’s separate Music, TV, and Podcast apps. This wasn’t a huge surprise: iTunes is practically the poster child for app bloat.
Our overall takeaway? It’s clear that Apple’s 3rd-party app ecosystem is improving, with new AR development tools, a project to help developers build universal iOS/macOS apps, and more. Here’s what got our attention:
Takeaway 1: iOS apps on Mac have arrived
The announcement of a new Mac operating system, macOS Catalina, was somewhat expected, but the more exciting takeaway for developers is called Project Catalyst. Starting with the tick of a box in Xcode, developers can now build iOS apps to run on macOS. Yes, there were audible cheers from the audience.
What it means: Through some quick testing on a few of our active iOS projects, it looks like we will be able to target macOS with minimal additional overhead. Twitter was able to get its native app running on a Mac in just a few days, and they manage the app across iPhone, iPad, and Mac with a single team.
What to do now: Catalyst releases this fall, but it’s available now for developers on macOS Catalina. Ask yourself the following question, and give an honest answer: are your existing iOS apps good candidates for use on a Mac? While the development overhead may be reduced, teams must still commit to appropriate testing procedures, along with building a bank of Mac test devices. To avoid last-minute bugs and confusion, teams should agree on whether and how macOS will be supported.
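Once the Mac checkbox is ticked in Xcode, most code is shared, and platform-specific tweaks can be isolated with a conditional compilation block. A minimal sketch (the log messages are just illustrative placeholders):

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        #if targetEnvironment(macCatalyst)
        // Compiled only into the Mac build: a place to adjust behavior
        // that doesn't make sense with a pointer-and-keyboard UI.
        print("Running via Catalyst on the Mac")
        #else
        print("Running on iOS/iPadOS")
        #endif
        return true
    }
}
```

Keeping these branches small and rare is a good signal that an app is genuinely a candidate for the Mac.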
Takeaway 2: Dark Mode lifts spirits
The audience “ooh’d” and “ahh’d” upon hearing that, like macOS, iOS will now be getting a Dark Mode. Hype aside, Dark Mode looks really good.
What it means: Developers will likely be encouraged to stick closer to iOS standards when designing, because they get default dark mode support for free. While developers can choose to lock their app’s UI to either light or dark mode (for now), it sounds like supporting both may become an App Store requirement in the near future.
What to do now: If you choose to build a totally custom UI, you’ll be encouraged to provide light and dark interfaces to match the system setting (similar to how you would handle accessibility for system font size). Start discussing the impact of two color schemes on your design processes and how that aligns (or is at odds) with current brand standards. Following Apple’s lead is a good place to start.
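For custom palettes, iOS 13 lets you define a single dynamic UIColor that resolves differently per appearance, so the rest of your code stays appearance-agnostic. A minimal sketch (the specific color values are illustrative):

```swift
import UIKit

// One semantic color that adapts to the current trait environment.
let cardBackground = UIColor { traits in
    traits.userInterfaceStyle == .dark
        ? UIColor(white: 0.10, alpha: 1.0)  // dark appearance
        : UIColor(white: 0.95, alpha: 1.0)  // light appearance
}

// If you must lock an app (or a single screen) to one appearance
// while you migrate, iOS 13 allows an explicit override:
// viewController.overrideUserInterfaceStyle = .light
```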
Takeaway 3: Simplicity and Support with Xcode and SwiftUI
Apple introduced SwiftUI, a new framework for building UIs for all Apple platforms. Featuring the drag-and-drop coding simplicity of Xcode design tools and a simple declarative syntax, SwiftUI provides automatic support for Dynamic Type, Dark Mode, localization, and accessibility.
What it means: SwiftUI is native to all Apple platforms. It promises to shorten development time and make it easier to code complex concepts, such as animation. That said, questions remain about how complete SwiftUI is, and whether the old way still prevails for highly custom UI. Expect some debate in the near term about the best way to build interface elements.
What to do now: Join all the other teams scrambling to learn this new “simplified” way of building UIs, and team up with your web engineers to learn from their experience with the declarative web framework React (according to an Apple engineer at WWDC Labs, SwiftUI was inspired by React). Harness your WWDC excitement to learn now, but expect it to be a while before you can finally ship an app with SwiftUI. In the meantime, watch out for overzealous refactoring of “old” code, which may or may not be necessary.
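To give a flavor of the declarative syntax, here is a minimal sketch (the view and its content are invented for illustration): you describe what to show for a given state, and SwiftUI re-renders as the state changes, much like React does on the web.

```swift
import SwiftUI

struct CounterView: View {
    // Mutating this state automatically refreshes the view.
    @State private var count = 0

    var body: some View {
        VStack(spacing: 12) {
            Text("Taps: \(count)")
                .font(.headline)
            Button("Tap me") {
                self.count += 1
            }
        }
        .padding()
    }
}
```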
Takeaway 4: New AR Developer tools: RU ready? (…see what I did there?)
A trio of AR tools was announced at the keynote: RealityKit, a toolkit that incorporates rendering, camera effects, physics, and more for developers who are just starting to make AR apps; Reality Composer, a developer app for creating AR scenes with a drag-and-drop interface; and ARKit 3, which introduces object and image detection, motion capture, and the impressive “People Occlusion,” which automatically composites people into AR scenes.
What it means: ARKit 3’s position detection, which allows the camera to provide coordinates for human heads, trunks, and limbs, is a particularly notable advancement that should open up AR possibilities for more types of development projects.
What to do now: When scoping out new projects, start identifying areas where AR can improve user experience without creating usage barriers.
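Enabling People Occlusion in ARKit 3 is essentially a one-line configuration change, guarded by a capability check since it requires recent hardware. A minimal sketch (`sceneView` is assumed to be an existing ARSCNView in your project):

```swift
import ARKit

func startSession(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()

    // People Occlusion is only available on newer devices,
    // so check support before opting in.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    sceneView.session.run(configuration)
}
```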
Takeaway 5: Sign In with Apple—sorry, Google and Facebook
Single sign-on is a really easy way to log into 3rd-party apps and services (and, while you’re at it, give away all sorts of personal information). Further staking out its position as the guardian of privacy, Apple introduced “Sign In with Apple,” which works like those other authenticators but doesn’t reveal any personal user data. Apple will even generate a unique, anonymous email address (which forwards messages to the user) when an app requests one.
What it means: Getting the user information brands want will be somewhat harder. Depending on the app, this new sign-in may also hinder existing functionality, since “Sign In with Apple” will be required if other 3rd-party authenticators are supported.
What to do now: Prepare to deliver the experience your users expect with as little personal data as possible. Additionally, if you use federated login from another service, ask your team what it will take to add “Sign In with Apple.”
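Triggering the flow uses the new AuthenticationServices framework. A minimal sketch (presentation-context plumbing and error handling are omitted, and the view controller name is invented for illustration):

```swift
import UIKit
import AuthenticationServices

final class LoginViewController: UIViewController, ASAuthorizationControllerDelegate {
    @objc func handleSignInWithApple() {
        let request = ASAuthorizationAppleIDProvider().createRequest()
        // Ask only for what the experience genuinely needs; the user
        // can still choose to hide their real email address.
        request.requestedScopes = [.fullName, .email]

        let controller = ASAuthorizationController(authorizationRequests: [request])
        controller.delegate = self
        controller.performRequests()
    }

    func authorizationController(controller: ASAuthorizationController,
                                 didCompleteWithAuthorization authorization: ASAuthorization) {
        if let credential = authorization.credential as? ASAuthorizationAppleIDCredential {
            // Persist the stable user identifier; name and email
            // are only delivered on the first authorization.
            print("Signed in as user:", credential.user)
        }
    }
}
```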
Takeaway 6: watchOS 6: It’s complicated
After the obligatory reminder that Apple Watch is the best-selling smartwatch ever, drums rolled and Apple announced watchOS 6! In addition to the fancy new faces and more health-monitoring tech, there were a few significant pieces of news for 3rd-party developers: the App Store is coming to the Watch, so users don’t have to get their phone out to download your app. Similarly, developers can now create standalone apps for the Watch. New complications were also introduced, including noise and wind.
What it means: iPhone independence makes Apple Watch more of a standalone device and may entice developers to reconsider what can be done to take advantage of features like haptic alerts. For example, a new Noise app monitors ambient noise and alerts the user when levels cross a dangerous decibel threshold.
What to do now: Take another look at ways to take advantage of the watch’s unique monitoring, haptics, and proximity attributes (i.e., the fact that it’s being worn). As part of the Watch app development process, continue thinking about ways to entice users to put your complication on their watch face. This may become the key to greater engagement.
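A complication is backed by a ClockKit data source that hands the system timeline entries to render on the watch face. A minimal sketch for a simple text complication (the template family and the “72 dB” value are illustrative, not a real data feed):

```swift
import ClockKit

final class ComplicationController: NSObject, CLKComplicationDataSource {
    func getCurrentTimelineEntry(for complication: CLKComplication,
                                 withHandler handler: @escaping (CLKComplicationTimelineEntry?) -> Void) {
        // A simple one-line text template; richer families exist
        // for graphic corner, circular, and bezel slots.
        let template = CLKComplicationTemplateModularSmallSimpleText()
        template.textProvider = CLKSimpleTextProvider(text: "72 dB")
        handler(CLKComplicationTimelineEntry(date: Date(),
                                             complicationTemplate: template))
    }

    func getSupportedTimeTravelDirections(for complication: CLKComplication,
                                          withHandler handler: @escaping (CLKComplicationTimeTravelDirections) -> Void) {
        handler([])
    }
}
```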
And with that, we’re signing off. Continue to look to WillowNotes for more of our take on the development implications of other industry events.