Apple Quietly Reveals iOS 13 Developer Tools

Apple's WWDC keynote showed the company at its polished best. The 2-hour, 15-minute extravaganza was heavy on user-facing updates to iOS for iPhones, watchOS for wearables, macOS for laptops and desktops, tvOS for Apple TV, and the brand new iPadOS for the iPad. Apple made little mention on stage of the underlying SDKs and APIs that will make all the magical new features a reality, but the iPhone maker was sure to share them in developer sessions later in the week.

“The new app development technologies unveiled [this week] make app development faster, easier and more fun for developers, and represent the future of app creation across all Apple platforms,” said Craig Federighi, Apple’s senior vice president of Software Engineering, in a prepared statement. “SwiftUI truly transforms User Interface creation by automating large portions of the process and providing real-time previews of how UI code looks and behaves in-app. We think developers are going to love it.”

SwiftUI, Apple's new framework for building app user interfaces, topped the list of revised tools. SwiftUI works hand-in-hand with Xcode 11 so developers can quickly and easily assemble UI components across Apple's platforms. Apple pointed out that Swift code is generated automatically, so any changes propagate downstream in an instant, powering real-time previews of user interface changes as developers test their code. Moreover, developers can run previews directly on connected Apple devices, such as an iPhone or iPad. The critical factor here is that SwiftUI relies on the same APIs built into iOS, macOS, watchOS, tvOS, and iPadOS.
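To give a sense of the declarative style, here is a minimal sketch of a SwiftUI view (the view and its contents are illustrative, not from Apple's sample code):

```swift
import SwiftUI

// A minimal SwiftUI view: the body is declared rather than assembled
// imperatively, and Xcode 11's canvas re-renders it as the code changes.
struct GreetingView: View {
    var name: String

    var body: some View {
        VStack(spacing: 8) {
            Text("Hello, \(name)!")
                .font(.headline)
            Button("Tap me") {
                print("Tapped")
            }
        }
        .padding()
    }
}

// A PreviewProvider drives the live preview pane in Xcode 11.
struct GreetingView_Previews: PreviewProvider {
    static var previews: some View {
        GreetingView(name: "WWDC")
    }
}
```

The same view code runs unchanged on iOS, iPadOS, macOS, watchOS, and tvOS, which is the cross-platform promise Apple is making here.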

Like Microsoft and, to some extent, Google, Apple is looking to bridge the divide between iOS apps and macOS apps. That's why Xcode 11 has a simple checkbox that lets developers automatically add basic macOS functionality (including windowing features) to their iOS app. Apple is handling all the heavy lifting in the background, such as porting in platform elements such as keyboard and mouse support. This gives developers a massive head start on building native macOS apps. Apple hopes that by automating the backend code, developers will have more time to focus on what makes their app unique.

ARKit 3 is among the latest batch of toolsets from Apple, and it makes some truly amazing leaps forward. For example, People Occlusion allows AR environments to account for people moving throughout the space. This means people can walk around (behind and in front of) virtual objects. ARKit 3 also makes it possible to track up to three faces with the front camera, as well as use the front and rear cameras at the same time. Apple debuted the RealityKit Swift API, which it says is a new way for developers to prototype and produce AR experiences for the iPhone and iPad — even if the developers have no experience creating in 3D.
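In code, People Occlusion is an opt-in frame semantic on the session configuration. The sketch below is a hedged illustration (the function name and structure are this article's, not Apple's sample code):

```swift
import ARKit

// Sketch: enabling People Occlusion in ARKit 3, assuming the caller
// owns an ARSession (e.g. from an ARSCNView or ARView).
func startPeopleOcclusionSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()

    // People Occlusion needs recent hardware, so check support first.
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        configuration.frameSemantics.insert(.personSegmentationWithDepth)
    }

    session.run(configuration)
}
```

With the semantic enabled, ARKit segments people in the camera feed and uses depth estimates so rendered virtual objects correctly appear behind or in front of them.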

Apple updated Core ML and Create ML to bring more smarts to apps. Core ML 3, for example, adds more than 100 model layers that apps can take advantage of when applying machine learning and computer vision. Core ML 3 is also better able to understand natural speech and language, allowing people to interact with apps more seamlessly. Developers will be able to use machine learning models on the device itself, rather than in the cloud. Apple says this will lead to more personalization within apps.
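On-device inference typically pairs Core ML with the Vision framework. A rough sketch, where `SomeClassifier` is a stand-in for whatever class Xcode generates when a compiled .mlmodel is added to the project:

```swift
import CoreML
import Vision

// Sketch: on-device image classification via Vision + Core ML.
// `SomeClassifier` is a hypothetical Xcode-generated model class.
func classify(_ image: CGImage) throws {
    let vnModel = try VNCoreMLModel(for: SomeClassifier().model)

    let request = VNCoreMLRequest(model: vnModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation] else { return }
        // Log the top three labels; inference runs entirely on the device.
        for observation in results.prefix(3) {
            print(observation.identifier, observation.confidence)
        }
    }

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])
}
```

Because nothing leaves the device, the personalization Apple describes comes without shipping user data to a server.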

watchOS 6 introduces the App Store on the Apple Watch. This is a huge opportunity for developers, who will now be able to surface their watchOS apps directly on the wearable itself rather than on the iPhone. Apple says developers can tap into the Apple Neural Engine using Core ML to create intelligent on-the-wrist apps. For example, a new streaming audio API will let people stream audio from whatever app they wish directly to the Apple Watch. An extended runtime API will give apps more time to accomplish tasks in the background even when the user moves on to other activities on the watch itself. This API will have access to sensors, such as the heart rate monitor, location, and accelerometer/gyroscope. 
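The extended runtime behavior is exposed through `WKExtendedRuntimeSession` in watchOS 6. A minimal sketch (the class name and log messages are illustrative):

```swift
import WatchKit

// Sketch: keeping a watchOS 6 app running after the user lowers
// their wrist, via an extended runtime session.
class SessionCoordinator: NSObject, WKExtendedRuntimeSessionDelegate {
    let session = WKExtendedRuntimeSession()

    func begin() {
        session.delegate = self
        session.start()
    }

    func extendedRuntimeSessionDidStart(_ extendedRuntimeSession: WKExtendedRuntimeSession) {
        print("Session running; background work may continue.")
    }

    func extendedRuntimeSessionWillExpire(_ extendedRuntimeSession: WKExtendedRuntimeSession) {
        print("Session about to expire; wrap up work now.")
    }

    func extendedRuntimeSession(_ extendedRuntimeSession: WKExtendedRuntimeSession,
                                didInvalidateWith reason: WKExtendedRuntimeSessionInvalidationReason,
                                error: Error?) {
        print("Session ended: \(reason)")
    }
}
```

The delegate callbacks give the app a chance to finish sensor reads or audio work before the system reclaims the runtime.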

Then there's Sign in with Apple. Apple is countering Google and Facebook's offerings with its own user authentication tool. With iOS 13, iPhone and Mac owners will be able to use their Apple ID (rather than Google or Facebook) to sign into and create app login profiles. Apple's approach is different from its competitors'. Where Google and Facebook are all too happy to scrape user data (email, phone number, street address), Apple anonymizes these data points to protect the end user. Developers will still be able to reach out to iPhone owners, but the email address will be a random string of letters and numbers rather than a normal email address. Critically, Apple will mandate that developers include Apple login in any apps that also support Google and Facebook login.
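The flow lives in the AuthenticationServices framework. A hedged sketch of kicking off the request (delegate wiring and response handling are omitted for brevity):

```swift
import AuthenticationServices

// Sketch: starting a Sign in with Apple request on iOS 13.
// The caller supplies a delegate that receives the credential.
func requestAppleSignIn(delegate: ASAuthorizationControllerDelegate) {
    let request = ASAuthorizationAppleIDProvider().createRequest()
    // Ask only for what the app needs; the returned email may be a
    // private relay address rather than the user's real one.
    request.requestedScopes = [.fullName, .email]

    let controller = ASAuthorizationController(authorizationRequests: [request])
    controller.delegate = delegate
    controller.performRequests()
}
```

The relay address is what the article refers to above: developers can still email the user, but through an anonymized forwarding address.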

Rounding things out: PencilKit makes it easier for developers to add support for the Apple Pencil to their apps; SiriKit adds support for audio apps for tighter integration with iOS and watchOS; and MapKit lets developers add vector overlays and filter by points of interest.
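As one example from that list, embedding a PencilKit drawing surface takes only a few lines. A sketch (the view controller is illustrative):

```swift
import UIKit
import PencilKit

// Sketch: embedding a PencilKit canvas so users can draw with the
// Apple Pencil or a finger.
class DrawingViewController: UIViewController {
    let canvasView = PKCanvasView()

    override func viewDidLoad() {
        super.viewDidLoad()
        canvasView.frame = view.bounds
        canvasView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        // Set a default ink; the system tool picker can swap this later.
        canvasView.tool = PKInkingTool(.pen, color: .black, width: 4)
        view.addSubview(canvasView)
    }
}
```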

The first developer betas of iOS 13, watchOS 6, macOS Catalina, iPadOS, and tvOS 13 are already available to registered developers. More information is available via Apple's developer portal.
