What’s new in iOS 13

On September 19, 2019, Apple released the latest major update to its iPhone and iPad operating systems: iOS 13. This iOS version introduces several significant changes that developers will need to be prepared to handle going forward. In this post, I’ll give a quick overview of the most impactful new features, and then dive into two especially important updates: dark mode and scene sessions.

iOS 13 Updates: Overview

The iOS 13 features that Apple has specifically promoted to date include:

  • Voice control: Apple’s groundbreaking new voice control interface assigns numbers to screen sections and meaningful names to selectors, making it possible for the first time to control every function of the iPhone using only voice commands. 
  • ARKit 3: Updates in ARKit 3 include the introduction of People Occlusion, which will allow developers to show virtual content in front of or behind real people, as well as advanced motion tracking, multiple face tracking, and collaboration features.
  • Core ML improvements: In a notable leap, iOS 13 allows developers to use Core ML to update and personalize machine learning models on the device itself, rather than simply integrating Apple’s pre-trained models. 
  • SiriKit improvements: The most prominent update to SiriKit allows third-party apps (e.g., Spotify, Audible, etc.) to use Siri for media playback; previously, playback commands only worked with native Apple apps such as Music, Podcasts, and Books. 
  • SwiftUI: This is a new framework for building user interfaces declaratively in Swift, similar in spirit to declarative UI frameworks like React Native.
  • Sign in with Apple: Apple is actually a little late to the game with this feature, which (apart from the integration of biometric authentication options such as Face ID and Touch ID) is conceptually identical to already-familiar options like “Sign in with Google” and “Sign in with Facebook”.
  • Dark Mode and scene sessions, which I’ll discuss below. 

Dark mode updates in iOS 13

For several years, iOS has offered UI customization methods that allow developers to implement their own color themes for individual apps. Before iOS 13, however, developers had to manually implement custom color theme transitions. Now, with the introduction of Dark Mode, users have an option to turn on the dark color theme system-wide, and third-party apps can be designed to piggyback on these transitions.

The iOS 13 Dark Mode toggle

To take advantage of Dark Mode, developers need to use semantically defined colors. For example, the semantic color systemBlue resolves to an RGB value that depends on the current mode (Light or Dark), and will automatically switch when the user toggles the Dark Mode setting. In Xcode 11, you can select Apple’s system colors from the color picker; you can also define additional mode-dependent colors for use within your own app. 
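As a minimal sketch of a mode-dependent color defined in code (the grayscale values here are arbitrary placeholders; you could equally define the color in an asset catalog):

```swift
import UIKit

// A custom dynamic color: UIKit calls the closure whenever it needs to
// resolve the color for the current trait environment.
let cardBackground = UIColor { (traits: UITraitCollection) -> UIColor in
    switch traits.userInterfaceStyle {
    case .dark:
        return UIColor(white: 0.12, alpha: 1.0)   // near-black in Dark Mode
    default:
        return UIColor(white: 0.98, alpha: 1.0)   // near-white in Light Mode
    }
}

// Built-in semantic colors such as .systemBlue and .label already
// resolve dynamically, so they can be used directly:
let titleColor = UIColor.label
```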

One word of caution — with the update to iOS 13, all native UIKit colors will also be updated to system colors, and will automatically switch between Light and Dark modes. As a developer, you need to make sure you’ve accounted for these transitions in your app. For example, if you’ve added a standard date picker on a white or light grey background, without overriding the system colors, the dates will become invisible when the standard color switches to white in Dark Mode. 

This won’t work in Dark Mode!

You can opt out of the automatic UIKit Dark Mode transitions by using the UIView property overrideUserInterfaceStyle to specify that your app should always stay in Light (or Dark) Mode. You can also implement traitCollectionDidChange to listen for changes between Dark and Light modes (this works in UIView, UIViewController, and UIPresentationController) so that you can update your app’s UI appropriately; e.g., in the example above, you could switch the date picker’s background to a darker color when Dark Mode is turned on. For more information, take a look at the WWDC video Implementing Dark Mode on iOS.
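A sketch of both approaches, assuming a hypothetical view controller that hosts the date picker from the example above:

```swift
import UIKit

class DatePickerViewController: UIViewController {
    let datePicker = UIDatePicker()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Option 1: opt out entirely — this view hierarchy always
        // renders in Light Mode, regardless of the system setting.
        // view.overrideUserInterfaceStyle = .light
    }

    // Option 2: react to mode changes and restyle manually.
    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // Only restyle when the color appearance actually changed.
        guard traitCollection.hasDifferentColorAppearance(comparedTo: previousTraitCollection) else {
            return
        }
        let isDark = traitCollection.userInterfaceStyle == .dark
        datePicker.backgroundColor = isDark ? .darkGray : .white
    }
}
```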

Scene sessions in iOS 13

Scene sessions are one of the most exciting new features for the iPad in iOS 13 (aka iPadOS). Previously, it was impossible to have more than one instance of your app’s UI on screen (with the exception of widgets, which run as a separate process with limited functionality). In iOS 13, Apple changed the multitasking paradigm by allowing developers to abstract the app’s UI into scenes.

This is a fundamental paradigm shift on the development side. Although not much changes from the user’s perspective, you’re now dealing with only one app instance, with two scene sessions that share the same memory space and process instance. This makes it much easier to share information across screens. 

The lifecycle of a scene session is very similar to the lifecycle of an iOS app, but instead of “launching” and “un-launching” the app as a whole, we have the new concept of “attaching” and “un-attaching” scene sessions using a new type of delegate, the UISceneDelegate.
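The attach/detach lifecycle can be sketched as follows (ViewController stands in for whatever root view controller your app uses):

```swift
import UIKit

class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    // Called when a new scene session is attached to the app — the
    // per-scene analogue of the old app launch.
    func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = ViewController()  // hypothetical root VC
        self.window = window
        window.makeKeyAndVisible()
    }

    // Called when the scene is detached; release any resources
    // specific to this scene here.
    func sceneDidDisconnect(_ scene: UIScene) { }

    // Foreground/background transitions now happen per scene,
    // not per app.
    func sceneDidBecomeActive(_ scene: UIScene) { }
    func sceneWillResignActive(_ scene: UIScene) { }
}
```

Each scene session gets its own delegate instance, while app-level concerns stay in the UIApplicationDelegate.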

Scene Sessions are optional for the time being — you don’t have to implement them in new or existing apps, although doing so could prepare your app to take better advantage of the new features in iPadOS. However, Apple is known for introducing changes in their APIs and later enforcing them during the app review process. 

For more on what’s new in iOS 13, check out the WWDC 2019 videos. We’re excited to work with these new features at Grio — if you’re interested in building or updating an app for iOS 13, feel free to contact us!

Building a modular iOS app

At Grio, we’re often asked to improve our clients’ existing web and mobile apps — fixing problems, adding features, etc. — but many of our most interesting and exciting projects involve building a brand-new app entirely from scratch. I’ve had that opportunity on a recent project, which means that my colleagues and I have been thinking through some of the foundational decisions that can really only be made when you’re starting fresh. One of those decisions is monolithic vs. modular. 

You Need a Supervisor

Several of our folks recently attended ElixirConf in Colorado, where Grio’s John Palgut gave a lightning talk on protecting your app from crashes by using a Supervisor – enjoy!

Test, Never Trust: Dealing with external services when using Elixir and Phoenix

In his short story Thin Cities 3, author Italo Calvino describes a city reduced to its plumbing — a network of pipes, stripped of the streets, walls, and floors that would ordinarily conceal them. I like to picture this “thin city” when I’m testing software; diving beneath the superficial layers to probe the essential connections that keep information and experiences flowing. 

Fostering early collaboration between development and design

User-friendly software doesn’t happen by accident — the best products are designed intentionally, thoughtfully, and thoroughly before implementation begins. However, that doesn’t mean that developers shouldn’t play a role in the early stages of a project. In this post, I’ll talk about why you should bring your developers into design discussions and reviews, and recap a successful design-development collaboration on one of Grio’s recent client projects. 

Streamlining mobile development with CI and CD

Continuous integration (CI) and continuous delivery (CD) have significantly improved both my productivity as a developer and my team’s ability to execute smoothly and efficiently on a variety of projects. In this post, I’ll explain how CI and CD work, talk a bit about the benefits of these practices, and walk through an example that illustrates how to set up your own CI/CD systems. 

Measuring software quality

The word “quality” first appeared in the English language around 1300. Technically, “quality” is a neutral term, referring to the character or nature — good, bad, or otherwise — of a person, place, or thing. However, when we use this word today, we’re often implicitly pointing to high quality. Most modern definitions of “quality” indicate that the term is connected to attributes like lack of risk, ease of trust, superiority to competition, and high value.