Getting your iOS apps ready for Apple Intelligence

This year, Apple Intelligence will be available in beta. It's a set of artificial intelligence features that Apple will add to macOS, iOS, and iPadOS devices to offer consumers a more personalised experience designed to help improve their efficiency and productivity.

There's a lot of excitement around what Apple Intelligence will bring to developers, consumers, and the ecosystem as a whole. You can read more about the possibilities our CEO believes it will unlock here.

In this post, I’ll focus on the changes Apple Intelligence will bring to Apple’s virtual assistant, Siri, and how you can prepare your iOS app to take advantage.

First, let’s look at how Apple Intelligence is likely to change Siri.

Giving Siri a more personal touch

Apple Intelligence will enable Siri to tap into personal context from device activities, making interactions with the virtual assistant feel more conversational and natural. These new capabilities, which will start rolling out in beta this year, will see Apple's 37-year-old vision of a capable, context-driven virtual assistant come to life.

To make this possible, Apple is enabling third-party iOS apps installed on its devices to offer actions and searchable content to Siri through the App Intents framework.

Evolution of Apple's App Intents framework

During the Worldwide Developers Conference in 2022, Apple first introduced App Intents, a framework for exposing repeatable in-app actions through Shortcuts. Since then, the company has continued to expand the App Intents framework to deeply integrate app actions and content with Siri, Spotlight, widgets, controls, and more.

In 2024, Apple introduced ShowInAppSearchResultsIntent, which lets users jump straight to results in your app for specific search criteria. Currently, this capability can only be tried out through the Shortcuts app. With Apple Intelligence on the horizon, we're excited to see how it will evolve Siri, particularly now that content and results can come from third-party iOS apps.
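
For orientation, here's what adopting this intent can look like in code. This is a minimal sketch based on the API shape Apple showed for iOS 18; the PhotoSearchIntent name is an illustrative placeholder rather than something from the sample project:

import AppIntents

// A minimal ShowInAppSearchResultsIntent adoption (iOS 18+).
// `PhotoSearchIntent` is a hypothetical name for illustration.
@available(iOS 18.0, *)
struct PhotoSearchIntent: ShowInAppSearchResultsIntent {
    static let title: LocalizedStringResource = "Search Photos"
    static let searchScopes: [StringSearchScope] = [.general]

    @Parameter(title: "Criteria")
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // In a real app, route criteria.term into your search UI here.
        return .result()
    }
}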

Getting your app ready for Apple Intelligence

In this post, I'll guide you through everything you need to do to ensure your iOS app works seamlessly with the App Intents framework and delivers the expected results, including how to ensure quality releases with automated tests via Bitrise's ephemeral CI build environments.

Let's dive in. 

Testing your intents with UI tests

To ship a quality iOS app, you will need to validate your App Intents with end-to-end testing. This is a crucial step: it checks the real functionality of your app, and it belongs in your Continuous Integration validations.

It's important to note that unit tests that exercise your AppIntent implementations directly with mocks are not end-to-end tests. With that approach alone, you cannot guarantee that your intents will work as expected via Siri or Shortcuts.
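
To make the distinction concrete, here is the kind of test I mean. It is a sketch that assumes a SearchMediaIntent type like the one in the WWDC24 sample, and that StringSearchCriteria can be built from a plain term; it calls perform() directly, so Shortcuts, Siri, and the SpringBoard prompts are never exercised:

import AppIntents
import XCTest
@testable import AssistantSchemasExample

final class SearchMediaIntentUnitTests: XCTestCase {
    // This drives the intent like any other Swift object. It can verify
    // your search logic, but it never proves that iOS registered the
    // intent or that Shortcuts/Siri can actually invoke it.
    func testPerformReturnsResult() async throws {
        var intent = SearchMediaIntent()
        intent.criteria = StringSearchCriteria(term: "Hvolsvöllur")
        _ = try await intent.perform()
    }
}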

Fortunately, there is a better way. By writing UI tests that cover the end-to-end flow, you can verify the integration between your Intents and the system. I will demonstrate the process using an official WWDC24 sample project.

Writing UI tests for Intents

In the following example, I will cover multiple aspects of the testing process, including:

  • Registering your application's Intent: validating that iOS registered your Intent correctly and can handle the defined search parameters.
  • Triggering your Intent using the Shortcuts app: to validate the whole flow, I will use the Shortcuts app, which uses the App Intents framework just like Siri and Spotlight.
  • Providing the search parameters: the input fields appear as popups rendered by the SpringBoard application.
  • Verifying that your app provided the correct results to serve the triggered Intent.

1. Registering your application’s Intent

// 1
addUIInterruptionMonitor(withDescription: "Photo Library Permission") { alert in
    // Grant photo library access when the permission alert appears.
    guard alert.buttons["Allow Full Access"].exists else { return false }
    alert.buttons["Allow Full Access"].tap()
    return true
}

// 2
let testApp = XCUIApplication(bundleIdentifier: "com.example.apple-samplecode.AssistantSchemasExample")
testApp.launch()

// 3
wait(2)

testApp.terminate()

The sample application needs access to the photo library, and iOS asks for this permission after every fresh installation or Simulator reset. To handle the photo library access popup, I register an interruption monitor (// 1).

The system-wide registration of your Intents happens when we launch the app, wait a few seconds (// 3), and then terminate it.
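
One caveat: wait(_:) is not an XCTest API, it's a small test helper. A minimal implementation can lean on an inverted expectation, which passes only once the full timeout has elapsed:

import XCTest

extension XCTestCase {
    /// Blocks the test for the given number of seconds by waiting on an
    /// inverted expectation, i.e. one that is never fulfilled.
    func wait(_ seconds: TimeInterval) {
        let idle = expectation(description: "idle wait")
        idle.isInverted = true
        wait(for: [idle], timeout: seconds)
    }
}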

2. Triggering your Intent

// 1: Launch the Shortcuts app by its bundle identifier.
let shortcutApp = XCUIApplication(bundleIdentifier: "com.apple.shortcuts")
shortcutApp.launch()

// 2: Navigate to the sample app's section and tap its Search intent.
shortcutApp.buttons["AssistantSchemasExample"].firstMatch.tap()
shortcutApp.buttons["Search"].firstMatch.tap()

The next step is to trigger the sample app's search intent from the Shortcuts app.

You can launch the Shortcuts app with its bundle identifier, com.apple.shortcuts (// 1). During the UI test run, we can navigate to the AssistantSchemasExample section (// 2), where the Search button represents the search Intent we are looking for.
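
Depending on how quickly the Simulator settles, the Shortcuts UI may not be ready the moment launch() returns. In practice, I'd make the taps in // 2 defensive with waitForExistence(timeout:) instead of tapping blindly:

// A more defensive variant of step // 2.
let sampleSection = shortcutApp.buttons["AssistantSchemasExample"].firstMatch
XCTAssertTrue(sampleSection.waitForExistence(timeout: 5.0))
sampleSection.tap()

let searchIntentButton = shortcutApp.buttons["Search"].firstMatch
XCTAssertTrue(searchIntentButton.waitForExistence(timeout: 5.0))
searchIntentButton.tap()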

3. Interacting with the Intent

// 1
let springboardApp = XCUIApplication(bundleIdentifier: "com.apple.springboard")

// 2
let criteriaField = springboardApp.textFields["[SearchMediaIntent] Criteria"].firstMatch
criteriaField.tap() // Give the field keyboard focus before typing.
criteriaField.typeText(userInput)

// 3
springboardApp.buttons["Done"].firstMatch.tap()

SpringBoard (// 1) presents the search parameters for the Intent. Entering the search input is easy: look up the input field using its placeholder value (// 2).

Next, we can search for photos taken in Hvolsvöllur. Tapping the Done button runs your Intent, opening your application to show the search results (// 3).
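
For completeness: userInput here, and the locationName used in the next step, are plain test constants. The snippets assume values along these lines, which presume the sample photo library contains a photo taken in Hvolsvöllur:

// Illustrative test data for the end-to-end flow.
let userInput = "Hvolsvöllur"
let locationName = "Hvolsvöllur"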

4. Verifying the correct behaviour

To verify the whole flow, we must ensure that the App Intent opens the correct search result page for the given search parameter. In the example below, we expect a single photo to show up.

// 1
let searchField = testApp.searchFields["Search"].firstMatch
XCTAssertTrue(searchField.waitForExistence(timeout: 5.0))

// 2
XCTAssertEqual(searchField.value as? String, userInput)

// 3
let scrollView = testApp.scrollViews.element(boundBy: 0)
let resultButton = scrollView.buttons.element(boundBy: 0)
resultButton.tap()

// 4
XCTAssertTrue(testApp.navigationBars[locationName].waitForExistence(timeout: 5.0))

Note that the application launch takes time, so we must wait for our main search field to appear (// 1).

We can ensure the search parameter transfer happened correctly by matching the input we've provided using the Shortcuts app (// 2).

Next, tap on the single result image (// 3) and verify that the image fits the search criteria by asserting on the location displayed in the navigation bar (// 4).

Voilà, your automated end-to-end test is ready!

The approach above will work for your iOS app, too: your defined App Intents will show up in the Shortcuts app, where you can trigger them for testing. You can write multiple end-to-end tests to cover various input combinations using the SpringBoard-rendered popups.
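
One way to cover several combinations without duplicating the whole flow is to loop over input/result pairs inside a single test. Here, runIntentSearchFlow is a hypothetical helper wrapping steps 1-4 from above:

func testSearchIntentWithMultipleInputs() {
    // Each pair is a search term and the location we expect to see on
    // the detail screen. Values are illustrative.
    let cases: [(input: String, expectedLocation: String)] = [
        ("Hvolsvöllur", "Hvolsvöllur"),
        ("Reykjavík", "Reykjavík"),
    ]

    for testCase in cases {
        // Hypothetical helper that performs steps 1-4 from this post.
        runIntentSearchFlow(userInput: testCase.input,
                            locationName: testCase.expectedLocation)
    }
}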

Make sure these tests are included in your Bitrise Workflows so that you can be confident Siri will be able to use your App Intents when Apple Intelligence is released!

Apart from your UI tests turning green, you can also visually check the UI test run on the Bitrise builder machine using Bitrise's Remote Access feature.

Download the sample app with the UI test config

You can download the extended Apple WWDC sample code, complete with the UI tests detailed above. Additionally, I have provided the bitrise.yml configuration file so you can easily run this as a Bitrise Workflow.
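
If you'd rather assemble the configuration yourself, a minimal bitrise.yml for running the UI tests could look roughly like this; the step versions, project path, scheme, and Simulator destination are assumptions to adjust for your own project:

format_version: "13"
default_step_lib_source: https://github.com/bitrise-io/bitrise-steplib.git

workflows:
  ui-tests:
    steps:
    - git-clone@8: {}
    - xcode-test@5:
        inputs:
        - project_path: AssistantSchemasExample.xcodeproj
        - scheme: AssistantSchemasExample
        - destination: platform=iOS Simulator,name=iPhone 15,OS=latest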
