The daily for
New Zealand’s Startups

Apple Intelligence: what it means for startups and app developers

The debut of Apple Intelligence at the recent Worldwide Developers Conference marks a significant shift in artificial intelligence and app development given the tech giant’s dominance in the app economy.


Peter Griffin

Xero's Rod Drury

“We knew Apple was acquiring a lot of AI technologies,” says Xero cofounder and consummate Apple fan Rod Drury.

“What they have done with their personal AI, where it can mine your messages and emails and then you can ask the question ‘What time is mum’s flight coming in tomorrow?’ that feels really cool. That’s going to touch everybody,” adds Drury, who stepped down as a non-executive director of Xero in August 2023, but who is still actively involved in product development at the ASX-listed cloud accounting software maker founded in Wellington.

During its Worldwide Developers Conference (WWDC) demos, Apple primarily focused on improvements Apple Intelligence will bring to its own apps, such as offering the ability to summarise email notifications in the Mail app, create personalised “genmojis”, and help edit and summarise documents. 

A partnership with OpenAI will also allow Apple users to draw on ChatGPT’s large language models (LLMs) for AI-generated answers when there’s a need to go beyond the device for further information. Apple has created private cloud infrastructure based on its own M-series high-performance computer chips, to securely process workloads that can’t be completed on Apple devices. 

Apple has its own generative models

But the on-device AI is the culmination of Apple’s under-the-radar efforts to respond to the generative AI movement, dominated to date by OpenAI, Microsoft, Google and other LLM makers.

“First and foremost, Apple highlights the existence of its proprietary AI models, marking its entry into the generative model arena,” Counterpoint Research’s Wei Sun points out.

“Unlike Google and Microsoft/OpenAI, Apple doesn’t have a frontier LLM. However, it has developed a suite of small and medium generative models, including a three billion parameter on-device language model, and a larger server-based model accessible via Private Cloud Compute on Apple silicon servers.”

Apple’s big privacy pitch is that even it can’t see the data on your phone or in its private cloud, and Apple Intelligence users’ data won’t be harvested to train OpenAI’s model. 

WWDC is a developers’ conference, and the majority of the audience was there to find out how they can harness Apple Intelligence to offer better and more feature-rich apps in the App Store, which generated an estimated US$70 billion in app-related sales in 2022.

At the conference, various Apple executives outlined three key ways developers can take advantage of Apple Intelligence: 

Enhanced privacy and security for third-party apps

On-device processing will allow more data to be drawn on in third-party apps to serve up content on late-model iPhones, iPads and Macs. For instance, if you have health tracking data in a health app, gathered from wearable devices, analysis of the data could be done on the device, with optional uploads to the cloud for permanent storage.  

The fact that user data does not need to be sent to external servers will help app makers build trust with users, particularly in sectors where data privacy is paramount, such as healthcare and finance.

An AI-powered UX upgrade

Apple Intelligence will enable developers to create more intuitive and responsive apps. By processing data on-device, apps can offer real-time interactions and personalised experiences without the latency associated with cloud-based AI.

Developers can now integrate advanced features such as natural language processing, image and speech recognition, and predictive analytics. These capabilities can enhance the functionality of apps.

Apple Intelligence’s ability to retrieve content from within app menus and to identify what is on the device screen will allow for more complex tasks to be completed within apps. An AI upgrade to Apple’s Siri voice assistant could finally make it useful for retrieving information or automating tasks within apps.

Apple has added “semantic indexing” to Siri to help it understand users’ intentions. It means that Siri can organise a semantic index library based on the photos, calendar, emails, and files on a user’s device. Through the new App Intents API, Siri will be able to perform more complex tasks across multiple apps.
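To give a flavour of what exposing app functionality to Siri looks like, here is a minimal sketch using Apple’s App Intents framework. The intent name and the invoice scenario are hypothetical, invented here for illustration; a real app would wire `perform()` to its own data and navigation logic.

```swift
import AppIntents

// A hypothetical intent an accounting app might expose so Siri can
// act on a request like "open my latest invoice".
struct OpenLatestInvoiceIntent: AppIntent {
    // The phrase Siri and Shortcuts surface to the user.
    static var title: LocalizedStringResource = "Open Latest Invoice"

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // App-specific work would happen here: fetch the latest
        // invoice and navigate to it in the app's UI.
        return .result(dialog: "Opening your latest invoice.")
    }
}
```

Because intents like this are declared in code rather than configured in a separate Siri interface, they become visible to the system's semantic index, which is what lets the upgraded Siri chain actions across multiple apps.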

Cost efficiency

By reducing the reliance on cloud services for AI processing, start-ups can potentially lower their operational costs related to data transfer, storage and processing. This cost-saving can be particularly beneficial for early-stage start-ups with limited budgets. New AI development tools also promise to speed up the app development process in the Apple ecosystem.

A digital dilemma

Drury says Apple Intelligence could mark a shift in how we access and use everyday data generated in email, messaging and photo apps.

“My big dilemma is that I’ve managed to avoid all of those mail clients and just have Gmail on my phone where I’m able to get a picture of my email and not have to download everything. But now we’ve got this opt-in decision of whether we want to go for personal AI,” he says.

With a solid reputation for security and privacy already cemented, many iPhone users will be happy to continue transferring data to iCloud and Apple’s other cloud-based platforms. But users will now have the option of keeping data close to hand.

While Apple Intelligence represents an opportunity for developers to produce more compelling apps, Drury says the shift towards on-device AI processing, which is also happening in the Windows world with the arrival of Microsoft’s Copilot+ PCs, also opens up options for local cloud infrastructure providers.

“I remember the Pacific Fibre days, we were always 100 milliseconds from the States. A lot of financial apps need that superfast speed, but now, with training for AI, the biggest constraint is a need for renewable, green electricity,” says Drury, who was one of the backers of efforts nearly a decade ago to build an additional undersea fibre optic link between the US, Australia and New Zealand.  

With a high proportion of our electricity generated from renewable sources, and large language models requiring intensive data centre usage, New Zealand is a prime location for training LLMs.

“So there’s the direct link between what’s happening in AI and what’s happening here in New Zealand where that 100 milliseconds doesn’t matter,” says Drury.

How deep should developers go?

Still, app developers in the iOS world will need to temper their eagerness to embrace Apple Intelligence. 

Adopting it will require familiarising themselves with the new tools and frameworks associated with Apple Intelligence. This may require additional training and adaptation, especially for those who are accustomed to cloud-based AI solutions.

While on-device AI offers many benefits, it is also constrained by the hardware capabilities of the device. Only the iPhone 15 Pro and 15 Pro Max are currently compatible with Apple Intelligence, cutting off the vast majority of iPhone users from tapping into on-device AI. A wider range of late-model Macs, MacBooks, and iPads will run the services, but it's unclear exactly when they will be available, with Apple pursuing a cautious staged rollout of features stretching into 2025. It will be a few years before a critical mass of Apple users have upgraded to more powerful devices capable of running Apple Intelligence.

Don’t be made irrelevant

“More startups killed off, and more mature ones, this time,” was FranklyAI founder Matt Ensor’s initial take on Apple Intelligence.

“Only once AI is on your phone can you adopt AI into your life, so 'new Siri' will be another milestone for GenAI and humanity,” he added.

Many have pointed out that Apple Intelligence, and even more standard app updates coming to iOS later this year, will kill off existing standalone apps. 

The text editing features coming to iPhones will challenge the likes of Grammarly. It’s a good reminder of the “flashlight app” effect. The early iPhone models didn’t have the ability to use the phone’s camera flash as a torch, and dozens of third-party apps emerged offering this functionality. Then, in a single iOS upgrade, Apple introduced the flashlight feature, killing off the flashlight app economy.

Developers upbeat about the potential

In the meantime, it will be business as usual for many app developers, who are already sending requests to cloud-based LLMs from the smartphone-based apps they create.

Apple’s AI moves have generated voluminous discussion on Hacker News, the chat forum of famed start-up incubator Y Combinator.

“It's not just the raw AI capabilities from the models themselves, which I think many of us already get the feeling are going to be commoditized at some point in the future, but rather the hardware and system-wide integrations that make use of those models that matters starting today,” wrote Bologna, Italy-based software engineer Spencer Scorcelletti.

“Obviously how the experience will be when it's available to the public is a different story, but the vision alone was impressive to me. Basically, Apple again understands the UX.”

Drury agrees that Apple has pointed the way forward for how artificial intelligence can be used.

“After the Apple keynote, things have gotten really interesting in the AI space.”


Peter Griffin

Peter Griffin is a Wellington-based technology and science writer, media trainer, and content specialist working with a wide range of media outlets and tech companies. He co-hosts The Business of Tech podcast for BusinessDesk and is the New Zealand Listener's tech columnist. He has a particular interest in cybersecurity, Web3, biotech, climate tech, and innovation. He founded the Science Media Centre and the Sciblogs platform in 2008.
