MacTalk

May 2024

Examining Apple Intelligence

Whoever at Apple came up with the term “Apple Intelligence” must be pleased with themselves. It’s a clever way of creating a distinction between what Apple is doing with AI and the generative AI chatbots and artbots most people think of as AI.

(I wouldn’t be surprised if there were internal discussions about how “AI” might come to mean “Apple Intelligence,” but that’s no more going to happen than when Apple thought it could capture the word “email” with the “eMail” feature of the eWorld online service. True story: when my late friend Cary Lu was writing a book about eWorld in the mid-1990s, Apple told him that was the hope.)

It’s worth emphasizing that Apple Intelligence will be rolling out slowly “over the course of the next year.” Don’t expect to see everything discussed below in September with the initial releases of iOS 18 and macOS 15 Sequoia, and we could be well into 2025 before some of the more compelling enhancements to Siri arrive. Patience, grasshopper.

Apple Intelligence Versus AI

What differentiates Apple Intelligence from AI? Apple calls Apple Intelligence a “personal intelligence system,” and indeed, most of the features the company unveiled during its WWDC keynote revolve around your personal data. Apple’s insight—which fits neatly with the company’s focus on privacy and empowering the individual—is that many AI systems suffer because all they know about you is what you tell them in your prompts. Generative AI chatbots and artbots are trained on a massive corpus of material, and they return results that are statistically probable. But as individuals, we are anything but statistically probable. We are our data: our email, our messages, our photos, our schedules, our contacts.

The problem with Apple’s homegrown tools—notably Siri—focusing on personal context is that people will ask questions that can’t be answered with local information. When Siri determines it can’t respond well to a query requiring global knowledge, it will offer to pass the question to ChatGPT, which is also available within Apple’s systemwide Writing Tools. ChatGPT integration is free, and ChatGPT Plus subscribers can connect their accounts, though I’m uncertain what benefit that provides at the moment. Apple said it plans to support other AI chatbots in the future, such as Google’s Gemini. Among the chatbots I’ve tested, I’ve had the least success with Gemini and the most with ChatGPT and Claude.

Apple Intelligence Privacy

It’s reasonable to worry about how these features will impact your privacy. Apple repeatedly emphasized the pains it has taken with Apple Intelligence to ensure user privacy. The AI-driven features can take one of three paths:

  • On-device: Much of what Apple Intelligence does will be handled entirely locally, never leaving your device in any way. That’s why the system requirements for Apple Intelligence are so steep—an iPhone 15 Pro or iPhone 15 Pro Max with an A17 Pro chip, or iPads and Macs with an M-series chip. Apple Intelligence’s processing and memory requirements are such that lesser chips aren’t sufficient.
  • Private Cloud Compute: Some Apple Intelligence tasks—the company hasn’t said which—exceed the capabilities of even Apple’s latest chips. For such tasks, Apple has built a server system called Private Cloud Compute that relies on custom Apple silicon and a hardened operating system designed for privacy. Private Cloud Compute receives only the data necessary to complete the task and discards everything after completion. Could Private Cloud Compute eventually be used for Siri requests made from HomePods, Apple TVs, and older devices, or will Apple use the system requirements for Apple Intelligence to encourage upgrades?
  • ChatGPT: Apple can’t make the same privacy promises with ChatGPT as it can for on-device and Private Cloud Compute processing, but it said that our devices’ IP addresses will be obscured and OpenAI won’t store requests. OpenAI does use content from individual accounts (not business offerings) to train its models, although you can opt out of that. It’s unclear if or how you can opt out of OpenAI training on content submitted through the Apple Intelligence integration.

Apple Intelligence Features

Apple Intelligence is an umbrella term for three classes of features surrounding language, images, and actions. Language-related features include system-wide writing tools, categorization and prioritization of email messages and notifications, and transcription of recordings, phone calls, and voice memos, along with Siri’s improved understanding of natural language. Image-focused features include Genmoji, the image-generation tool Image Playground, and advanced editing capabilities in Photos. What Apple calls “actions” mostly seem to involve enhancements to Siri that enable it to perform specific tasks, even across multiple apps.

Language Tools

The most prominent of the Apple Intelligence language capabilities may be Apple’s new systemwide Writing Tools, which will be available in both Apple and third-party apps. They’ll help you proofread, rewrite, and summarize your text along the lines of what Grammarly does today.

I’ve relied on Grammarly for years for proofreading. It catches typos, doubled words, and extra spaces, and its newer AI-powered features sometimes make helpful suggestions for recasting awkward sentences. I’m slightly annoyed that Grammarly’s proofreading tools are so helpful, but it’s challenging to edit your own text to a professional level, and Grammarly can identify errors much faster than I can. Don’t assume that proofreading tools like Apple Intelligence’s, which help with grammar, word choice, and sentence structure, are necessarily a crutch. They may be for some people, but even people who care about their writing can still benefit from good suggestions while ignoring unhelpful ones. (For instance, Grammarly is allergic to the words “own,” “actually,” and “both,” but when I use them, I do so intentionally.)

It’s easier to question Apple Intelligence’s rewriting and composition capabilities (the latter of which rely on ChatGPT), but you’ll notice that most of those doing so are professional writers who don’t need them. Recall my point from “How to Identify Good Uses for Generative AI Chatbots and Artbots” (27 May 2024) that AI is useful primarily when your skills and knowledge wouldn’t already make you better than a C+ student. I do like how Apple provides three tones: friendly, professional, and concise. Less experienced writers often have trouble maintaining a consistent tone, and untold misunderstandings and hurt feelings could be avoided if people took tone advice.

Nonetheless, I’m somewhat dubious about Mail’s Smart Reply feature. Although its Q&A design meets another of my criteria for good uses of generative AI (that you must be willing to work with an assistant), it’s not clear that it would save enough time to justify using, especially if you had to edit what it wrote to sound like something you would have sent.

Apple Intelligence’s summarization tools are spread throughout the system, and some feel like Apple is throwing spaghetti at the wall to see what sticks. Summaries seem to be associated with at least these features:

  • Text you’re writing: This seems most useful when you need a summary for a blog or social media post.
  • Notifications: If you get so many notifications that you need a summary, you may be better served by taming notifications from overly chatty apps.
  • Web pages saved in Safari Reader: Given that Reader is mainly used for pages that are too long to read immediately, summaries (along with tables of contents) could be helpful.
  • Long messages in Mail: Most email messages aren’t long enough to justify a summary, but summarization could be a boon for quickly parsing long email conversations.
  • Busy group threads in Messages: It’s hard to imagine a sufficiently involved text group thread that wouldn’t be easier (and safer) to read in its entirety, but perhaps I’m not the target audience.
  • Message list summaries in Mail: Replacing the first few lines of message text shown in the message list with a summary seems like an unalloyed win.
  • Transcripts of recordings from Notes and Phone: Given the loose nature of recorded text, transcript summaries may be particularly useful for quickly understanding a talk or call.

The final language tools evaluate the content of notifications and email messages to prioritize which to show you first. I can’t quite imagine how that will work for notifications, but prioritizing email messages should prove popular. Also, a Reduce Interruptions Focus will show you only notifications that need immediate attention. That may seem like a nice middle ground between allowing everything and turning on Do Not Disturb, but it will make the Focus feature even more unpredictable (see “Apple’s New Focus Feature May Be Overkill,” 20 January 2022, and “Notifications Unexpectedly Silenced? Blame Focus,” 17 February 2023).

Image Tools

Apple Intelligence’s image tools run the gamut. The Image Playground app (whose features will also be available in some apps) will let you create original images from text prompts, much like other AI artbots. Apple said we’ll be able to choose from three styles: Sketch, Illustration, and Paint. That ensures that no one will be using Image Playground to make photorealistic deepfakes. I’m also confident that Apple will put significant boundaries on what Image Playground can produce—I can’t imagine it generating NSFW images, images of celebrities, or anything with trademarks, for starters.

Genmoji, which are AI-generated custom emoji-style graphics, may be more interesting for those who find emoji amusing but have trouble going beyond a few smileys. Often, when I think about using an emoji to add emotional emphasis to something I’ve written in Messages or Slack, the image I desire doesn’t exist. How else will I get a sunglasses-wearing penguin on a surfboard to express my enthusiasm for a suggested outing? Some worry that Genmoji will lack the shared meaning of the limited set of emoji we have now, but most of those shared meanings exist only among subsets of the population as it is, so it’s hard to get upset about this.

The Image Wand feature of Notes, which turns rough finger or Apple Pencil sketches into more polished drawings, has taken some flak online partly because Apple’s demo shows a perfectly passable sketch being “improved.” The criticism here would seem to fall under the same category as professional writers complaining about writing tools—it’s easy to carp if you have illustration skills. As someone who couldn’t draw his way out of a paper bag (or even draw the bag itself), I’m intrigued to see if Image Wand can make sense of anything I sketch. Nonetheless, I don’t see myself using it purely because I rarely sketch anything. I’d far rather write a thousand words.

The three remaining image-related features of Apple Intelligence are in Photos:

  • Descriptive searches: Photos has allowed us to search for objects—cat, beach, airplane—for some years, thanks to capabilities Apple previously described as “machine learning.” With Apple Intelligence, we’ll be able to search for photos using natural language: “Tonya running in a race” or “sunsets over our pond.” Once we become accustomed to the feature, I believe many of us will use it heavily.
  • Clean up background objects: Generative AI will also give Photos the capability to remove background objects from photos. (The generative part involves filling in the background seamlessly.) Those who spend a lot of time on their photos but don’t already rely on a more capable editor like Photoshop will undoubtedly appreciate the option.
  • Custom memory movies: We’ve already hit the “infinite photo” inflection point where it’s difficult to make sense of our burgeoning photo libraries. When you have tens or hundreds of thousands of images, extracting a set that’s representative of something is daunting. Generating custom memory movies with a text prompt could be compelling. I’d like to see Apple open this feature so the movies could be created and viewed on the Apple TV.

Actions, or Siri Gets a Brain

For many people, giving Siri an AI-driven brain may be the main appeal of Apple Intelligence. Although Siri was impressive for its time, and Apple has regularly expanded its capabilities and knowledge, it seems to have been degrading over the past few years, à la Flowers for Algernon.

Most importantly, the new AI-driven Siri will have a richer language understanding and be able to maintain context within a conversation so each command won’t have to stand on its own. It should also be much more forgiving of the verbal stumbles we all experience at times.

Apple is making a big deal of Siri being aware of your personal context and on-screen content. That should enable it to find your content across Mail or Messages, extract information directly from Notes, and take action using content you can see on the screen. Its capabilities will span multiple apps, enabling you to ask Siri to enhance a photo and then attach it to a Mail message. I’m unsure how successful these features will be. Siri can do a lot right now, but because you have to know precisely what it can do and phrase the command exactly right, almost no one takes full advantage of Siri’s capabilities. It won’t take many failures—“I’m sorry, Dave. I’m afraid I can’t do that.”—before people give up on Siri again.
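For developers, these cross-app actions reportedly build on Apple’s existing App Intents framework, in which an app describes discrete, typed actions the system can invoke. Here is a minimal sketch of the idea; the EnhancePhotoIntent below is hypothetical, not a real Apple intent, though the App Intents protocol and types it uses are real.

    import AppIntents

    // Hypothetical intent for illustration; EnhancePhotoIntent is invented,
    // but the AppIntent protocol and its supporting types are real.
    struct EnhancePhotoIntent: AppIntent {
        static var title: LocalizedStringResource = "Enhance Photo"
        static var description = IntentDescription("Automatically enhances a photo.")

        // Siri could fill this parameter from conversation or from
        // content currently visible on screen.
        @Parameter(title: "Photo")
        var photo: IntentFile

        func perform() async throws -> some IntentResult & ReturnsValue<IntentFile> {
            // A real app would run its enhancement pipeline here; this
            // placeholder passes the file through unchanged.
            return .result(value: photo)
        }
    }

Because each intent declares typed parameters and results, the system can in principle chain them across apps: the photo one intent returns becomes the attachment another intent accepts, which is what “enhance this photo and attach it to a Mail message” requires.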

Because Siri will work locally on your devices, its knowledge base must be limited. In a clever move, Apple will be giving Siri knowledge about its products, so you can ask for help using your iPhone, iPad, and Mac. I’m looking forward to trying this because it can be tricky, even for people like me, to remember what any given setting does and where Apple has hidden it. (Real-world example: Why do some iPhone users see emerald rather than lime green for green-bubble friends in Messages? Because they’ve turned on Settings > Accessibility > Display & Text Size > Increase Contrast.)

As I noted before, when a Siri query needs access to global knowledge, it will offer to send the question to ChatGPT. While that may work well for many queries, we’ll see if Apple implements it so we can maintain fluid conversations. The main problem is that ChatGPT’s knowledge is time-limited. A more satisfying approach might work along the lines of Perplexity, which performs a search and builds a response based on the summary of what it found. I could even imagine Apple moving in that direction more generally as a way of weaning itself from search-engine reliance on Google, though that would also mean giving up the billions in revenue it gets from Google.
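That search-then-summarize pattern is commonly called retrieval-augmented generation. As a rough sketch of the flow only: the search(_:) and generate(_:) functions below are hypothetical stand-ins for a web search API and a language model, not real Apple or OpenAI calls.

    import Foundation

    struct SearchResult { let title: String; let snippet: String }

    // Hypothetical stand-ins for a search API and an LLM completion call.
    func search(_ query: String) async throws -> [SearchResult] { [] }
    func generate(_ prompt: String) async throws -> String { "" }

    // Retrieval-augmented answering: fetch current sources first, then ask
    // the model to answer from them, so the response isn't limited to the
    // model's training cutoff.
    func answer(_ question: String) async throws -> String {
        let sources = try await search(question)
            .prefix(5)  // keep the top few hits
            .map { "\($0.title): \($0.snippet)" }
            .joined(separator: "\n")
        let prompt = """
        Answer the question using only these sources, and cite them.
        Sources:
        \(sources)
        Question: \(question)
        """
        return try await generate(prompt)
    }

The key design point is that freshness comes from the search step, not the model, which is why this approach sidesteps a time-limited knowledge base.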

How Smart Will Apple Intelligence Be?

There’s no question that Apple was pushed into creating Apple Intelligence. Many of its features would have worked their way into the company’s apps and operating systems over time, but the hype—some deserved, some not—surrounding AI from other tech giants forced Apple’s hand. Remember, ChatGPT came out only in late 2022, and no one could have predicted how quickly AI would take the online world by storm. Apple hasn’t had much time.

That may account for why Apple Intelligence feels like a grab bag, especially in bolted-on bits like the ChatGPT integration. Some features, such as Image Playground and Smart Reply, feel as though Apple is checking boxes to compete with existing tools. Others will be compelling, such as descriptive searches in Photos. All many people need from AI is for Siri to become less hard of understanding.

Although Apple may be behind the curve in making these features available, the company seems to have approached the architectural questions seriously. On-device processing is important for both performance and privacy reasons, and Private Cloud Compute could set a new standard for what people demand from server-based AI tools.

As far as I can tell, Apple Intelligence won’t be treading on anyone’s toes. If you don’t want to use it, just ignore it, like all the other features that aren’t relevant to how you prefer to use technology. But I have talked with people who find Apple Intelligence some of the most exciting work Apple has done on the software side in years. Apple’s hardware has hit astonishing levels of performance, but the software hasn’t given most people new capabilities that are possible only because of that processing power.

We live in interesting times, and they may become more interesting in the next six to twelve months.
