Apple Intelligence: A Showcase of Apple’s Inelegance
Just made a video about Apple Intelligence in light of the recent news that the more advanced features, like in-app actions and personal context, have been officially delayed. Thought I'd post a link to the video here, along with the video script in case anyone wants to give it a read! Feel free to let me know your thoughts here as well as over on YouTube!
Video Script:
For decades, Apple has built an empire based on simplicity, elegance, and a user experience so seamless that even the most casual tech users could navigate their devices with ease. From the first Macintosh computer to the revolutionary iPhone, Apple has positioned itself as the gold standard of design and innovation. The company’s marketing thrives on one key idea: it just works.
Before going any further, because this is a video that is critical of Apple, I just want to make clear that I am not an Apple hater or a diehard Android user in the slightest. I don’t think I could be further from an Apple hater, actually. I think I own at least one device in every Apple product category. The primary camera for this video is an iPhone 16 Pro. This camera angle is being shot on the M4 iPad Pro. I wear an Apple Watch every day. I have multiple pairs of AirPods. I have multiple Apple TVs and HomePods around the house. I have a bunch of AirTags for my keys, my car, my backpack, and my wallet. I have the new iPhone 16e, which I bought to review, and that video will be coming out soon. (Definitely subscribe as you won’t want to miss that.) And if you’re watching this, I edited this entire video in Final Cut Pro on a MacBook Pro with AirPods Pro. And hell, I even have a Vision Pro! I’m about as pro as you can get in terms of dedication to the Apple Ecosystem.
But Apple Intelligence and its rollout are frankly indefensible, even to an Apple evangelist like myself. What was promised as an elegant, intelligent AI system has instead turned out to be a half-baked, restrictive, and inelegant hodgepodge of features that don't "just work". Put very simply: Apple Intelligence is a showcase of Apple's inelegance. So in this video, let's look at what Apple promised, what they've delivered so far, and why this might be Apple's most disappointing innovation yet.
Apple Intelligence was announced last June at Apple's Worldwide Developers Conference, and it was presented as a revolutionary step forward. Apple claimed that this wasn't just another AI assistant or chatbot, but rather a fully integrated, privacy-focused intelligence system that would help users be more productive, creative, and informed. Now, I want to give credit where credit is due: Apple is attempting something that hasn't been done before, which is to have most of the user prompts and queries processed on the device. Most other AI assistants like ChatGPT and Gemini process all your requests on external servers and then send the results back to your device. Apple is attempting to do it differently, processing as many requests as possible on the device itself. This means that many of the features can be used without an internet connection. Apple's privacy guarantees are also central to Apple Intelligence. They aren't storing your queries to train their models, which is a huge limitation and likely a big contributing factor to why Apple is so far behind: they don't have all of their users' data to train their models on, unlike Meta, Google, or OpenAI. So I want to commend Apple for setting out to do something ambitious that has never been done before, at least in this way, baked into the OS, without compromising on user privacy in favor of expediency. But while I commend them for that, there are a lot of other issues that make the rollout of Apple Intelligence an abysmal failure.
Apple Intelligence was introduced at WWDC as being able to do 4 general categories of things:
- Language: Adjust language and writing with Writing Tools, and do things like summarize and prioritize your notifications.
- Images: Generate images with Image Playground, make personalized custom emojis with Genmoji, edit photos with Clean Up, and make memory movies in the Photos app.
- Actions: Take in-app actions; you can ask Siri to, for example, "Play the podcast that my wife sent the other day," and it will know which podcast you're referring to and start playing it.
- Personal Context: Let the system know things about you; so, for example, you can ask, "When is Mom's flight landing?" and it'll know from your email, text messages, and even voicemails where that information is, retrieve it, and give it to you.
So that’s just a brief summary of the features that have been announced.
It's been 8 months since Apple Intelligence was announced at WWDC, and which of these features have actually been released? Just the first two categories. The other features, like in-app actions and understanding your personal context, haven't even been launched in beta. And this past Friday, Apple formally delayed the release of these remaining features. Apple's spokesperson, Jacqueline Roy, said, "It's going to take us longer than we thought to deliver on these features, and we anticipate rolling them out in the coming year." Now, Apple chooses the language in their public statements very carefully, so a lot can be inferred from the wording they chose. First off, the use of the word "anticipate" means that this is Apple's best guess. They previously anticipated that these features would roll out this spring. This is not Apple confirming that these features will definitely be out in the next 12 months, so this uncertainty from Apple is definitely cause for concern. And the phrase "the coming year" could mean later this year, but it could also mean as late as this time next year.
But let's put all of that aside for a moment and talk about what is available right now. Let's first talk about the writing tools. Right now we can use a worse version of ChatGPT to make an email either sound like it was written by a tenured academic born in the 40s (that's the Professional option) or sound weirdly fake with the Friendly option, which sometimes even comes across as passive-aggressive. I've never found those options to be of any help. The writing tool that lets you summarize text I find to be practically useless, as the summaries often leave out the most crucial pieces of information and include random irrelevant details. The rewrite tool often reads like a middle school student being told to reword a paragraph to avoid plagiarism: it pulls out a thesaurus and basically finds a synonym for every other word. The only writing tool I find even remotely helpful is the proofread feature, but that's just Grammarly baked into the operating system. You can also compose text with ChatGPT, but that's not Apple Intelligence; that's just Apple outsourcing a task to another AI assistant, which is exactly what they were trying to get away from by differentiating themselves from those other services. So these writing tools are nowhere near the type of transformative and revolutionary intelligence system that Apple claimed they were introducing.
Now let's talk about Image Playground, Genmoji, and Image Wand. Image Playground is a joke. The fact that Apple marketed this as something you'd use to send your friend an unnerving AI image of them with a party hat on for their birthday is ludicrous. I've generated some images of my friends and family, and while there's sometimes a very vague resemblance, I've never gotten a positive reaction when I show it to them; it's usually a lot of this (videos of people reacting): "Oh, that's weird, I don't like that." Definitely not something I'm gonna send my friends on their birthday (well, I might actually do that, but that's cuz I'm a bit of an asshole, so if you're someone who actually wants to do nice things for your friends on their birthday, I'd advise not sending them an Image Playground image). Image Playground was something I used for about 30 minutes when it launched in beta back in October, and I can count on one hand the number of times I've opened the app since (well, maybe two if you include the making of this video). If you look up the word gimmick in the dictionary, you'll find Image Playground, because it is the textbook definition of a gimmick. It's unfortunate that Apple wasted any of their time and money on this "feature".
Genmoji allows you to create your own custom emojis using on-device generative AI. I can see this potentially being a hit with teenagers. I don't use it much, but then again I'm not someone who often uses emojis in general. One thing that's important to note is that in order to use Apple Intelligence, you need an iPhone 15 Pro or newer, or a Mac or iPad with an M1 chip or later. I'm going to make a guess based on no factual information and just what I feel is correct, which seems increasingly common these days, and suggest that most teenagers and young people, who I imagine would be the primary users of this feature, probably don't have an iPhone 15 Pro or newer! I could be wrong, but when I was a teenager I always got the hand-me-down iPhone, which was usually more than a few years old, but perhaps times have changed. Regardless, you need a 15 Pro or newer in order to make a cowboy frog emoji? Seriously? And Image Wand, which I almost forgot about when planning out this video, allows you to transform your drawings in Apple Notes into AI-generated images. I literally used this feature once when it came out, and just did it once again for this video. But if you've found Image Wand to be useful, please let me know in the comments; perhaps all three of you can meet up or something.
There's also notification summaries. These have been an abysmal failure. Even when these summaries are not spreading literal misinformation, they're just super inaccurate and almost never capture the essence of what is being said. So much of my texting with my friends is playful and sarcastic, and the summaries do not pick that up whatsoever. I can't name a single instance where the summaries were genuinely helpful, and since they're so often wildly inaccurate, I used to have to read the messages anyway because I couldn't trust the summaries. And I say "used to" because I turned them off! They never saved me time, since I'd have to skip past the often inaccurate summary just to view the messages.
And that's not even the worst part. Not only are the summaries not helpful, they're actually harmful in some cases. Apple was recently in hot water over their notification summaries of news alerts, which weren't just inaccurate but were often the exact opposite of the truth. The summaries were literally spreading misinformation, and Apple responded to a complaint from the BBC by turning off notification summaries entirely for news apps and putting all summaries in italics, which is apparently supposed to let you know that it's an AI summary. And even when it's not breaking news headlines, it gets other messages horribly wrong, like the user whose friend sent them a message saying they were very drunk, went home, and crashed straight into bed, which Apple Intelligence summarized as "Drunk and crashed." Imagine that notification popping up on your phone from someone you care about. If you're a parent, imagine getting that message from your kid. Fortunately, my mom doesn't have an iPhone capable of making a cowboy frog emoji, but if she did, I would absolutely make sure notification summaries were off, because she would nearly have a heart attack if she got a message like that. Then she would hopefully expand the message, but knowing my mother she might just immediately call 911, so definitely no notification summaries for her.
And here's a slightly more mild case, but still a serious issue nonetheless. Joanna Stern, a great tech journalist for the Wall Street Journal (I highly recommend checking out her stuff), is married to a woman, and notification summaries of messages from her wife referred to her wife as her husband. Now, not only is this a beautiful lesson in how AI will always reflect the biases of its programmers, it's also a demonstration of how unintelligent Apple Intelligence is. THE CONTACT NAME IS LITERALLY WIFE.
This is a recurring theme, but it bears repeating since I can't get over it: the notification summaries, and really all of these features, are so bad and so unfinished that I'm genuinely shocked Apple let them off the testing floor.
They also added a couple of new features to the Photos app (and I'm not talking about the redesign in iOS 18 that seems to be nearly universally hated, since it made everything one page, which apparently caused some people to not be able to find their Favorites album for weeks). I'm talking about Clean Up and memory movies. Clean Up allows you to remove unwanted items in photos, so you can finally get rid of that photo bomber, or maybe your ex, and it works great! As long as there's a solid background and separation between the people. But once you try to remove someone or something that's anywhere close to another person or thing, or if the background isn't solid, things get wonky pretty fast. Funnily enough, the most use I've gotten out of the Clean Up feature is actually when I need to blur something out. I use the Clean Up tool and it scrambles my name or the serial number of a device or something like that, so I can post images online while maintaining privacy. But other than that, it's been pretty useless. Memory movies are an interesting addition, but I find them underwhelming. Usually the collage it creates is only about 30 seconds long, and the photos it chooses are, once again, unintelligent. For example, it'll use two or three of almost the exact same image, instead of just using one and intelligently knowing not to use the others. So once again, another half-baked feature that feels incomplete and doesn't compel me to use it in any way.
We also have ChatGPT integration in Siri, which is nice because Siri is painful. But once again, this isn't Apple Intelligence; this is Apple outsourcing to another AI system. And sometimes Siri will just not understand my request and also not forward the query to ChatGPT. (Hey Siri, what processor is in the latest MacBook Pro?) Because of this, I've found it easier to just use the ChatGPT app! I set the Action button on my iPhone to open ChatGPT's advanced voice mode and quickly ask my question with a much higher rate of success. (Hey Siri, what processor is in the latest MacBook Pro?)
I think it's safe to say that the features I've mentioned so far, which are all the features currently available, are a far cry from "incredible capabilities." Like I said earlier, it seems like a hodgepodge of features thrown together with no clear direction or use case. It's nothing like the promised complete intelligence system that works together seamlessly to make your life easier. It's nothing like what Apple's marketing made it out to be.
So you might be wondering, "Oliver, then what features would actually be revolutionary, or at least actually useful?" Well, I'm so glad you asked, and coincidentally they are the features we haven't discussed yet, since they aren't out yet, even in beta. I'm talking about in-app actions and personal context. These features, if they ever end up working like Apple has been advertising, which is a BIG IF, I know, would actually make people's lives easier and give them access to more information at their fingertips. Being able to ask Siri what time your mom's flight gets in, and Apple Intelligence being able to quickly scan your inbox to find the flight itinerary or the text telling you when she arrives. Or being able to ask Siri to play the podcast that your wife sent the other day, and it can pull it from a text, an email, or even a transcription of a voicemail. If features like that were available and worked as advertised, I would be very impressed and glad to use them. But the reality is that they're not here, despite Apple heavily using them in advertising since WWDC.
Not only are the features currently available incredibly lackluster, but the marketing and ad campaigns showing off Apple Intelligence are unquestionably deceptive. Here's an ad that ran in September about the iPhone 16 Pro, showing Siri being able to access your personal context, a feature that now, 6 months later, is still not available and has been further delayed! Apple coincidentally just recently removed this ad from their YouTube channel. Funny how that works.
Apple marketed the iPhone 16 line as the Apple Intelligence phone, constantly saying it was "built from the ground up for Apple Intelligence." That obviously is not true, since zero Apple Intelligence features were available at launch. None! I don't think the phones were built for Apple Intelligence; I think it was more that Apple Intelligence hadn't really been built at all. Yet every billboard and sign said "iPhone 16, Hello Apple Intelligence," and there was an influx of ads marketing the phone with features like personal context that STILL AREN'T HERE! Six months later, halfway through the phone's cycle, and we still don't have the main features that were used to try and sell you the device. Apple's 5th Avenue store in New York City was lit up with the Apple Intelligence colors for the launch of the iPhone 16, and all the display iPhones had the glow around the edges of the screen, DESPITE THE PHONES NOT HAVING THE FEATURE when you bought them! I'm going to show you a video I took on October 1st of last year. The date is important because that was before any Apple Intelligence features had been released to the public. I was at the Apple Store and noticed that all the display phones had the signature "Hello, Apple Intelligence" and glowing borders like the new Siri animation. I thought, that's strange, Apple Intelligence hasn't even been released yet. Are these phones masquerading as AI-equipped phones while still running the current public version of the software, which did not have Apple Intelligence? So I invoked Siri on one of the display phones, and I got regular Siri! It's almost funny how blatant the deception is. And the deception worked. There are numerous examples online of people getting their brand new iPhone 16 and being confused as to why they don't have the one feature that was most heavily marketed in nearly every advertisement! Apple was advertising a phone based on features that were supposed to come in the future, which have now been delayed for the foreseeable future. What's next? Apple releases a self-driving car with the self-driving functionality coming later in a software update? And right now, with the new M4 MacBook Air that was just announced, they are still using Siri knowing your personal context to market the device, a feature that is not available now and was just confirmed to not be coming anytime soon. But thank goodness they were kind enough to include a footnote, which at the bottom in small text says, "Some features will become available in software updates in the coming months." I'm sure Apple definitely wants you to read that; that's why it's in super small text at the bottom of the website! Once again, how many months? 12? Because the recent statement says the features will be rolling out in the coming year. Apple is selling an unfinished product, plain and simple. And due to the clearly deceptive ad campaign, advertising features not on the device and leading consumers to reasonably conclude from the marketing that these features would be available on the device they bought, Apple is opening themselves up to the huge class action lawsuits that are almost certainly coming.
So where does Apple go from here? Well, since they're definitely watching this video right now, they first need to apologize, like they did in response to the abysmal rollout of Apple Maps. But what they actually need to do before that is stop enabling Apple Intelligence by default on new iPhones. That isn't how it was previously: you had to actively turn on Apple Intelligence, but as of iOS 18.3, when users update or get a new iPhone, Apple Intelligence is enabled by default. I have to imagine this is because users weren't turning it on because they didn't know about it at all. Over Thanksgiving last year I realized my cousin had an iPhone 16, and I asked her if she had been using Apple Intelligence at all. She said, "What's that?" Apple Intelligence should unquestionably be something users have to opt into, not out of. And those notification summaries have got to go. Like I said, they fortunately disabled summaries for news apps, but the feature should be removed entirely, so that people aren't momentarily scared that their loved ones are in danger.
When I began to work on this video, I was still somewhat pro Apple Intelligence, you could say; I was glad they released it. Sure, I obviously still wasn't too happy about the slow rollout and deceptive marketing, but as a whole I was excited and eager for what was to come. But after researching for this video, watching all the Apple ads again, reading the news articles about how the summaries were spreading misinformation and in some cases scaring people, and seeing all the Reddit and Twitter posts from people who got their new iPhone 16 and were disappointed when there was no Apple Intelligence like they had been led to believe, I'm now pretty firmly convinced that Apple should never have released Apple Intelligence. It's becoming increasingly clear that Apple capitulated to outside pressure, likely from investors and shareholders who were nagging them about falling behind in the AI race. And now, not only are they still behind, but this abysmal rollout is already biting them in the behind.
One phrase that is often used to describe Apple is, "They're rarely first to something, but when they do it, they do it right." Well, in this case, they were neither first nor right. I'm not gonna lie, Apple is gonna have to gain back my trust, and I suspect they have to gain back a lot of yours too. But they definitely can! I say all of this as an avid Apple user and mega fan. I use their products every day, and I'm not gonna stop using them, because they're great for so many other things. But sometimes when you care about someone, you need to show them some tough love. Apple, I care about you. You definitely stumbled big time here, but you've shown you have the ability to get back up even stronger than before. So, because I don't want this video to end on a negative note, let's end with a toast, and I encourage you to join me if you'd like. I'm gonna use chocolate milk because it's my favorite, if you haven't been able to tell from my previous videos, but feel free to use an adult beverage if you so choose and are of legal age! Don't need any trouble there. But let us toast: to innovation done right, to lessons learned the hard way, and to comebacks stronger than the stumbles. Apple, you've set the bar high; now it's time to rise again. Here's to the next chapter, and may it be one worth celebrating!
Thanks for watching (or reading in this case)