Of COURSE Apple decided to commandeer the AI abbreviation. Of course. That’s what Apple does; it’s who they are.
Famously, Apple always “waits for new technologies to mature a bit” before “doing it better than everyone else.” Except, they don’t. They are usually late to the party, true. But what they actually come out of the gate with is what they are “going” to have, eventually. Maybe. Or not. And it isn’t necessarily better than Google’s 3rd or 5th or 7th iteration of it by the time Apple announces their version 1 beta.
Anyone out there (outside of people who make a living buying Apple things and discussing them via writing, podcast, and/or YouTube) actually have the misfortune of plunking down a few grand on Apple’s VR/AR goggles? No? Neither did I. And those who did wish they didn’t. Yes, they are cool. Amazing, even (from what I’ve read and watched and heard). But as was and still is abundantly clear to anyone not caught up in the ghost of Steve’s Reality Distortion Field, there is simply no use case for them at this time. Maybe not for any time, for that matter: by the time any developers decide that a market of thousands (not millions) of users is worth addressing, something else will have supplanted it.
Microsoft learned this lesson the hard way across more than a decade with multiple iterations of its HoloLens goggles (which were equally jaw-dropping in their day). And I hate to admit it, but Facebook/Meta was actually smarter about it, because their founding DNA, unlike Microsoft’s and Apple’s, is not about R&D and doing things that have never been done in order to create a new world; the force that drives their existence is eyeballs. How did they go about their goggles play? By making them as cheap as possible while still being “good enough” to get people to use them. And thanks to that decision, they actually have an ecosystem. Will you have your very reality altered after trying them on for the first time? No. But are you more likely to keep using them, rather than tossing them aside after the initial wow factor wears off, until devs actually bother developing for them? Yes you are!
Which brings us to Apple Intelligence. We all know better than to associate the word “Apple” with the term “AI,” despite Apple’s pathetic and cynical attempt (no doubt to be backed by tens, if not hundreds, of millions of dollars’ worth of advertising) to co-opt the ubiquitous acronym that has been in use for decades. Not to mention the fact that none of it actually, you know, EXISTS yet for non-beta testers, and won’t for several months. Even then, only a fraction of what was demo’d a few days ago will roll out; the rest will arrive incrementally over the next year, if at all. Not everything makes it to public release, after all; not just for Apple, but for any company.
Apple is dominant, and that won’t change in 2024 or ’25, perhaps not even in 2026. If they had waited yet another year to flail and jump onto the AI hype train, they might have been in trouble earlier than that, though I kind of doubt it. But for the first time since the unveiling of the iPhone back in 2007, they were (or at least ought to have been) scared of becoming irrelevant. Though I remain thoroughly unimpressed by the reality of what was shown in demo form at WWDC 2024, it was certainly enough to keep would-be competitors at bay, at least for a few more years. And by the time it becomes clear to all that Apple’s way cannot compete with Google’s and Microsoft’s, the phone itself may not matter as much, which will make the AI panic in Cupertino over the past 9 months seem like a walk in the (Apple) Park.