WWDC 2024: Apple Intelligence
their models are almost entirely based on personal context, by way of an on-device semantic index. In broad strokes, this on-device semantic index can be thought of as a next-generation Spotlight. Apple is focusing on what it can do that no one else can on Apple devices, and not really even trying to compete against ChatGPT et al. for world-knowledge context. They’re focusing on unique differentiation, and eschewing commoditization.
Apple is doing what no one else can do: integrating generative AI into the frameworks in iOS and MacOS used by developers to create native apps. Apps built on the system APIs and frameworks will gain generative AI features for free, both in the sense that the features come automatically when the app is running on a device that meets the minimum specs to qualify for Apple Intelligence, and in the sense that Apple isn’t charging developers or users to utilize these features.
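A minimal sketch of what that means in practice, assuming a stock SwiftUI app: the view below uses only the standard TextEditor control and contains no Apple-Intelligence-specific code, which is exactly the point. On devices that meet the minimum specs, system features such as Writing Tools can surface through the same standard control without the developer doing anything.

```swift
import SwiftUI

// A plain SwiftUI view built only on the standard system framework.
// There is deliberately no Apple Intelligence API call here: on devices
// that qualify, system-level features (e.g. Writing Tools) can appear
// through this same stock control, at no extra cost to the developer.
struct NoteEditor: View {
    @State private var text = ""

    var body: some View {
        TextEditor(text: $text)
            .padding()
            .navigationTitle("Note")
    }
}

#Preview {
    NavigationStack {
        NoteEditor()
    }
}
```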
·daringfireball.net·
Apple intelligence and AI maximalism — Benedict Evans
The chatbot might replace all software with a prompt - ‘software is dead’. I’m skeptical about this, as I’ve written here, but Apple is proposing the opposite: that generative AI is a technology, not a product.
Apple is, I think, signalling a view that generative AI, and ChatGPT itself, is a commodity technology that is most useful when it is:

  • Embedded in a system that gives it broader context about the user (which might be search, social, a device OS, or a vertical application), and
  • Unbundled into individual features (ditto), which are inherently easier to run as small power-efficient models on small power-efficient devices on the edge (paid for by users, not your capex budget) - which is just as well, because…

This stuff will never work for the mass-market if we have marginal cost every time the user presses ‘OK’ and we need a fleet of new nuclear power-stations to run it all.
Apple has built its own foundation models, which (on the benchmarks it published) are comparable to anything else on the market, but there’s nowhere that you can plug a raw prompt directly into the model and get a raw output back - there are always sets of buttons and options shaping what you ask, and that’s presented to the user in different ways for different features. In most of these features, there’s no visible bot at all. You don’t ask a question and get a response: instead, your emails are prioritised, or you press ‘summarise’ and a summary appears. You can type a request into Siri (and Siri itself is only one of the many features using Apple’s models), but even then you don’t get raw model output back: you get GUI. The LLM is abstracted away as an API call.
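To make the "abstracted away as an API call" point concrete, here is a hypothetical sketch; the types and names below are invented for illustration and are not Apple's actual API. A feature exposes a narrow, structured call, and whether an LLM answers it on device or in a private cloud is an implementation detail the user never sees.

```swift
import Foundation

// Hypothetical sketch only: these names are invented for illustration,
// not Apple's real API. The idea is that a "Summarise" button calls a
// narrow, structured function rather than exposing a chat prompt.
struct DocumentSummary {
    let keyPoints: [String]
}

protocol SummaryProviding {
    // The model behind this call (on-device or private cloud) is hidden.
    func summarize(_ text: String) async throws -> DocumentSummary
}

// UI code never shows a bot; it just renders the structured result.
func handleSummarizeTapped(provider: SummaryProviding, document: String) async {
    if let summary = try? await provider.summarize(document) {
        for point in summary.keyPoints {
            print("• \(point)")
        }
    }
}
```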
Apple is treating this as a technology to enable new classes of features and capabilities, where there is design and product management shaping what the technology does and what the user sees, not as an oracle that you ask for things.
Apple is drawing a split between a ‘context model’ and a ‘world model’. Apple’s models have access to all the context that your phone has about you, powering those features, and this is all private, both on device and in Apple’s ‘Private Cloud’. But if you ask for ideas for what to make with a photo of your grocery shopping, then this is no longer about your context, and Apple will offer to send that to a third-party world model - today, ChatGPT.
that’s clearly separated into a different experience where you should have different expectations, and it’s also, of course, OpenAI’s brand risk, not Apple’s. Meanwhile, that world model gets none of your context, only your one-off prompt.
Neither OpenAI nor any of the other cloud models from new companies (Anthropic, Mistral etc) have your emails, messages, locations, photos, files and so on.
Apple is letting OpenAI take the brand risk of creating pizza glue recipes, and making error rates and abuse someone else’s problem, while Apple watches from a safe distance.
The next step, probably, is to take bids from Bing and Google for the default slot, but meanwhile, more and more use-cases will be quietly shifted from the third party to Apple’s own models. It’s Apple’s own software that decides where the queries go, after all, and which ones need the third party at all.
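One way to picture that routing decision is the hypothetical sketch below; none of these names are Apple's, and the logic is only an illustration of the claim that Apple's own software classifies each request and decides whether it stays on device, goes to Apple's Private Cloud, or is handed, with consent, to a third-party world model.

```swift
// Hypothetical illustration of the routing idea described above.
// These names and the decision logic are invented for this sketch.
enum ModelDestination {
    case onDevice              // small, power-efficient context model
    case privateCloud          // larger Apple-run model, still private
    case thirdPartyWorldModel  // e.g. ChatGPT, only with explicit consent
}

struct Query {
    let needsWorldKnowledge: Bool
    let fitsOnDevice: Bool
    let userApprovedHandoff: Bool
}

func route(_ query: Query) -> ModelDestination? {
    if query.needsWorldKnowledge {
        // Only the one-off prompt would leave the platform, and only if
        // the user agrees; the personal context does not go with it.
        return query.userApprovedHandoff ? .thirdPartyWorldModel : nil
    }
    return query.fitsOnDevice ? .onDevice : .privateCloud
}
```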
A lot of the compute to run Apple Intelligence is in end-user devices paid for by the users, not Apple’s capex budget, and Apple Intelligence is free.
Commoditisation is often also integration. There was a time when ‘spell check’ was a separate product that you had to buy, for hundreds of dollars, and there were dozens of competing products on the market, but over time it was integrated first into the word processor and then the OS. The same thing happened with the last wave of machine learning - style transfer or image recognition were products for five minutes and then became features. Today ‘summarise this document’ is AI, and you need a cloud LLM that costs $20/month, but tomorrow the OS will do that for free. ‘AI is whatever doesn’t work yet.’
Apple is big enough to take its own path, just as it did moving the Mac to its own silicon: it controls the software and APIs on top of the silicon that are the basis of those developer network effects, and it has a world class chip team and privileged access to TSMC.
Apple is doing something slightly different - it’s proposing a single context model for everything you do on your phone, and powering features from that, rather than adding disconnected LLM-powered features at disconnected points across the company.
·ben-evans.com·
What Is Going On With Next-Generation Apple CarPlay?
I’d posit that a reason why people love CarPlay so much is because the media, communication, and navigation experiences have traditionally been pretty poor. CarPlay supplants those, and it does so with aplomb because people use those same media, communication, and navigation features that are personalized to them with their phones when they’re not in their cars.
No one is walking around with a speedometer and a tachometer on their iPhone that need to have a familiar look and feel, rendered exclusively in San Francisco. As long as automakers supply the existing level of CarPlay support, which isn’t a given, customers like us would be content with the status quo, or even a slight improvement.
In my humble opinion, Next-Gen CarPlay is dead on arrival. Too late, too complicated, and it doesn’t solve the needs of automakers or customers. Instead of letting the vehicle’s interface peek through, Apple should consider letting CarPlay peek through for the non-critical systems people prefer to use with CarPlay.
Design a CarPlay that can output multiple display streams (which Apple already over-designed) and display that in the cluster. Integrate with the existing controls for managing the interfaces in the vehicle. When the phone isn’t there, the vehicle will still be the same vehicle. When the phone is there, it’s got Apple Maps right in the cluster how you like it without changing the gauges, or the climate controls, or where the seat massage button is.
The everyday irritations people have are mundane, practical, and are not related to how Apple-like their car displays can look.
·joe-steel.com·
Apple Intelligence is Right On Time

Summary

  • Apple remains primarily a hardware company, and an AI-mediated future will still require devices, playing to Apple's strengths in design and integration.
  • AI is a complement to Apple's business, not disruptive, as it makes high-performance hardware more relevant and could drive meaningful iPhone upgrade cycles.
  • The smartphone is the ideal device for most computing tasks and the platform on which the future happens, solidifying the relevance of Apple's App Store ecosystem.
  • Apple's partnership with OpenAI for chatbot functionality allows it to offer best-in-class capabilities without massive investments, while reducing the threat of OpenAI building a competing device.
  • Building out the infrastructure for API-level AI features is a challenge for Apple, but one that is solvable given its control over the interface and integration of on-device and cloud processing.
  • The only significant threat to Apple is Google, which could potentially develop differentiated AI capabilities for Android that drive switching from iPhone users, though this is uncertain.
  • Microsoft's missteps with its Recall feature demonstrate the risks of pushing AI features too aggressively, validating Apple's more cautious approach.
  • Apple's user-centric orientation and brand promise of privacy and security align well with the need to deliver AI features in an integrated, trustworthy manner.
·stratechery.com·