Looking for AI use-cases — Benedict Evans
  • LLMs have impressive capabilities, but many people struggle to find immediate use-cases that match their own needs and workflows.
  • Realizing the potential of LLMs requires not just technical advancements, but also identifying specific problems that can be automated and building dedicated applications around them.
  • The adoption of new technologies often follows a pattern of initially trying to fit them into existing workflows, before eventually changing workflows to better leverage the new tools.
if you had shown VisiCalc to a lawyer or a graphic designer, their response might well have been ‘that’s amazing, and maybe my book-keeper should see this, but I don’t do that’. Lawyers needed a word processor, and graphic designers needed (say) PostScript, PageMaker and Photoshop, and that took longer.
I’ve been thinking about this problem a lot in the last 18 months, as I’ve experimented with ChatGPT, Gemini, Claude and all the other chatbots that have sprouted up: ‘this is amazing, but I don’t have that use-case’.
A spreadsheet can’t do word processing or graphic design, and a PC can do all of those but someone needs to write those applications for you first, one use-case at a time.
no matter how good the tech is, you have to think of the use-case. You have to see it. You have to notice something you spend a lot of time doing and realise that it could be automated with a tool like this.
Some of this is about imagination, and familiarity. It reminds me a little of the early days of Google, when we were so used to hand-crafting our solutions to problems that it took time to realise that you could ‘just Google that’.
This is also, perhaps, matching a classic pattern for the adoption of new technology: you start by making it fit the things you already do, where it’s easy and obvious to see that this is a use-case, if you have one, and then later, over time, you change the way you work to fit the new tool.
The concept of product-market fit is that normally you have to iterate your idea of the product and your idea of the use-case and customer towards each other - and then you need sales.
Meanwhile, spreadsheets were both a use-case for a PC and a general-purpose substrate in their own right, just as email or SQL might be, and yet all of those have been unbundled. The typical big company today uses hundreds of different SaaS apps, all of them, so to speak, unbundling something out of Excel, Oracle or Outlook. All of them, at their core, are an idea for a problem and an idea for a workflow to solve that problem, that is easier to grasp and deploy than saying ‘you could do that in Excel!’ Rather, you instantiate the problem and the solution in software - ‘wrap it’, indeed - and sell that to a CIO. You sell them a problem.
there’s a ‘Cambrian Explosion’ of startups using OpenAI or Anthropic APIs to build single-purpose dedicated apps that aim at one problem and wrap it in hand-built UI, tooling and enterprise sales, much as a previous generation did with SQL.
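To make the ‘wrapping’ idea concrete, here is a minimal sketch of the kind of single-purpose app the quote describes: one narrow problem baked into a fixed prompt around a general-purpose LLM API. This is purely illustrative, not anything from the article — it assumes the official OpenAI Python SDK (v1.x), and the model name, prompt and `review_clause` helper are hypothetical placeholders.

```python
# A hypothetical single-purpose "LLM wrapper": the product is the problem
# framing (the prompt, the workflow, the UI around it), not the model itself.
from openai import OpenAI  # assumes the official OpenAI Python SDK, v1.x

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You review a single contract clause and flag risky language "
    "in plain English, as a short bulleted list."
)

def review_clause(clause: str) -> str:
    # One narrow task, hard-coded into the prompt: the 'app' is this framing.
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": clause},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(review_clause("The supplier may modify pricing at any time without notice."))
```

Everything a real startup would add on top of this — the UI, the tooling, the enterprise sales motion — is the ‘wrapper’ Evans is talking about; the API call itself is the easy part.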
Back in 1982, my father had one (1) electric drill, but since then tool companies have turned that into a whole constellation of battery-powered electric hole-makers. Once upon a time every startup had SQL inside, but that wasn’t the product, and now every startup will have LLMs inside.
people are still creating companies based on realising that X or Y is a problem, realising that it can be turned into pattern recognition, and then going out and selling that problem.
A GUI tells the user what they can do, but it also tells the computer everything we already know about the problem, and with a general-purpose, open-ended prompt, the user has to think of all of that themselves, every single time, or hope it’s already in the training data. So, can the GUI itself be generative? Or do we need another whole generation of Dan Bricklins to see the problem, and then turn it into apps, thousands of them, one at a time, each of them with some LLM somewhere under the hood?
The change would be that these new use-cases would be things that are still automated one-at-a-time, but that could not have been automated before, or that would have needed far more software (and capital) to automate. That would make LLMs the new SQL, not the new HAL9000.
·ben-evans.com·
Vision Pro is an over-engineered “devkit” // Hardware bleeds genius & audacity but software story is disheartening // What we got wrong at Oculus that Apple got right // Why Meta could finally have its Android moment
Some of the topics I touch on:
  • Why I believe Vision Pro may be an over-engineered “devkit”
  • The genius & audacity behind some of Apple’s hardware decisions
  • Gaze & pinch is an incredible UI superpower and major industry ah-ha moment
  • Why the Vision Pro software/content story is so dull and unimaginative
  • Why most people won’t use Vision Pro for watching TV/movies
  • Apple’s bet in immersive video is a total game-changer for live sports
  • Why I returned my Vision Pro… and my Top 10 wishlist to reconsider
  • Apple’s VR debut is the best thing that ever happened to Oculus/Meta
  • My unsolicited product advice to Meta for Quest Pro 2 and beyond
Apple really played it safe in the design of this first VR product by over-engineering it. For starters, Vision Pro ships with more sensors than are likely necessary to deliver Apple’s intended experience. This is typical in a first-generation product that’s been under development for so many years. It makes Vision Pro start to feel like a devkit.
A sensor party: 6 tracking cameras, 2 passthrough cameras, 2 depth sensors (plus 4 eye-tracking cameras not shown)
it’s easy to understand two particularly important decisions Apple made for the Vision Pro launch:
  • Designing an incredible in-store Vision Pro demo experience, with the primary goal of getting as many people as possible to experience the magic of VR through Apple’s lenses — most of whom have no intention to even consider a $4,000 purchase. The demo is only secondarily focused on actually selling Vision Pro headsets.
  • Launching an iconic woven strap that photographs beautifully even though this strap simply isn’t comfortable enough for the vast majority of head shapes. It’s easy to conclude that this decision paid off because nearly every bit of media coverage (including and especially third-party reviews on YouTube) uses the woven strap despite the fact that it’s less comfortable than the dual loop strap that’s “hidden in the box”.
Apple’s relentless and uncompromising hardware insanity is largely what made it possible for such a high-res display to exist in a VR headset, and it’s clear that this product couldn’t possibly have launched much sooner than 2024 for one simple limiting factor — the maturity of micro-OLED displays plus the existence of power-efficient chipsets that can deliver the heavy compute required to drive this kind of display (i.e. the M2).
·hugo.blog·
Optimizing For Feelings
Humor us for a moment and picture your favorite neighborhood restaurant. Ours is a corner spot in Fort Greene, Brooklyn. It has overflowing natural light, handmade textile seat cushions, a caramel wood grain throughout, and colorful ornaments dangling from the ceilings. Can you picture yours? Do you feel the warmth and spirit of the place?

A Silicon Valley optimizer might say, “Well, they don’t brew their coffee at exactly 200 degrees. And the seats look a little ratty. And the ceiling ornaments don’t serve any function.”

But we think that’s exactly the point. That these little, hand-crafted touches give our environment its humanity and spirit. In their absence, we’re left with something universal but utterly sterile — a space that may “perfectly” serve our functional needs, but leave our emotional needs in the lurch.
Operating systems were bubbly and evanescent, like nature. Apps were customizable, in every shape and size. And interfaces drew on real-life metaphors to help you understand them, integrating them effortlessly into your life.

But as our everyday software tools and media became global for the first time, the hand of the artist gave way to the whims of the algorithm. And our software became one-size-fits-all in a world full of so many different people. All our opinions, beliefs, and ideas got averaged out — producing the least common denominator: endless sequels that everyone enjoys but no one truly loves.

When our software optimizes for numbers alone — no matter the number — it appears doomed to lack a certain spirit, and a certain humanity.
In the end, we decided that we didn’t want to optimize for numbers at all. We wanted to optimize for feelings.

While this may seem idealistic at best or naive at worst, the truth is that we already know how to do this. The most profound craftsmanship in our world across art, design, and media has long revolved around feelings.
When Olmsted crafted Central Park, what do you think he was optimizing for? Which metric led to Barry Jenkins’ Moonlight? What data brought the iPhone into this world? The answer is not numerical. It’s all about the feelings, opinions, experiences, and ideas of the maker themself. The great Georgia O’Keeffe put it this way: "I have things in my head that are not like what anyone has taught me... so I decided to start anew."
Starting with feelings and then using data/metrics to bolster that feeling
James Turrell took inspiration from astronomy and perceptual psychology. Coco Chanel was most influenced by nuns and religious symbols. David Adjaye drew from Yoruban sculpture, and Steve Jobs from Zen Buddhism and calligraphy.
And yet, in so much modern software today, you’re placed in a drab gray cubicle — anonymized and aggregated until you’re just a daily active user. For minimalism. For simplicity. For scale! But if our hope is to create software with feeling, it means inviting people in to craft it for themselves — to mold it to the contours of their unique lives and taste.
You see — if software is to have soul, it must feel more like the world around it. Which is the biggest clue of all that feeling is what’s missing from today’s software. Because the value of the tools, objects, and artworks that we as humans have surrounded ourselves with for thousands of years goes so far beyond their functionality. In many ways, their primary value might often come from how they make us feel by triggering a memory, helping us carry on a tradition, stimulating our senses, or just creating a moment of peace.

This is not to say that metrics should not play a role in what we do. The age of metrics has undeniably led us to some pretty remarkable things! And numbers are a useful measuring stick to keep ourselves honest.

But if the religion of technology preaches anything, it celebrates progress and evolution. And so we ask, what comes next? What do we optimize for beyond numbers? How do we bring more of the world around us back into the software in front of us?
·browsercompany.substack.com·