Vision Pro is an over-engineered “devkit” // Hardware bleeds genius & audacity but software story is disheartening // What we got wrong at Oculus that Apple got right // Why Meta could finally have its Android moment
Some of the topics I touch on:
- Why I believe Vision Pro may be an over-engineered “devkit”
- The genius & audacity behind some of Apple’s hardware decisions
- Gaze & pinch is an incredible UI superpower and major industry ah-ha moment
- Why the Vision Pro software/content story is so dull and unimaginative
- Why most people won’t use Vision Pro for watching TV/movies
- Apple’s bet in immersive video is a total game-changer for live sports
- Why I returned my Vision Pro… and my Top 10 wishlist to reconsider
- Apple’s VR debut is the best thing that ever happened to Oculus/Meta
- My unsolicited product advice to Meta for Quest Pro 2 and beyond
Apple really played it safe in the design of this first VR product by over-engineering it. For starters, Vision Pro ships with more sensors than what’s likely necessary to deliver Apple’s intended experience. This is typical in a first-generation product that’s been under development for so many years. It makes Vision Pro start to feel like a devkit.
A sensor party: 6 tracking cameras, 2 passthrough cameras, 2 depth sensors (plus 4 eye-tracking cameras not shown)
It’s easy to understand two particularly important decisions Apple made for the Vision Pro launch:
- Designing an incredible in-store Vision Pro demo experience, with the primary goal of getting as many people as possible to experience the magic of VR through Apple’s lenses — most of whom have no intention to even consider a $4,000 purchase. The demo is only secondarily focused on actually selling Vision Pro headsets.
- Launching an iconic woven strap that photographs beautifully even though this strap simply isn’t comfortable enough for the vast majority of head shapes. It’s easy to conclude that this decision paid off because nearly every bit of media coverage (including and especially third-party reviews on YouTube) uses the woven strap despite the fact that it’s less comfortable than the dual loop strap that’s “hidden in the box”.
Apple’s relentless and uncompromising hardware insanity is largely what made it possible for such a high-res display to exist in a VR headset, and it’s clear that this product couldn’t possibly have launched much sooner than 2024 for one simple limiting factor — the maturity of micro-OLED displays plus the existence of power-efficient chipsets that can deliver the heavy compute required to drive this kind of display (i.e. the M2).
·hugo.blog·
The VR winter — Benedict Evans
When I started my career 3G was the hot topic, and every investor kept asking ‘what’s the killer app for 3G?’ It turned out that the killer app for having the internet in your pocket was, well, having the internet in your pocket. But with each of those, we knew what to build next, and with VR we don’t. That tells me that VR has a place in the future. It just doesn’t tell me what kind of place.
The successor to the smartphone will be something that doesn’t just merge AR and VR but make the distinction irrelevant - something that you can wear all day every day, and that can seamlessly both occlude and supplement the real world and generate indistinguishable volumetric space.
·ben-evans.com·
Leviathan Wakes: the case for Apple's Vision Pro - Redeem Tomorrow
The existing VR hardware has not received sufficient investment to fully demonstrate the potential of this technology. It is unclear whether the issues lie with augmented reality (AR) itself or the technology used to deliver it. However, Apple has taken a different approach by investing significantly in creating a serious computer with an optical overlay as its primary interface. Unlike other expensive headsets, Apple has integrated the ecosystem to make it appealing right out of the box, allowing users to watch movies, view photos, and run various apps. This comprehensive solution aims to address the uncertainties surrounding AR. The display quality is top-notch, finger-based interaction replaces clunky joysticks, and performance is optimized to minimize motion sickness. Furthermore, a large and experienced developer community stands ready to create apps, supported by mature tools and extensive documentation. With these factors in place, there is anticipation for a new paradigm enabled by a virtually limitless monitor. The author expresses eagerness to witness how this technology unfolds.
What can you do with this thing? There’s a good chance that, whatever killer apps may emerge, they don’t need the entire complement of sensors and widgets to deliver a great experience. As that’s discovered, Apple will be able to open a second tier in this category and sell you a simplified model at a lower cost. Meanwhile, the more they manufacture the essentials—high density displays, for example—the higher their yields will become, the more their margins will increase. It takes time to perfect manufacturing processes and build up capacity. Vision Pro isn’t just about 2024’s model. It’s setting up the conditions for Apple to build the next five years of augmented reality wearable technology.
VR/AR doesn’t have to suck ass. It doesn’t have to give you motion sickness. It doesn’t have to use these awkward, stupid controllers you accidentally swap into the wrong hand. It doesn’t have to be fundamentally isolating. If this paradigm shift could have been kicked off by cheap shit, we’d be there already. May as well pursue the other end of the market.
what starts as clunky needn’t remain so. As the technology for augmented reality becomes more affordable, more lightweight, more energy efficient, more stylish, it will be more feasible for more people to use. In the bargain, we’ll get a display technology entirely unshackled from the constraints of a monitor stand. We’ll have much broader canvases subject to the flexibility of digital creativity, collaboration and expression. What this unlocks, we can’t say.
·redeem-tomorrow.com·
Mark Zuckerberg's Ugly Future
I’ve also seen a lot of users on Twitter asking “who is Horizon Worlds for?” And it’s a good question. I have an Oculus. Meta’s core metaverse platform, the thing that ostensibly will be replacing Facebook soon as Meta’s main online portal, the central OS for the company’s VR world, is too boring for children, too complicated for old people, too time-consuming for anyone raising a family, and, though it might eventually be good enough to function as some kind of inescapable cyberhell for white collar workers to have endless meetings inside of, at the moment it’s hard to imagine a real use case for it. Except for one.

I’ve come to the conclusion that Meta’s metaversal aspirations are just a cold and cynical bet on a future where we just can’t go outside anymore. Meta’s big plan is to spend the next few years cobbling together something with enough baseline functionality that we can all migrate to it during the next pandemic. That’s the only explanation for the absolutely deranged amount of misplaced optimism Meta has about this stuff. This is a company who has decided they can make a lot of money off a catastrophic future by forcing us into their genital-free off-brand-Pixar panopticon and mining us for data while we Farmville ourselves to death.
·garbageday.email·
Stepping out of the firehose — Benedict Evans
on information overload / infinite choice and how we struggle to manage it
The internet is a firehose. I don’t, myself, have 351 thousand unread emails, but when anyone can publish and connecting and sharing is free and frictionless, then there is always far more than we can possibly read. So how do we engage with that?
So your feed becomes a sample - an informed guess of the posts you might like most. This has always been a paradox of Facebook product - half the engineers work on adding stuff to your feed and the other half on taking stuff out. Snap proposed a different model - that if everything disappears after 24 hours then there’s less pressure to be great but also less pressure to read everything. You can let go. Tiktok takes this a step further - the feed is infinite, and there’s no pressure to get to the end, but also no signal to stop swiping. You replace pressure with addiction.
Another approach is to try to move the messages. Slack took emails from robots (support tickets, Salesforce updates) and moved them into channels, but now you have 50 channels full of unread messages instead of one inbox full of unread messages.
Screenshots are the PDFs of the smartphone. You pull something into physical space, sever all its links and metadata, and own it yourself.
Email newsletters look a little like this as well. I think a big part of the reason that people seem readier to pay for a blog post by email than a blog post on a web page is that somehow an email feels like a tangible, almost physical object - it might be part of that vast compost heap of unread emails, but at least it’s something that you have, and can come back to. This is also part of the resurgence of vinyl, and even audio cassettes.
The film-camera industry peaked at 80bn consumer photos a year, but today that number is well into the trillions, as I wrote here. That’s probably why people keep making camera apps with built-in constraints, but it also prompts a comparison with this summer’s NFT frenzy. Can digital objects have value, and can a signature add scarcity to a JPEG - can it make it individual?
there are now close to 5bn people with a smartphone, and all of us are online and saying and doing things, and you will never be able to read everything ever again. There’s an old line that Erasmus, in the 15th century, was the last person to have read everything - every book that there was - which might not have been literally possible but which was at least conceivable. Yahoo tried to read everything too - it tried to build a manually curated index of the entire internet that reached 3.2m sites before the absurdity of the project became overwhelming. This was Borges’s 1:1 scale map made real.

So, we keep building tools, but also we let go. That’s part of the progression - Arts and Crafts was a reaction against what became the machine age, but Bauhaus and Futurism embraced it. If the ‘metaverse’ means anything, it reflects that we have all grown up with this now, and we’re looking at ways to absorb it, internalise it and reflect it in our lives and in popular culture - to take ownership of it. When software eats the world, it’s not software anymore.
·ben-evans.com·