Saved

3651 bookmarks
Computers that live two seconds in the future
46% of all new code on GitHub is written using Copilot. (Oh, and 75% of developers feel more fulfilled.)
What does it mean that computers can peer a tiny distance into the future? I have the vaguest of vague senses that a few things I’ve seen recently are conceptually connected.
And, to highlight one particularly wild point, if you want virtual objects to feel real, then the computer has to PEER INTO OUR (subjective) FUTURE to get ready to react.
Wouldn’t it be great… if you could see… into the future… of the ocean. WELL. Here’s WavePredictor by Next Ocean. First they continuously scan the water around the ship with radar. Then: WavePredictor propagates the observed waves into the future resulting in a near future prediction of the waves arriving at the ship and the resulting ship motions.
It’s not just about avoiding freak big movements. It’s the reverse too: pick the right moment to hook onto the load on deck when motions are temporarily low. Faster-than-realtime simulation of ocean waves to anticipate moments of stillness.
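To make the WavePredictor idea concrete: decompose the radar-observed surface into linear wave components, advance each one with the deep-water dispersion relation, and search the prediction for a quiet window. Here is a minimal sketch of that idea in Swift; every name and number is illustrative, and Next Ocean's actual pipeline is of course not public.

```swift
import Foundation

// A single linear wave component recovered from the radar scan.
// Amplitude, wavenumber, direction, and phase are all illustrative.
struct WaveComponent {
    let amplitude: Double   // metres
    let wavenumber: Double  // rad/m
    let direction: Double   // radians, propagation heading
    let phase: Double       // radians at t = 0
}

// Deep-water dispersion relation: angular frequency from wavenumber.
func angularFrequency(_ k: Double) -> Double {
    let g = 9.81
    return sqrt(g * k)
}

// Predicted surface elevation at (x, y) at future time t, obtained by
// propagating each observed component forward in time.
func predictedElevation(components: [WaveComponent],
                        x: Double, y: Double, t: Double) -> Double {
    components.reduce(0) { sum, w in
        let kx = w.wavenumber * cos(w.direction)
        let ky = w.wavenumber * sin(w.direction)
        let omega = angularFrequency(w.wavenumber)
        return sum + w.amplitude * cos(kx * x + ky * y - omega * t + w.phase)
    }
}

// Scan the next two minutes for the first future moment where the
// predicted elevation at the ship drops below a threshold.
func quietWindow(components: [WaveComponent],
                 shipX: Double, shipY: Double,
                 horizon: Double = 120, threshold: Double = 0.5) -> Double? {
    stride(from: 0.0, through: horizon, by: 1.0).first { t in
        abs(predictedElevation(components: components, x: shipX, y: shipY, t: t)) < threshold
    }
}
```

The point is the shape of the computation: a cheap forward simulation that runs far faster than the waves themselves.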
I skip forwards across realtime when writing with Copilot. Type two lines manually, receive the suggestion in spectral text, tab to accept, start typing again…
The personal computer OS has a model of what’s in the user’s working memory (the screen) and the user’s focus (the cursor), and therefore apps can be interactive.
The mobile computer OS has a native model of the user’s context (their geographic location) and their communication networks, and therefore we got apps like Google Maps and Facebook.
The spatial computing OS contains a model of the room, and so we’ll get augmented reality.
And therefore: the future computing OS contains a model of the future, and so all apps will be able to anticipate possible futures and pick over them, faster than realtime, and so… …?
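To picture what that last step could mean for apps, here is a purely speculative sketch in Swift: an imaginary OS-level predictor that apps can query faster than realtime. None of these types exist on any platform; they are invented only to illustrate the idea.

```swift
import Foundation

// A candidate future state of the user's context, with a probability.
// Entirely hypothetical: no real OS exposes anything like this today.
struct PredictedState<Context> {
    let context: Context
    let probability: Double
    let horizon: TimeInterval   // how far ahead this prediction reaches
}

// The imaginary OS surface: apps ask for possible futures and decide
// how to respond before the moment arrives.
protocol FuturePredictor {
    associatedtype Context
    func predict(horizon: TimeInterval) -> [PredictedState<Context>]
}

// An app-side helper: pick the most likely future and pre-compute a response.
func anticipate<P: FuturePredictor, Response>(
    _ predictor: P,
    horizon: TimeInterval,
    prepare: (P.Context) -> Response
) -> Response? {
    predictor.predict(horizon: horizon)
        .max(by: { $0.probability < $1.probability })
        .map { prepare($0.context) }
}
```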
·interconnected.org·
Computers that live two seconds in the future
How Apple Can Bring Down the Price of Apple Vision Headset From $3,500
It’s no surprise that Apple considered holding off on announcing the price until months after WWDC to avoid the negative headlines. In the end, the company decided the price would become an even larger target if it wasn’t revealed at the event. Consumers now have nine months to digest it before the Vision Pro goes on sale.
·bloomberg.com·
How Apple Can Bring Down the Price of Apple Vision Headset From $3,500
Isn’t That Spatial? | No Mercy / No Malice
Betting against a first-generation Apple product is a bad trade — from infamous dismissals of the iPhone to disappointment with the original iPad. In fact, this is a reflection of Apple’s strategy: Start with a product that’s more an elegant proof-of-concept than a prime-time hit; rely on early adopters to provide enough runway for its engineers to keep iterating; and trust in unmatched capital, talent, brand equity, and staying power to morph a first-gen toy into a third-gen triumph.
We are a long way from making three screens, a glass shield, and an array of supporting hardware light enough to wear for an extended period. Reviewers were (purposefully) allowed to wear the Vision Pro for less than half an hour, and nearly every one said comfort was declining even then. Avatar: The Way of Water is 3 hours and 12 minutes.
Meta’s singular strategic objective is to escape second-tier status and, like Apple and Alphabet, control its distribution. And its path to independence runs through Apple Park. Zuckerberg is spending the GDP of a small country to invent a new world, the metaverse, where Apple doesn’t own the roads or power stations. Vision Pro is insurance against the metaverse evolving into anything more than an incel panic room.
The only product category where VR makes a difference is good VR games. Price is not the limiting factor; the quality of the VR experience is. Beat Saber is good and fun and physical exercise. Half-Life: Alyx is amazing. VR completely supercharges horror games and scary stalking shooters. Want to fear for your life and get PTSD in the comfort of your home? You can do it. Games can connect people and provide physical exercise. If the third iteration of Vision Pro is good for two hours of play at $2,000, Apple will kill the console market. PlayStations no more. Apple is not a gaming company, but if Vision Pro becomes better and slightly cheaper, Apple becomes a gaming company against its will.
·profgalloway.com·
Isn’t That Spatial? | No Mercy / No Malice
Theory of Constraints 102: The Illusion of Local Optima - Forte Labs
This piece discusses how optimizing each part of a complex system in isolation can leave the system as a whole under-optimized. It uses the example of a company where each department is like a section of pipe, with work flowing from left to right. If the Engineering department is the bottleneck, with the lowest staff and capacity, then the rule to "stay busy" drives local optimization: departments keep starting new projects to fill their capacity. Work piles up at the bottleneck, and throughput drops amid conflict and inefficiency. The only way to improve the system as a whole is to optimize the bottleneck, not each individual part.
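A tiny simulation makes the effect visible. In the sketch below (Swift, with made-up stage names and capacities), every stage "stays busy", yet throughput is pinned to the bottleneck's capacity and unfinished work piles up in front of it.

```swift
import Foundation

// Each department is a stage with a fixed per-tick capacity and a queue
// of work waiting in front of it. Numbers are invented for illustration.
struct Stage {
    let name: String
    let capacity: Int   // items the stage can process per tick
    var queue: Int = 0  // work waiting in front of the stage
}

// Push a steady stream of new work in on the left, let every stage
// process as much as it can, and watch where the pile-up forms.
func simulate(stages: [Stage], incomingPerTick: Int, ticks: Int) -> (throughput: Int, queues: [Int]) {
    var stages = stages
    var completed = 0
    for _ in 0..<ticks {
        stages[0].queue += incomingPerTick
        for i in stages.indices {
            let processed = min(stages[i].queue, stages[i].capacity)
            stages[i].queue -= processed
            if i + 1 < stages.count {
                stages[i + 1].queue += processed
            } else {
                completed += processed
            }
        }
    }
    return (completed, stages.map(\.queue))
}

// Engineering is the bottleneck (capacity 2). Sales and Design stay busy,
// but overall throughput is capped at 2 items per tick and the queue in
// front of Engineering grows without bound.
let pipeline = [
    Stage(name: "Sales", capacity: 5),
    Stage(name: "Design", capacity: 4),
    Stage(name: "Engineering", capacity: 2),
    Stage(name: "QA", capacity: 4),
]
let result = simulate(stages: pipeline, incomingPerTick: 4, ticks: 100)
print(result.throughput, result.queues)   // 200 completed; ~200 items queued before Engineering
```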
·fortelabs.com·
Theory of Constraints 102: The Illusion of Local Optima - Forte Labs
Method
“Ping Practice is a method I'm developing for translating everyday experiences into insights and actions that align with what you need and value.”
·ping-practice.gitbook.io·
Method
r/apolloapp - 📣 Apollo will close down on June 30th. Reddit’s recent decisions and actions have unfortunately made it impossible for Apollo to continue. Thank you so, so much for all the support over the years. ❤️
·reddit.com·
r/apolloapp - 📣 Apollo will close down on June 30th. Reddit’s recent decisions and actions have unfortunately made it impossible for Apollo to continue. Thank you so, so much for all the support over the years. ❤️
Design with SwiftUI - WWDC23 - Videos - Apple Developer
The products that we build contain complex flows and highly interactive elements. As a result, there's so many important decisions that we need to make. SwiftUI helps by quickly surfacing all of those important details that need your attention, for example, how an image should look when it's loading or how a button appears when it's pressed. These are the types of things that make a product feel complete. They're easily hidden in static design tools but are quickly surfaced when working in a dynamic tool like SwiftUI. That's because SwiftUI makes it easy to build your designs on device. In doing this, you gain a more complete understanding of what you're making. Separate parts now interact together, and you can begin to evaluate the experience as a whole. This process quickly reveals what's working in your design and what still needs attention or polish. On Maps, we've found this to be tremendously helpful.
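As a small illustration of the kinds of states the session mentions, here is a hypothetical SwiftUI view: the loading state of an image and the pressed state of a button are explicit in the code, so the design has to account for them. The view, URL, and styling are invented, not taken from the talk.

```swift
import SwiftUI

// A card whose design decisions are forced into the open: what the image
// shows while loading or on failure, and how the button looks when pressed.
struct PlaceCard: View {
    var body: some View {
        VStack(spacing: 12) {
            AsyncImage(url: URL(string: "https://example.com/map-preview.png")) { phase in
                switch phase {
                case .empty:
                    ProgressView()                 // how the image looks while loading
                case .success(let image):
                    image.resizable().scaledToFit()
                case .failure:
                    Image(systemName: "photo")     // fallback when loading fails
                @unknown default:
                    EmptyView()
                }
            }
            Button("Directions") {}
                .buttonStyle(PressedFadeStyle())
        }
    }
}

// How the button appears when it's pressed: dimmed while the finger is down.
struct PressedFadeStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .padding(.horizontal, 16)
            .padding(.vertical, 8)
            .background(Color.blue, in: Capsule())
            .foregroundStyle(.white)
            .opacity(configuration.isPressed ? 0.6 : 1.0)
    }
}
```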
·developer.apple.com·
Design with SwiftUI - WWDC23 - Videos - Apple Developer
Apple Vision
Apple Vision is technically a VR device that experientially is an AR device, and it’s one of those solutions that, once you have experienced it, is so obviously the correct implementation that it’s hard to believe there was ever any other possible approach to the general concept of computerized glasses.
the Vision is taking that captured image, processing it, and displaying it in front of your eyes in around 4 milliseconds.
Real-time operating systems are used in embedded systems for applications with critical functionality, like a car, for example: it’s ok to have an infotainment system that sometimes hangs or even crashes, in exchange for more flexibility and capability, but the software that actually operates the vehicle has to be reliable and unfailingly fast. This is, in broad strokes, one way to think about how visionOS works: while the user experience is a time-sharing operating system that is indeed a variation of iOS, and runs on the M2 chip, there is a subsystem that primarily operates the R1 chip that is real-time; this means that even if visionOS hangs or crashes, the outside world is still rendered under that magic 12 milliseconds.
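One way to picture that split is a toy model: a loop with a hard frame deadline that never waits on the app layer, next to a time-sharing loop where a hang is merely annoying. This is only an illustration of the idea, with invented function names; it says nothing about how visionOS is actually implemented.

```swift
import Foundation

let frameDeadline: TimeInterval = 0.012   // the "magic 12 milliseconds"

// Placeholders; what they do doesn't matter for the illustration.
func renderPassthroughFrame() {}   // cameras in, displays out
func runAppWork() {}               // may stall or even crash

// Real-time side: highest priority, meets the frame deadline every time,
// and never blocks on anything the app layer is doing.
let passthroughThread = Thread {
    while true {
        let start = Date()
        renderPassthroughFrame()
        let slack = frameDeadline - Date().timeIntervalSince(start)
        if slack > 0 { Thread.sleep(forTimeInterval: slack) }
    }
}
passthroughThread.qualityOfService = .userInteractive
passthroughThread.start()

// App side: if this loop hangs, the passthrough loop above keeps rendering.
let appThread = Thread {
    while true { runAppWork() }
}
appThread.start()

RunLoop.main.run()   // keep the toy process alive
```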
I’ll be honest: what this looked like to me was a divorced dad, alone at home with his Vision Pro, perhaps because his wife was irritated at the extent to which he got lost in his own virtual experience.
·stratechery.com·
Apple Vision
Leviathan Wakes: the case for Apple's Vision Pro - Redeem Tomorrow
The existing VR hardware has not received sufficient investment to fully demonstrate the potential of this technology. It is unclear whether the issues lie with augmented reality (AR) itself or the technology used to deliver it. However, Apple has taken a different approach by investing significantly in creating a serious computer with an optical overlay as its primary interface. Unlike other expensive headsets, Apple has integrated the ecosystem to make it appealing right out of the box, allowing users to watch movies, view photos, and run various apps. This comprehensive solution aims to address the uncertainties surrounding AR. The display quality is top-notch, finger-based interaction replaces clunky joysticks, and performance is optimized to minimize motion sickness. Furthermore, a large and experienced developer community stands ready to create apps, supported by mature tools and extensive documentation. With these factors in place, there is anticipation for a new paradigm enabled by a virtually limitless monitor. The author expresses eagerness to witness how this technology unfolds.
What can you do with this thing? There’s a good chance that, whatever killer apps may emerge, they don’t need the entire complement of sensors and widgets to deliver a great experience. As that’s discovered, Apple will be able to open a second tier in this category and sell you a simplified model at a lower cost. Meanwhile, the more they manufacture the essentials—high density displays, for example—the higher their yields will become, the more their margins will increase. It takes time to perfect manufacturing processes and build up capacity. Vision Pro isn’t just about 2024’s model. It’s setting up the conditions for Apple to build the next five years of augmented reality wearable technology.
VR/AR doesn’t have to suck ass. It doesn’t have to give you motion sickness. It doesn’t have to use these awkward, stupid controllers you accidentally swap into the wrong hand. It doesn’t have to be fundamentally isolating. If this paradigm shift could have been kicked off by cheap shit, we’d be there already. May as well pursue the other end of the market.
what starts as clunky needn’t remain so. As the technology for augmented reality becomes more affordable, more lightweight, more energy efficient, more stylish, it will be more feasible for more people to use. In the bargain, we’ll get a display technology entirely unshackled from the constraints of a monitor stand. We’ll have much broader canvases subject to the flexibility of digital creativity, collaboration and expression. What this unlocks, we can’t say.
·redeem-tomorrow.com·
Leviathan Wakes: the case for Apple's Vision Pro - Redeem Tomorrow