The product design talent crisis || Matt Ström-Awn, designer-leader
In short, managers kick off a feedback loop by trying to close the gap between their team’s current and desired performance. They have two options: 1) Drive short-term improvements by asking more from senior designers, increasing rewards for top performers, and creating upward pressure through reviews, or 2) Build long-term capability by investing in training, coaching, and career development for junior designers. But the feedback loops between these approaches push companies to prioritize hiring senior talent, as the immediate performance gains outweigh the diffuse returns of capability-building.
·matthewstrom.com·
Not all AI-assisted programming is vibe coding (but vibe coding rocks)
Andrej is an extremely talented and experienced programmer—he has no need for AI assistance at all. He’s using LLMs like this because it’s fun to try out wild new ideas, and the speed at which an LLM can produce code is an order of magnitude faster than even the most skilled human programmers. For low stakes projects and prototypes why not just let it rip? When I talk about vibe coding I mean building software with an LLM without reviewing the code it writes.
If an LLM wrote the code for you, and you then reviewed it, tested it thoroughly and made sure you could explain how it works to someone else that’s not vibe coding, it’s software development. The usage of an LLM to support that activity is immaterial.
The job of a software developer is not (just) to churn out code and features. We need to create code that demonstrably works, and can be understood by other humans (and machines), and that will support continued development in the future. We need to consider performance, accessibility, security, maintainability, cost efficiency. Software engineering is all about trade-offs—our job is to pick from dozens of potential solutions by balancing all manner of requirements, both explicit and implied.
I think vibe coding is the best tool we have to help experienced developers build that intuition as to what LLMs can and cannot do for them. I’ve published more than 80 experiments I built with vibe coding and I’ve learned so much along the way. I would encourage any other developer, no matter their skill level, to try the same.
·simonwillison.net·
Vibe Code is Legacy Code
As many have pointed out, not all code written with AI assistance is vibe code. Per the original definition, it’s code written in contexts where you “forget that the code even exists.” Or as the fairly fleshed-out Wikipedia article puts it: “A key part of the definition of vibe coding is that the user accepts code without full understanding.”
Our AI minions are also exceptional tools for learning when you move too far towards the high-vibes-low-understanding end of the spectrum. I particularly like getting Claude to write me targeted exercises to practice new concepts when I get lost in generated functions or fail to implement something correctly sans-AI. Even though doubling down on engineering skills sometimes feels like learning to operate a textile loom in 1820.
·maggieappleton.com·
Face it: you're a crazy person
Unpacking is a way of re-inflating all the little particulars that had to be flattened so your imagination could produce a quick preview of the future, like turning a napkin sketch into a blueprint
When people have a hard time figuring out what to do with their lives, it’s often because they haven’t unpacked. For example, in grad school I worked with lots of undergrads who thought they wanted to be professors. Then I’d send ‘em to my advisor Dan, and he would unpack them in 10 seconds flat. “I do this,” he would say, miming typing on a keyboard, “And I do this,” he would add, gesturing to the student and himself. “I write research papers and I talk to students. Would you like to do those things?”
more likely, they weren’t picturing anything at all. They were just thinking the same thing over and over again: “Do I want to be a professor? Hmm, I’m not sure. Do I want to be a professor? Hmm, I’m not sure.” Why is it so hard to unpack, even a little bit? Well, you know how when you move to a new place and all of your unpacked boxes confront you every time you come home? And you know how, if you just leave them there for a few weeks, the boxes stop being boxes and start being furniture, just part of the layout of your apartment, almost impossible to perceive? That’s what it’s like in the mind. The assumptions, the nuances, the background research all get taped up and tucked away. That’s a good thing—if you didn’t keep most of your thoughts packed, trying to answer a question like “Do I want to be a professor?” would be like dumping everything you own into a giant pile and then trying to find your one lucky sock.
When you fully unpack any job, you’ll discover something astounding: only a crazy person should do it. Do you want to be a surgeon? = Do you want to do the same procedure 15 times a week for the next 35 years? Do you want to be an actor? = Do you want your career to depend on having the right cheekbones?
High-status professions are the hardest ones to unpack because the upsides are obvious and appealing, while the downsides are often deliberately hidden and tolerable only to a tiny minority.
When you come down from the 30,000-foot view that your imagination offers you by default, when you lay out all the minutiae of a possible future, when you think of your life not as an impressionistic blur, but as a series of discrete Tuesday afternoons full of individual moments that you will live in chronological order and without exception, only then do you realize that most futures make sense exclusively for a very specific kind of person. Dare I say, a crazy person.
We tend to overestimate the prevalence of our preferences, a phenomenon that psychologists call the “false consensus effect.” This is probably because it’s really really hard to take other people’s perspectives, so unless we run directly into disconfirming evidence, we assume that all of our mental settings are, in fact, the defaults. Our idiosyncrasies may never even occur to us.
whenever you unpack somebody, you inevitably discover something extremely weird about them. Sometimes you don’t have to dig that far, like when your friend tells you that she likes “found” photographs—the abandoned snapshots that turn up at yard sales and charity shops—and then adds that she has collected 20,000 of them. But sometimes the craziness is buried deep, often because people don’t think it’s crazy at all, like when a friend I knew for years casually disclosed that she had dumped all of her previous boyfriends because they had been insufficiently “menacing”
This is why people get so brain-constipated when they try to choose a career, and why they often pick the wrong one: they don’t understand the craziness that they have to offer, nor the craziness that will be demanded of them, and so they spend their lives jamming their square-peg selves into round-hole jobs.
On the other hand, when people match their crazy to the right outlet, they become terrifyingly powerful. A friend from college recently reminded me of this guy I’ll call Danny, who was crazy in a way that was particularly useful for politics, namely, he was incapable of feeling humiliated.
Unpacking is easy and free, but almost no one ever does it because it feels weird and unnatural. It’s uncomfortable to confront your own illusion of explanatory depth, to admit that you really have no idea what’s going on, and to keep asking stupid questions until that changes.
Making matters worse, people are happy to talk about themselves and their jobs, but they do it at this unhelpful, abstract level where they say things like, “oh, I’m the liaison between development and sales”. So when you’re unpacking someone’s job, you really gotta push: what did you do this morning? What will you do after talking to me? Is that what you usually do? If you’re sitting at your computer all day, what’s on your computer? What programs are you using? Wow, that sounds really boring, do you like doing that, or do you endure it?
It’s no wonder that everyone struggles to figure out what to do with their lives: we have not developed the cultural technology to deal with this problem because we never had to. We didn’t exactly evolve in an ancestral environment with a lot of career opportunities. And then, once we invented agriculture, almost everyone was a farmer for the next 10,000 years. “What should I do with my life?” is really a post-1850 problem, which means, in the big scheme of things, we haven’t had any time to work on it.
·experimental-history.com·
I Deleted My Second Brain
For years, I had been building what technologists and lifehackers call a “second brain.” The premise: capture everything, forget nothing. Store your thinking in a networked archive so vast and recursive it can answer questions before you know to ask them. It promises clarity. Control. Mental leverage. But over time, my second brain became a mausoleum. A dusty collection of old selves, old interests, old compulsions, piled on top of each other like geological strata. Instead of accelerating my thinking, it began to replace it. Instead of aiding memory, it froze my curiosity into static categories.
The modern PKM (Personal Knowledge Management) movement traces its roots through para-academic obsessions with systems theory, Luhmann’s Zettelkasten, and the Silicon Valley mythology of productivity as life. Roam Research turned bidirectional links into a cult. Obsidian let the cult go off-grid. The lore deepened. You weren’t taking notes. You were building a lattice of meaning. A library Borges might envy.
In “The Library of Babel,” Borges imagines an infinite library containing every possible book. Among its volumes are both perfect truth and perfect gibberish. The inhabitants of the library, cursed to wander it forever, descend into despair, madness, and nihilism. The map swallows the territory.
The more I wrote into my vault, the less I felt. A quote would spark an insight, I’d clip it, tag it, link it - and move on. But the insight was never lived. It was stored. Like food vacuum-sealed and never eaten, while any nutritional value slips away.
Worse, the architecture began to shape my attention. I started reading to extract. Listening to summarize. Thinking in formats I could file. Every experience became fodder.
Human memory is not an archive. It is associative, embodied, contextual, emotional. We do not think in folders.
Merlin Donald, in his theory of cognitive evolution, argues that human intelligence emerged not from static memory storage but from external symbolic representation: tools like language, gesture, and writing that allowed us to rehearse, share, and restructure thought. Culture became a collective memory system - not to archive knowledge, but to keep it alive, replayed, and reworked. In trying to remember everything, I outsourced the act of reflection. I didn’t revisit ideas. I didn’t interrogate them. I filed them away and trusted the structure.
I basically agree with all of this but don't think any of this changes that the systems are what you make of them—the idea behind evergreen note taking and "tending to your notes" involves [effortful engagement](https://notes.andymatuschak.org/Understanding_requires_effortful_engagement)
·joanwestenberg.com·
More stray observations — on Liquid Glass, on Apple’s lack of direction, then zooming out, on technological progress | Riccardo Mori
This Apple has been dismantling Mac OS, as if it’s a foreign tool to them. They’ve bashed its UI around. And they seem to have done that not for the purpose of improving it, but simply for the purpose of changing it; adapting it to their (mostly misguided) idea of unifying the interface of different devices to bring it down to the simplest common denominator.
If we look at Mac OS as a metro railway line, it’s like Apple has stopped extending it and creating new stations. What they’ve been doing for a while now has been routine maintenance, and giving the stations a fresh coat of paint every year. Only basic and cosmetic concerns, yet sometimes mixing things up to show that more work has gone into it, a process that invariably results in inexplicable and arbitrary choices like moving station entrances around, shutting down facilities, making the train timetables less legible, making the passages that lead to emergency exits more convoluted and longer to traverse, and so on — hopefully you know what I mean here.
When you self-impose timelines and cadences that are essentially marketing-driven and do not really reflect technological research and development, then you become a prisoner in a prison of your own making. Your goals and your priorities start becoming narrower in scope. You reduce your freedom of movement because you stop thinking in terms of creating the next technological breakthrough or innovative device; you just look at the calendar and you have to come up with something by the end of next quarter, while you also have to take care of fixing bugs that are the result of the previous rush job… which keep accumulating on top of the bugs of the rush job that came before, and so forth.
From what I’ve understood by examining the evolution of computer science and computer history, scientists and technologists of past decades seemed to have an approach that could be described as, ‘ideas & concepts first, technology later’. Many figures in the history of computing are rightly considered visionaries because they had visions — sometimes very detailed ones — of what they wanted computers to become, of applications where computers could make a difference, of ways in which a computer could improve a process, or could help solve a real problem.
What I’m seeing today is more like the opposite approach — ‘technology first, ideas & concepts later’: a laser focus on profit-driven technological advancements to hopefully extract some good ideas and use cases from.
Where there are some ideas, or sparks, they seem hopelessly limited in scope or unimaginatively iterative, short-sightedly anchored to the previous incarnation or design. The questions are something like, How can we make this look better, sleeker, more polished?
Steve Jobs once said, “There’s an old Wayne Gretzky quote that I love: ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very, very beginning. And we always will.” If I may take that image, I’d say that today a lot of tech companies seem more concerned with the skating itself and with continuing to hit the puck in profitable ways.
·morrick.me·
Habits, UI changes, and OS stagnation | Riccardo Mori
“We have been secretly, for the last 18 months, designing a completely new user interface. And that user interface builds on Apple’s legacy and carries it into the next century. And we call that new user interface Aqua, because it’s liquid. One of the design goals was that when you saw it you wanted to lick it.” But it’s important to remember that this part came several minutes after outlining Mac OS X’s underlying architecture. Jobs began talking about Mac OS X by stating its goals, then the architecture used to attain those goals, and then there was a mention of how the new OS looked.
Sure, a lot has changed in the technology landscape over the past twenty years, but the Mac OS X introduction in 2000 is almost disarming in how clearly and precisely focused it is. It is framed in such a way that you understand Jobs is talking about a new powerful tool. Sure, it also looks cool, but it feels as if it’s simply a consequence of a grander scheme. A tool can be powerful in itself, but making it attractive and user-friendly is a crucial extension of its power.
But over the years (and to be fair, this started to happen when Jobs was still CEO), I’ve noticed that, iteration after iteration, the focus of each introduction of a new version of Mac OS X shifted towards more superficial features and the general look of the system. As if users were more interested in stopping and admiring just how gorgeous Mac OS looks, rather than having a versatile, robust and reliable foundation with which to operate their computers and be productive.
What some geeks may be shocked to know is that most regular people don’t really care about these changes in the way an application or operating system looks. What matters to them is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways. Like saving mouse clicks or making a multi-step workflow more intuitive and streamlined.
But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress. It’s change for change’s sake. It’s rearranging the shelves in your supermarket in a way that seems cool and marketable to you but leaves your customers baffled and bewildered.
This yearly cycle forces Apple engineers — and worse, Apple designers — to come up with ‘new stuff’, and this diverts focus from fixing underlying bugs and UI friction that inevitably accumulate over time.
Microsoft may leave entire layers of legacy code in Windows, turning Windows into a mastodontic operating system with a clean surface and decades of baggage underneath. Apple has been cleaning and rearranging the surface for a while now, and has been getting rid of so much baggage that they went to the other extreme. They’ve thrown the baby out with the bathwater, and Mac OS’s user interface has become more brittle after all the changes and inconsistent applications of those Human Interface Guidelines that have informed good UI design in Apple software for so long.
Meanwhile the system hasn’t really gone anywhere. On mobile, iOS started out excitingly, and admittedly still seems to be on an evolving trajectory, but on the iPad front there has been a lot of wheel reinventing to make the device behave more like a traditional computer, instead of embarking both the device and its operating system on a journey of revolution and redefinition of the tablet experience in order to truly start a ‘Post-PC era’.
An operating system is something that shouldn’t be treated as an ‘app’, or as something people should stop and admire for its æsthetic elegance, or a product whose updates should be marketed as if it’s the next iPhone iteration. An operating system is something that needs a separate, tailored development cycle. Something that needs time so that you can devise an evolution plan for it; so that you can keep working on its robustness by correcting bugs that have been unaddressed for years, and present features that really improve workflows and productivity while building organically on what came before. This way, user-facing UI changes will look reasonable, predictable, intuitive, easily assimilable, and not just arbitrary, cosmetic, and of questionable usefulness.
·morrick.me·
Something Is Rotten in the State of Cupertino
Who decided these features should go in the WWDC keynote, with a promise they’d arrive in the coming year, when, at the time, they were in such an unfinished state they could not be demoed to the media even in a controlled environment? Three months later, who decided Apple should double down and advertise these features in a TV commercial, and promote them as a selling point of the iPhone 16 lineup — not just any products, but the very crown jewels of the company and the envy of the entire industry — when those features still remained in such an unfinished or perhaps even downright non-functional state that they still could not be demoed to the press? Not just couldn’t be shipped as beta software. Not just couldn’t be used by members of the press in a hands-on experience, but could not even be shown to work by Apple employees on Apple-controlled devices in an Apple-controlled environment? But yet they advertised them in a commercial for the iPhone 16, when it turns out they won’t ship, in the best case scenario, until months after the iPhone 17 lineup is unveiled?
“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?” For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group. Tim Cook should have already held a meeting like that to address and rectify this Siri and Apple Intelligence debacle. If such a meeting hasn’t yet occurred or doesn’t happen soon, then, I fear, that’s all she wrote. The ride is over. When mediocrity, excuses, and bullshit take root, they take over. A culture of excellence, accountability, and integrity cannot abide the acceptance of any of those things, and will quickly collapse upon itself with the acceptance of all three.
·daringfireball.net·
The Workbench Dispatch: 009
Mark Sabino argues that a significant cultural “Vibe Shift” has occurred: a regression and conservatism that wasn’t sudden but built gradually through passive consumption habits and phrases like “Let people enjoy things” and “It’s not that deep,” which have undermined critical thinking and enabled regressive mindsets.
His worlds can be isolated, smothering, grating, haunting, bleak, warm, familiar, alien, all at once. They are never dull and they are never someone else’s. Never sacrificing his weirdness, his staunch outlooks on life, or his vision, you can feel his fingerprints on everything he made, because nobody else possibly could have. Sometimes annoying with how opaquely abstract they can be, but never in a cynical way. He had a laser precise understanding of his control over an audience, and explored all the extremes that come with that. He wove dark, brutal, sometimes cruel tapestries of our own psyches and displayed them back to us with white glove care.
Despite being viewed as abstract or avant garde, there is an inescapable Americana to his work, with all of its horrific blemishes and stunning beauty, hand in hand just like the country itself.
Lynch’s art is uncomfortable, uncompromising, but never uncaring. Frigid surreality that could only be a product of warm humanity. Darkness will always be coupled with light. After all, nightmares are still dreams.
I would argue that the core tenets of the average American consumer mindset in 2025, the perfect encapsulations of the noxious attitudes that led us to where we are now, come in the form of two particular phrases that have been parroted ad nauseam the last few years.
The first of which is the classic, “Let people enjoy things.” Deconstructing it, it really defines the entire first half of the decade in more ways than one. An invisible straw-man evil big Other that somehow controls whether or not people can “have fun”, a childish temper tantrum thrown by people still getting what they want caused by having to face any form of critical thinking for doing so, a shrieking demand for more pacification, it really has it all. Is it fine for people to have hobbies and interests and passions that don’t align with yours? Absolutely.
There is a large, crucial difference between “letting someone” enjoy something, and negligently allowing something toxic to fester and gradually spread untreated like ignored black mold. Our modern narcissism and individualism have made people so entrenched in their demands for consumption, that it’s hard to imagine who even is not letting people enjoy things at this point.
The all-but-hedonistic behavior of our modern day certainly doesn’t reflect a culture of people not being allowed to enjoy things, but rather one that wants to be able to enjoy things without having to think about it. Any sort of opposing belief, or conscious step back from the raging maw of consumption is met with complete indignation, as if their right to slop is being infringed upon.
Alternatively, chances for maturation or growth get flippantly put off for some other time that never comes, a complete refusal to actually analyze our relationship to the way we operate.
This brings us to the second defining phrase of the times that I’d like to break down, one that is constantly coupled with the former, the oft-repeated, aggressively vapid, “It’s not that deep.”
Part reaction to the supposed “intellectualism” and woke-ism of the 2010s, part rejection of personal responsibility for one’s own habits and actions, it most succinctly sums up the prevailing attitudes that have dictated the course of the Biden and now Trump 2.0 years. If our beautiful and twisted history has taught us anything, it’s that things are usually always that deep, but somehow we’ve begun plugging our ears to that fact. It is a frankly dangerous indicator of the median population’s attitude towards growth or challenging oneself in any way.
In the realm of media, this rejection of the inherent depth of things has completely altered people’s understanding about those things. If one’s own scope of something is minimized, anything outside of that scope is easier to be written off as antagonistic, foreign, pretentious, or any other label that leads to dismissal. Valid, formal criticism, (sometimes even from a place of love!), gets brushed off as “hating” because the idea that someone thought about something in a deeper way and wasn’t pleased with what they found is abrasive to those unwilling to explore that same level of depth.
Additionally, this phrase has been the perfect excuse as evil rhetorics are unconsciously spread through seemingly innocuous or lighthearted means. “It’s just a meme, it’s not that deep” quickly turns to “How did this propaganda spread so fast?” Through the first 5 years of our decade, we have gradually let it become defined by half gestures and “meh” reactions, a drab grey monocultural sludge, and then have the audacity to wonder how it got that way. We let it slip away ourselves through embracing memetic psyops, “gotta hand it to ’em”s and “letting people have fun”. Well now they’re having their fun; the question is, do you think they’ll return that favor to you?
Giant swaths of the population have both figuratively and literally thrown their masks away, and are perfectly dumbed down and pacified to be absolutely steamrolled by a whole new wave of regression and recession.
At the time of writing this, TikTok has been banned and subsequently, hours later, unbanned, all with Donald Trump’s name fully plastered over the entire ordeal, in what can only come across as a very obvious ploy to swing more gullible idiots into supporting him. The problem with this blatant grab to try and become the hero of a ban that he initially pushed for, however, is that it’s working scarily well. The tectonic shift that has been building steadily throughout the course of the failure of the Biden era has finally come for its biggest payoff yet. Capitalizing on people’s COVID-fried, goldfish-sized memories in order to continue to innocuously shift people right into submission.
The biggest takeaway from the election and gradual Vibe Shift is the powers that be realizing they had more numbers than they thought, that the middle of the bell curve is infinitely more manipulatable than expected. Either directly through propaganda, or indirectly through desensitization via prolonged exposure to the most concentrated, hallucinogenic stupidity available.
If a gun were being pointed in our face, why would we argue that it’s only harmful if someone pulled the trigger?
Another noticeable symptom of this mode of behavior we have fallen into is the warping of what used to be considered “playing devil’s advocate”, and how it has impacted the way we digest and talk about art.
Similar to the attitudes surrounding fast fashion, somewhere along the line people stopped caring about trying to be better than the Mall, even going so far as to fight on the Mall’s behalf out of pure, empty contrarianism. Popularity took the reins as the de-facto measurement of quality, the belief was planted that the mainstream has our artistic best interests in mind, and people militantly ride for that belief despite decades of proof of the opposite. Not knowing nor caring that they’re secretly advocating for overall worse quality of experiences for themselves.
Too many people want to play devil’s advocate but don’t possess the depth of knowledge, the insight, or nuance to do so, so they wind up just playing devil instead, blindly defending degradation rather than express a bit of concern for the way things are going.
It has brought us to where we are now, a legion of people ready to die on the hill of slop, so as not to make any ripples, without even wanting to know if there can be anything better than the lowest common denominator that was shoved down their throats. Taking the sides of the rich people and giant brands that want to give the consumer nothing above mediocrity. These people and places don’t deserve our benefit of the doubt, because they’ve already won.
Vehemently and vocally rejecting that mainstream and embracing what we know to actually be cool. The time for passivity is over, because this continued sliding by the mainstream is active. We know we can be smarter, more conscious consumers, aware of what’s better than the mall or the radio or the pointed propagandized memes on TikTok. We know there are richer experiences to be had, art to discover, statements to make, ways to expand our thought that will not be presented to us on a silver platter by giant corporations or industry machines. We can speak with our eyes, ears, voices, and most importantly wallets. If something sucks, say it and stand on it, because it is far too easy now to succumb to the “well everybody’s doing it” mentality.
My tolerance for bad faith devil’s advocate arguments that only contribute to spin the wheels of progress in place is gone. We have only a short amount of time on this earth and I don’t intend to waste it watching that window of opportunity be pissed away by someone else.
Every time you open your mouth is an opportunity to say something new, something of worth, and I do not want to waste even one moment. It’s time to get serious and realize yes it is that deep. It always has been. I can’t say for certain exactly what this counter-culture will manifest as or even look like specifically, but I do have faith that something can and will emerge. There is far too much talent, energy, emotion, conviction, and spirit out there to not.
·marksnotnice.substack.com·
The Workbench Dispatch: 009
We Don't Need More Cynics. We Need More Builders.
We Don't Need More Cynics. We Need More Builders.
Anyone can point at something and say it’s broken, corrupt, or destined to fail. The real challenge? Building something better. The cynic sees a proposal for change and immediately lists why it won’t work. They’re usually right about specific failure modes — systems are complex, and failure has many mothers. But being right about potential problems differs from being right about the whole.
The cynical position feels sophisticated. It signals worldliness, experience, and a certain battle-hardened wisdom. “Oh, you sweet summer child,” the cynic says, “I’ve seen how these things really work.” But what if this sophistication is itself a form of naïveté?
Cynicism comes with hidden taxes. Every time we default to assuming the worst, we pay in missed opportunities, reduced social trust, and diminished creative capacity. These costs compound over time, creating a self-fulfilling prophecy in which cynical expectations shape cynical realities.
Pattern recognition is valuable — we should learn from history and past failures. But pattern recognition becomes pattern imprisonment when it blinds us to genuinely new possibilities.
Why spend years building something that could fail when you could spend an afternoon critiquing others’ attempts and look just as smart? The cynical stance is intellectually rewarding but culturally corrosive.
The alternative to cynicism isn’t unquestioning optimism. It’s more nuanced: a clear-eyed recognition of problems coupled with the conviction that improvement is possible. Call it pragmatic meliorism — the belief that while perfect solutions may not exist, better ones do.
Things are broken, AND they can be fixed; people are flawed AND capable of growth; systems are complex AND can be improved.
Here’s a more charitable reading of cynicism: it’s not an intellectual position. It’s an emotional defense mechanism. If you expect the worst, you’ll never be disappointed. If you assume everything is corrupt, you can’t be betrayed. But this protection comes at a terrible price. The cynic builds emotional armor that also functions as a prison, keeping out not just pain but also possibility, connection, and growth.
Not all domains benefit equally from cynical analysis. Some areas — scientific investigation, financial planning, and security systems — benefit from rigorous skepticism. Others — creative endeavors, relationship building, social movements — often suffer from it.
What would it look like to embrace pragmatic meliorism instead of cynicism? Acknowledging problems while focusing on solutions; learning from history without being imprisoned by it; maintaining high standards while accepting incremental progress; combining skeptical analysis with constructive action.
When you feel the pull of cynicism, ask yourself: Is this helping? Is this default skepticism making you more effective or just more comfortable? Are you choosing the easy path of criticism over the harder path of creation?
·joanwestenberg.com·
We Don't Need More Cynics. We Need More Builders.
Your "Per-Seat" Margin is My Opportunity
Your "Per-Seat" Margin is My Opportunity

Traditional software is sold on a per-seat subscription. More humans, more money. We are headed to a future where AI agents will replace the work humans do. But you can’t charge agents a per-seat cost. So we’re headed to a world where software will be sold first on a consumption model (think tasks) and then on an outcome model (think job completed). Incumbents will be forced to adapt, but it’s the classic innovator’s dilemma: how do you suddenly give up all that subscription revenue? This gives startups an opportunity to win.

Per-seat pricing only works when your users are human. But when agents become the primary users of software, that model collapses.
Executives aren't evaluating software against software anymore. They're comparing the combined costs of software licenses plus labor against pure outcome-based solutions. Think customer support (per resolved ticket vs. per agent + seat), marketing (per campaign vs. headcount), sales (per qualified lead vs. rep). That's your pricing umbrella—the upper limit enterprises will pay before switching entirely to AI.
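The cost comparison the excerpt describes can be put into a back-of-the-envelope sketch. All figures and helper names below are hypothetical illustrations, not numbers from the post:

```python
# Hypothetical figures: compare a per-seat support tool plus labor
# against a pure outcome-based (per-resolved-ticket) alternative.

def seat_plus_labor_cost(agents, seat_price_mo, salary_mo):
    # Incumbent model: a software license per human agent, plus that agent's pay.
    return agents * (seat_price_mo + salary_mo)

def outcome_cost(tickets_mo, price_per_resolved):
    # Challenger model: pay only per resolved ticket, no seats, no headcount.
    return tickets_mo * price_per_resolved

incumbent = seat_plus_labor_cost(agents=20, seat_price_mo=100, salary_mo=4000)
challenger = outcome_cost(tickets_mo=30000, price_per_resolved=2)

# The "pricing umbrella" is the incumbent's combined total: the ceiling an
# outcome-based vendor can price under and still win the comparison.
print(incumbent)               # 82000
print(challenger)              # 60000
print(challenger < incumbent)  # True
```

Under these made-up numbers, the outcome-based vendor could charge up to roughly $2.73 per resolved ticket (82,000 / 30,000) before the enterprise would be better off keeping seats plus staff.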
Enterprises are used to deterministic outcomes and fixed annual costs. Usage-based pricing makes budgeting harder. But individual leaders seeing 10x efficiency gains won't wait for procurement to catch up. Savvy managers will find ways around traditional buying processes.
This feels like a generational reset of how businesses operate. Zero upfront costs, pay only for outcomes—that's not just a pricing model. That's the future of business.
The winning strategy in my books? Give the platform away for free. Let your agents read and write to existing systems through unstructured data—emails, calls, documents. Once you handle enough workflows, you become the new system of record.
·writing.nikunjk.com·
Your "Per-Seat" Margin is My Opportunity
Fight Theory
Fight Theory
Polls show that many of the policies enacted by President Biden are popular. His measures to reduce the cost of insulin and other drugs receive support from more than 80 percent of Americans. His infrastructure bill, his hawkish approach to China and his all-of-the-above energy policy, which combines expanded oil drilling with clean-energy subsidies, are popular, too. But voters obviously like some of his policies more than others. And an unusual pattern seems to be hurting Biden’s re-election campaign: Voters are less aware of his most popular policies than his more divisive ones.
Adam Green, co-founder of the Progressive Change Campaign Committee, a Democratic-aligned group, blames what he calls fight theory. “It’s not enough to have positive messaging,” Green said. “Voters must see drama, clash and an ongoing saga in order for our message to break through a cluttered news environment.”
Fights become the subject of political fundraising emails, activist campaigns, news stories and social media posts. Conflict attracts attention. The situation with Biden’s most popular economic policies — especially the reduction of medical costs — is somewhat different.
·nytimes.com·
Fight Theory
‘The Interview’: Nancy Pelosi Insists the Election Was Not a Rebuke of the Democrats
‘The Interview’: Nancy Pelosi Insists the Election Was Not a Rebuke of the Democrats
I don’t think we were clear enough by saying fewer people came in under President Biden than came under Donald Trump. It’s clarity of the message, and if that’s what Bernie’s talking about, and that’s what Joe Manchin’s talking about, we weren’t clear in our message as to what things are, then I agree with that. And that was one of the concerns I expressed about saying we haven’t put forth what was done. It’s our legacy, too. [Pelosi bangs on the table.] The rescue package. [Pelosi bangs on the table.] Infrastructure Bill. [Pelosi bangs on the table again.] The CHIPS Act. But that didn’t come across as well as it should have. So I think if you’re talking about messaging, you’re talking about communications, that’s one thing. If you’re talking about what we stand for versus what they stand for, the public’s in for a big surprise.
I think that any vice president is, like it or not, tied to the record of the president. I think what Biden did was great, and being tied to his record is a great thing but not the way the record was perceived. This is a record of job creation. Sixteen million jobs as opposed to the record of her opponent who had the worst job-creation record since Herbert Hoover. Yes, 16 million jobs, turning around inflation, all the things that we did to build the infrastructure of America, reduce the cost of prescription drugs.
President Trump has promised to use the Justice Department and the attorney general to go after his perceived enemies. He has said that over and over again, and you’re one of them. Well, you would think that that would be enough reason for people not to vote for him. But that’s what he said. So when people say to me, “Why do you think our democracy is in danger?” I’ll say, well, let’s define our democracy. What is democracy? Free and fair elections? Peaceful transfer of power, independence of the judiciary, the rule of law, all of those kinds of things are part of a democracy. So if he’s going after those things, and thank God, the only, shall we say, peace of mind that we have today is that we don’t have the assault on the system that would have been there had Kamala Harris won. That isn’t right. It shouldn’t be that way. And that he would say — maybe thought it, might even want to do it, but to say it and the American people will say, “That’s OK with me ”?
·nytimes.com·
‘The Interview’: Nancy Pelosi Insists the Election Was Not a Rebuke of the Democrats
Bernie Would Have Won
Bernie Would Have Won

AI summary: This article argues that Trump's 2024 victory represents the triumph of right-wing populism over neoliberalism, enabled by Democratic Party leadership's deliberate suppression of Bernie Sanders' left-wing populist movement. The piece contends that by rejecting class-focused politics in favor of identity politics and neoliberal policies, Democrats created a vacuum that Trump's authoritarian populism filled.

Here’s a warning and an admonition written in January 2019 by author and organizer Jonathan Smucker: “If the Dem Party establishment succeeds in beating down the fresh leadership and bold vision that's stepping up, it will effectively enable the continued rise of authoritarianism. But they will not wake up and suddenly grasp this. It's on us to outmaneuver them and win.”
There are a million surface-level reasons for Kamala Harris’s loss and systematic underperformance in pretty much every county and among nearly every demographic group. She is part of a deeply unpopular administration. Voters believe the economy is bad and that the country is on the wrong track. She is a woman and we still have some work to do as a nation to overcome long-held biases.  But the real problems for the Democrats go much deeper and require a dramatic course correction of a sort that, I suspect, Democrats are unlikely to embark upon. The bottom line is this: Democrats are still trying to run a neoliberal campaign in a post-neoliberal era. In other words, 2016 Bernie was right.
The lie that fueled the Iraq war destroyed confidence in the institutions that were the bedrock of this neoliberal order and in the idea that the U.S. could or should remake the world in our image. Even more devastating, the financial crisis left home owners destitute while banks were bailed out, revealing that there was something deeply unjust in a system that placed capital over people.
These events sparked social movements on both the right and the left. The Tea Party churned out populist-sounding politicians like Sarah Palin and birtherist conspiracies about Barack Obama, paving the way for the rise of Donald Trump. The Tea Party and Trumpism are not identical, of course, but they share a cast of villains: The corrupt bureaucrats or deep state. The immigrants supposedly changing your community. The cultural elites telling you your beliefs are toxic. Trump’s version of this program is also explicitly authoritarian. This authoritarianism is a feature not a bug for some portion of the Trump coalition which has been persuaded that democracy left to its own devices could pose an existential threat to their way of life.
On the left, the organic response to the financial crisis was Occupy Wall Street, which directly fueled the Bernie Sanders movement. Here, too, the villains were clear. In the language of Occupy it was the 1%, or as Bernie put it, the millionaires and billionaires. It was the economic elite and unfettered capitalism that had made it so hard to get by. Turning homes into assets of financial speculation. Wildly profiteering off of every element of our healthcare system. Busting unions so that working people had no collective power. This movement, in contrast to the right's, was explicitly pro-democracy, with a foundational view that in a contest between the 99% and the 1%, the 99% would prevail. And that a win would lead to universal programs like Medicare for All, free college, workplace democracy, and a significant hike in the minimum wage.
On the Republican side, Donald Trump emerged as a political juggernaut at a time when the party was devastated and rudderless, having lost to Obama twice in a row. This weakened state—and the fact that the Trump alternatives were uncharismatic drips like Jeb Bush—created a path for Trump to successfully execute a hostile takeover of the party.
Plus, right-wing populism embraces capital, and so it posed no real threat to the monied interests that are so influential within the party structures.
The Republican donor class was not thrilled with Trump’s chaos and lack of decorum but they did not view him as an existential threat to their class interests
The difference was that Bernie’s party takeover did pose an existential threat—both to party elites who he openly antagonized and to the party’s big money backers. The bottom line of the Wall Street financiers and corporate titans was explicitly threatened. His rise would simply not be allowed. Not in 2016 and not in 2020.
What’s more, Hillary Clinton and her allies launched a propaganda campaign to posture as if they were actually to the left of Bernie by labeling him and his supporters sexist and racist for centering class politics over identity politics. This in turn spawned a hell cycle of woke word-policing and demographic slicing and dicing and antagonism towards working class whites that only made the Democratic party more repugnant to basically everyone.
The path not taken in 2016 looms larger than ever. Bernie’s coalition was filled with the exact type of voters who are now flocking to Donald Trump: Working class voters of all races, young people, and, critically, the much-derided bros. The top contributors to Bernie’s campaign often held jobs at places like Amazon and Walmart. The unions loved him. And—never forget—he earned the coveted Joe Rogan endorsement that Trump also received the day before the election this year. It turns out, the Bernie-to-Trump pipeline is real! While that has always been used as an epithet to smear Bernie and his movement, with the implication that social democracy is just a cover for or gateway drug to right wing authoritarianism, the truth is that this pipeline speaks to the power and appeal of Bernie’s vision as an effective antidote to Trumpism. When these voters had a choice between Trump and Bernie, they chose Bernie. For many of them now that the choice is between Trump and the dried out husk of neoliberalism, they’re going Trump.
Maybe I will be just as wrong as I was about the election but it is my sense that with this Trump victory, authoritarian right politics have won the ideological battle for what will replace the neoliberal order in America. And yes, I think it will be ugly, mean, and harmful—because it already is.
·dropsitenews.com·
Bernie Would Have Won
Nike: An Epic Saga of Value Destruction | LinkedIn
Nike: An Epic Saga of Value Destruction | LinkedIn
Things seemed to go well at the beginning. Due to the pandemic and the objective challenges of the traditional brick-and-mortar business, the business operated by Nike Direct (the business unit in charge of DTC) was flying, justifying the CEO's major strategic decisions. Then, once normality came back, things slowly but steadily, quarter by quarter, showed that the line between being ambitious and being wrong was very thin.
In six months, hundreds of colleagues were fired, and with them Nike lost a solid process and thousands of years of experience and expertise in running, football, basketball, fitness, training, sportswear, etc., built over decades of footwear leadership (and apparel too). The product engine became gender-led: women, men, and kids (like Zara, GAP, H&M or any other generic fashion brand).
Consumers are not as elastic as some business leaders think or hope. And consumers are not as loyal as some business leaders think or hope. So, what happened? Simple. Many consumers - mainly occasional buyers - did not follow Nike (surprise, surprise) but continued shopping where they had been shopping before the decision of the CEO and the President of the Brand. So, once they could not find Nike sneakers in “their” stores – because Nike wasn’t serving those stores any longer – they simply opted for other brands.
Until the late 2010s, Nike had been in total offense mode (being #1 in every market, in every category, in every product BU, basically in every dimension), a sort of military occupation of the marketplace and a huge problem for competitors that did not know how to react under such domination. There was only one strategic focus: win anywhere. The new strategy put an end to the marketplace occupation. Nike opened unexpected spaces to competitors, small, medium, or large brands (with the exception of the company based in Herzogenaurach, which – as they usually do – copied and pasted the Nike strategy and executed it in a milder format).
One of the empirical laws of business says that online, the main lever of competition is “price” (as the organic consumer funnel is built on price comparison). Nike's proverbial ability to leverage the power of the brand to sell sneakers at $200 began to be threatened by the online appetite for discounts and the search for a definitive solution to the inventory issue. Gross margin – because of that – instead of growing along with the DTC business, showed a rapid decline due to a never-ending promotional posture on Nike.com.
Nike was built over 50 years on a very simple foundation: brand, product, and marketplace. The DC investment model, ever since Nike became a public company, has always been the same: invest at least one tenth of revenues in demand creation and sports marketing. The brand model has been just as simple: focus on innovation and inspiration, creativity and storytelling based on athlete-product synergy, leveraging the power of the emotions that sport can create, trying to inspire a growing number of athletes* (*if you have a body, you are an athlete) to play sport. That’s what made Nike the Nike we used to know, love, and admire, professionally and emotionally.
What happened in 2020? Well, the brand team shifted from brand marketing to digital marketing and from brand enhancing to sales activation.
A shift from CREATE DEMAND to SERVE AND RETAIN DEMAND, which meant that most of the investment was directed to those who were already Nike consumers.
As of 2021, to drive traffic to Nike.com, Nike started investing in programmatic advertising and performance marketing double or more the share of resources usually invested in the other brand activities.
The former CMO ignored the growing academic literature on the inefficiencies of investment in performance marketing and programmatic advertising, due to fraud, the rising costs of mediators, and declining consumer response to those activities.
Because of that, Nike invested a material amount of dollars (billions) into something that was less effective but easier to be measured vs something that was more effective but less easy to be measured.
To feed the digital marketing ecosystem, one of the historic functions of the marketing team (brand communications) was “de facto” absorbed and marginalized by the brand design team, which took the leadership in marketing content production (together with the mar-tech “scientists”). Nike didn’t need brand creativity anymore, just a polished and never stopping supply chain of branded stuff.
He made “Nike.com” the center of everything and diverted focus and dollars to it. Due to all of that, Nike hasn’t made a history making brand campaign since 2018, as the Brand organization had to become a huge sales activation machine.
·linkedin.com·
Nike: An Epic Saga of Value Destruction | LinkedIn
Culture Needs More Jerks | Defector
Culture Needs More Jerks | Defector
The function of criticism is and has always been to complicate our sense of beauty. Good criticism of music we love—or, occasionally, really hate—increases the dimensions and therefore the volume of feeling. It exercises that part of ourselves which responds to art, making it stronger.
The correction to critics’ failure to take pop music seriously is known as poptimism: the belief that pop music is just as worthy of critical consideration as genres like rock, rap or, god forbid, jazz. In my opinion, this correction was basically good. It’s fun and interesting to think seriously about music that is meant to be heard on the radio or danced to in clubs, the same way it is fun and interesting to think about crime novels or graphic design. For the critic, maybe more than for anyone else, it is important to remember that while a lot of great stuff is not popular, popular stuff can be great, too.
every good idea has a dumber version of itself on the internet. The dumb version of poptimism is the belief that anything sufficiently popular must be good. This idea is supported by certain structural forces, particularly the ability, through digitization, to count streams, pageviews, clicks, and other metrics so exactly that every artist and the music they release can be assigned a numerical value representing their popularity relative to everything else. The answer to the question “What do people like?” is right there on a chart, down to the ones digit, conclusively proving that, for example, Drake (74,706,786,894 lead streams) is more popular than The Weeknd (56,220,309,818 lead streams) on Spotify.
The question “What is good?” remains a matter of disagreement, but in the face of such precise numbers, how could you argue that the Weeknd was better? You would have to appeal to subjective aesthetic assessments (e.g. Drake’s combination of brand-checking and self-pity recreates neurasthenic consumer culture without transcending it) or socioeconomic context (e.g. Drake is a former child actor who raps about street life for listeners who want to romanticize black poverty without hearing from anyone actually affected by it, plus he’s Canadian) in a way that would ultimately just be your opinion. And who needs one jerk’s opinion when democracy is right there in the numbers?
This attitude is how you get criticism like “Why Normal Music Reviews No Longer Make Sense for Taylor Swift,” which cites streaming data (The Tortured Poets Department’s 314.5 million release-day streams versus Cowboy Carter’s 76.6 million) to argue that Swift is better understood not as a singer-songwriter but as an area of brand activity, along the lines of the Marvel Cinematic Universe or Star Wars. “The tepid music reviews often miss the fact that ‘music’ is something that Swift stopped selling long ago,” New Yorker contributor Sinéad O’Sullivan writes. “Instead, she has spent two decades building the foundation of a fan universe, filled with complex, in-sequence narratives that have been contextualized through multiple perspectives across eleven blockbuster installments. She is not creating standalone albums but, rather, a musical franchise.”
The fact that most cognitively normal adults regard these bands as children’s music is what makes their fan bases not just ticket-buyers but subcultures.
The power of the antagonist-subculture dynamic was realized by major record labels in the early 1990s, when the most popular music in America was called “alternative.”
For the person who is not into music—the person who just happens to be rapturously committed to the artists whose music you hear everywhere whether you want to or not, whose new albums are like iPhone releases and whose shows are like Disneyland—the critic is a foil.
·defector.com·
Culture Needs More Jerks | Defector
The Return of Ta-Nehisi Coates
The Return of Ta-Nehisi Coates
That it was complicated, he now understood, was “horseshit.” “Complicated” was how people had described slavery and then segregation. “It’s complicated,” he said, “when you want to take something from somebody.”
He had also been told that the conflict was “complicated,” its history tortuous and contested, and, as he writes, “that a body of knowledge akin to computational mathematics was needed to comprehend it.” He was astonished by the plain truth of what he saw: the walls, checkpoints, and guns that everywhere hemmed in the lives of Palestinians; the clear tiers of citizenship between the first-class Jews and the second-class Palestinians; and the undisguised contempt with which the Israeli state treated the subjugated other.
The most famous of Israel’s foundational claims — that it was a necessary sanctuary for one of the world’s most oppressed peoples, who may not have survived without a state of their own — is at the root of this complication and undergirds the prevailing viewpoint of the political-media-entertainment nexus. It is Israel’s unique logic of existence that has provided a quantum of justice to the Israeli project in the eyes of Americans and others around the world, and it’s what separates Jewish Israelis from the white supremacists of the Jim Crow South, who had no justice on their side at all.
“It’s kind of hard to remember, but even as late as 2014, people were talking about the Civil War as this complicated subject,” Jackson said. “Ta-Nehisi was going to plantations and hanging out at Monticello and looking at all the primary documents and reading a thousand books, and it became clear that the idea of a ‘complicated’ narrative was ridiculous.” The Civil War was, Coates concluded, solely about the South’s desire to perpetuate slavery, and the subsequent attempts over the next century and a half to hide that simple fact betrayed, he believed, a bigger lie — the lie that America was a democracy, a mass delusion that he would later call “the Dream” in Between the World and Me.
The hallmarks of The Atlantic’s coverage include variations of Israel’s seemingly limitless “right to defend itself”; an assertion that extremists on “both sides” make the conflict worse, with its corollary argument that if only Prime Minister Benjamin Netanyahu’s Jewish-supremacist government were ousted, then progress could be made; abundant sympathy for the suffering of Israelis and a comparatively muted response to the suffering of Palestinians; a fixation on the way the issue is debated in America, particularly on college campuses; and regular warnings that antisemitism is on the rise both in America and around the world.
the overall pattern reveals a distorting worldview that pervades the industry and, as Coates writes in The Message, results in “the elevation of factual complexity over self-evident morality.” “The view of mainstream American commentators is a false equivalence between subjugator and subjugated,” said Nathan Thrall, the Jerusalem-based author of the Pulitzer Prize–winning A Day in the Life of Abed Salama, as if the Israelis and the Palestinians were equal parties in an ancient tug-of-war.
For Coates, the problem for the industry at large partly stems from the perennial problem of inadequate representation. “It is extremely rare to see Palestinians and Arabs writing the coverage or doing the book reviews,” he said. “I would be interested if you took the New York Times and the Washington Post and The Wall Street Journal and looked at how many of those correspondents are Palestinian, I wonder what you would find.” (It’s a testament to just how polarizing the issue is that many Jewish Americans believe the bias in news media works the other way around, against Israel.)
American mainstream journalism, Coates says, defers to American authority. “It’s very similar,” he told me, “to how American journalism has been deferential to the cops. We privilege the cops, we privilege the military, we privilege the politicians. The default setting is toward power.”
in the total coverage, in all of the talk of experts and the sound bites of politicians and the dispatches of credentialed reporters, a sense of ambiguity is allowed to prevail. “The fact of the matter is,” he said, “that kid up at Columbia, whatever dumb shit they’re saying, whatever slogan I would not say that they would use, they are more morally correct than some motherfuckers that have won Pulitzer Prizes and National Magazine Awards and are the most decorated and powerful journalists.”
When I asked Coates what he wanted to see happen in Israel and Palestine, he avoided the geopolitical scale and tended toward the more specific — for example, to have journalists not be “shot by army snipers.” He said that the greater question was not properly for him; it belonged to those with lived experience and those who had been studying the problem for years.
On the importance of using moral rightness as a north star for pragmatic designs
“I have a deep-seated fear,” he told me, “that the Black struggle will ultimately, at its root, really just be about narrow Black interest. And I don’t think that is in the tradition of what our most celebrated thinkers have told the world. I don’t think that’s how Martin Luther King thought about the Black struggle. I know that’s not how Du Bois thought about the Black struggle. I know that’s not how Baldwin thought about the Black struggle. Should it turn out that we have our first Black woman president, and our first South Asian president, and we continue to export 2,000-pound bombs to perpetrate a genocide, in defense of a state that is practicing apartheid, I won’t be able to just sit here and shake my head and say, ‘Well, that is unfortunate.’ I’m going to do what I can in the time that remains, and the writing that I have, to not allow that to be, because that is existential death for the Black struggle, and for Black people, as far as I’m concerned.”
·nymag.com·
The Return of Ta-Nehisi Coates
New Apple Stuff and the Regular People
New Apple Stuff and the Regular People
"Will it be different?" is the key question the regular people ask. They don't want there to be extra steps or new procedures. They sure as hell don't want the icons to look different or, God forbid, be moved to a new place.
These bright and capable people who will one day help you through knee-replacement surgery all bought a Mac when they were college freshmen and then never updated it. Almost all of them still had the default programs in the dock. They are regular users. You, with all your fancy calendars, note-taking apps and your customized terminal, are an outlier. Never forget.
The majority of iPhone users and Mac owners have no idea what's coming though. They are going to wake up on Monday to an unwelcome notification that there is an update available. Many of them will ask their techie friends (like you) if there is a way to make the update notification go away. They will want to know if they have to install it.
·louplummer.lol·
New Apple Stuff and the Regular People
complete delegation
complete delegation
Linus shares his evolving perspective on chat interfaces and his experience building a fully autonomous chatbot agent. He argues that learning to trust and delegate to such systems without micromanaging the specifics is key to collaborating with autonomous AI agents in the future.
I've changed my mind quite a bit on the role and importance of chat interfaces. I used to think they were the primitive version of rich, creative, more intuitive interfaces that would come in the future; now I think conversational, anthropomorphic interfaces will coexist with more rich dexterous ones, and the two will both evolve over time to be more intuitive, capable, and powerful.
I kept checking the database manually after each interaction to see it was indeed updating the right records — but after a few hours of using it, I've basically learned to trust it. I ask it to do things, it tells me it did them, and I don't check anymore. Full delegation.
How can I trust it? High task success rate — I interact with it, and observe that it doesn't let me down, over and over again. The price for this degree of delegation is giving up control over exactly how the task is done. It often does things differently from the way I would, but that doesn't matter as long as outputs from the system are useful for me.
·stream.thesephist.com·
complete delegation
101 Additional Advices
101 Additional Advices
Forget trying to decide what your life’s destiny is. That’s too grand. Instead, just figure out what you should do in the next 2 years.
Try to define yourself by what you love and embrace, rather than what you hate and refuse.
Where you live—what city, what country—has more impact on your well being than any other factor. Where you live is one of the few things in your life you can choose and change.
Once a month take a different route home, enter your house by a different door, and sit in a different chair at dinner. No ruts.
Every now and then throw a memorable party. The price will be steep, but long afterwards you will remember the party, whereas you won’t remember how much is in your checking account.
Most arguments are not really about the argument, so most arguments can’t be won by arguing.
Invent your own definition of success. Shoot your arrows first and then paint a bull’s eye around where they land. You’re the winner!
There should be at least one thing in your life you enjoy despite being no good at it. This is your play time, which will keep you young. Never apologize for it.
You have 5 minutes to act on a new idea before it disappears from your mind.
The patience you need for big things is developed by your patience with the little things.
When you are stuck or overwhelmed, focus on the smallest possible thing that moves your project forward.
For steady satisfaction, work on improving your worst days, rather than your best days.
Your decisions will become wiser when you consider these three words: “…and then what?” for each choice.
If possible, every room should be constructed to provide light from two sides. Rooms with light from only one side are used less often, so when you have a choice, go with light from two sides.
There is a profound difference between thinking less of yourself (not useful), and thinking of yourself less (better).
Always ask yourself: what would change my mind?
Becoming one-of-a-kind is not a solo job. Paradoxically you need everyone else in the world to help make you unique.
If you need emergency help from a bystander, command them what to do. By giving them an assignment, you transform them from bewildered bystander to a responsible assistant.
The most common mistake we make is to do a great job on an unimportant task.
Don’t work for a company you would not invest money in, because when you are working you are investing the most valuable thing you have: your time.
Fail forward. Failing is not a disgrace if you keep failing better.
Do not cling to a mistake just because you spent a lot of time making it.
For small tasks the best way to get ready is to do it immediately.
What others want from you is mostly to be seen. Let others know you see them.
When you try something new, don’t think of it as a matter of success / failure, but as success / learning to succeed.
Use your honesty as a gift, not as a weapon. Your honesty should benefit others.
A good sign that you are doing the kind of work you should be doing is that you enjoy the tedious parts that other people find tortuous.
Celebrating the success of others costs you nothing, and increases the happiness of everyone, including you.
To tell a good story, you must reveal a surprise; otherwise it is just a report.
A long horizon allows you to compound small advances into quite large achievements.
Often ideas are rejected because of the tone of voice they are wrapped in. Humility covers many blemishes.
When you are right, you are learning nothing.
Very small things accumulate until they define your larger life. Carefully choose your everyday things.
If you are impressed with someone’s work, you should tell them, but even better, tell their boss.
Humility is mostly about being very honest about how much you owe to luck.
·kk.org·
101 Additional Advices
Vision Pro is an over-engineered “devkit” // Hardware bleeds genius & audacity but software story is disheartening // What we got wrong at Oculus that Apple got right // Why Meta could finally have its Android moment
Vision Pro is an over-engineered “devkit” // Hardware bleeds genius & audacity but software story is disheartening // What we got wrong at Oculus that Apple got right // Why Meta could finally have its Android moment
Some of the topics I touch on:
- Why I believe Vision Pro may be an over-engineered “devkit”
- The genius & audacity behind some of Apple’s hardware decisions
- Gaze & pinch is an incredible UI superpower and major industry ah-ha moment
- Why the Vision Pro software/content story is so dull and unimaginative
- Why most people won’t use Vision Pro for watching TV/movies
- Apple’s bet in immersive video is a total game-changer for live sports
- Why I returned my Vision Pro… and my Top 10 wishlist to reconsider
- Apple’s VR debut is the best thing that ever happened to Oculus/Meta
- My unsolicited product advice to Meta for Quest Pro 2 and beyond
Apple really played it safe in the design of this first VR product by over-engineering it. For starters, Vision Pro ships with more sensors than what’s likely necessary to deliver Apple’s intended experience. This is typical in a first-generation product that’s been under development for so many years. It makes Vision Pro start to feel like a devkit.
A sensor party: 6 tracking cameras, 2 passthrough cameras, 2 depth sensors (plus 4 eye-tracking cameras not shown)
It’s easy to understand two particularly important decisions Apple made for the Vision Pro launch:
1. Designing an incredible in-store Vision Pro demo experience, with the primary goal of getting as many people as possible to experience the magic of VR through Apple’s lenses — most of whom have no intention to even consider a $4,000 purchase. The demo is only secondarily focused on actually selling Vision Pro headsets.
2. Launching an iconic woven strap that photographs beautifully even though this strap simply isn’t comfortable enough for the vast majority of head shapes. It’s easy to conclude that this decision paid off because nearly every bit of media coverage (including and especially third-party reviews on YouTube) uses the woven strap despite the fact that it’s less comfortable than the dual loop strap that’s “hidden in the box”.
Apple’s relentless and uncompromising hardware insanity is largely what made it possible for such a high-res display to exist in a VR headset, and it’s clear that this product couldn’t possibly have launched much sooner than 2024 for one simple limiting factor — the maturity of micro-OLED displays plus the existence of power-efficient chipsets that can deliver the heavy compute required to drive this kind of display (i.e. the M2).
·hugo.blog·
Vision Pro is an over-engineered “devkit” // Hardware bleeds genius & audacity but software story is disheartening // What we got wrong at Oculus that Apple got right // Why Meta could finally have its Android moment
Death to the double diamond
Death to the double diamond
The key design skill is less about beautiful all-encompassing Figma documentation with all the kinds of journey maps and personas, or mastering a “process”. It’s about being so keenly situationally aware of what unknowns are in front of you so you can pick out what tools or design activities target them precisely.
It's not helpful to think of design as process of discovering, defining, ideating and delivering (or whatever version of those double diamond steps you prefer) because following those steps often does not get you closer to a solution.
·tangentdesign.substack.com·
Death to the double diamond
In Praise of Idleness, by Bertrand Russell | Harper's Magazine
In Praise of Idleness, by Bertrand Russell | Harper's Magazine
Originally written in 1932! From the Harper's Magazine archives.
I believed all that I was told and acquired a conscience which has kept me working hard down to the present moment. But although my conscience has controlled my actions, my opinions have undergone a revolution. I think that there is far too much work done in the world, that immense harm is caused by the belief that work is virtuous, and that what needs to be preached in modern industrial countries is quite different from what always has been preached.
what a man earns he usually spends, and in spending he gives employment. As long as a man spends his income he puts just as much bread into people’s mouths in spending as he takes out of other people’s mouths in earning. The real villain, from this point of view, is the man who saves. If he merely puts his savings in a stocking, like the proverbial French peasant, it is obvious that they do not give employment. If he invests his savings the matter is less obvious, and different cases arise.
In view of the fact that the bulk of the expenditure of most civilized governments consists in payments for past wars and preparation for future wars, the man who lends his money to a government is in the same position as the bad men in Shakespeare who hire murderers. The net result of the man’s economical habits is to increase the armed forces of the State to which he lends his savings. Obviously it would be better if he spent the money, even if he spent it on drink or gambling.
In these days, however, no one will deny that most enterprises fail. That means that a large amount of human labor, which might have been devoted to producing something which could be enjoyed, was expended on producing machines which, when produced, lay idle and did no good to anyone.
If he spent his money, say, in giving parties for his friends, they (we may hope) would get pleasure, and so would all those on whom he spent money, such as the butcher, the baker, and the bootlegger. But if he spends it (let us say) upon laying down rails for surface cars in some place where surface cars turn out to be not wanted, he has diverted a mass of labor into channels where it gives pleasure to no one
·harpers.org·
In Praise of Idleness, by Bertrand Russell | Harper's Magazine
Love letter to the liberal arts · Molly Mielke
Love letter to the liberal arts · Molly Mielke
You’re likely seen as a bright brain with a knack for solving problems, and what good are the liberal arts¹ and the humanities² in solving the world’s large, technically complex issues? You want your work to have impact and “matter” — something you know to require hard work, discipline, and things like “frameworks” and “mental models.” Tactical, practical, and efficient. But consider, for a second, your thinking. Where did your thoughts and beliefs come from? What about your conviction, your mission, your sense of purpose on this earth? These questions are why the liberal arts and the humanities, or subjects distinct from professional and technical subjects, exist.
The ecosystem that we inhabit as technologists was not built with humans in mind, it was built to run laps around other industries within the capitalist game, and it does this on the backs of the young people it exploits. In simpler terms: the status quo of technology was not designed to make you a happy, content, morally well-rounded young person. That, however, is precisely the purpose of examining the world through a liberal arts lens. Through this frame of view, we might think thoughts without action items, try opinions on for size, celebrate contradiction, and revel in the pursuit of understanding both each other and the world around us.
At their core, the liberal arts and the humanities serve as aggregated documentation on the human condition — the kind of documentation that is meant to be digested, discussed with others, and revisited from whichever angles serve you best along your journey.
It is difficult to advocate for the liberal arts by appealing to results or metrics. But our bias towards viewing these non-instrumental disciplines through a problem-solving lens is exactly why we need spaces that suggest other ways of seeing the world. While that lens grants impressive achievement, it might also leave you wondering why you were even chasing after the thing you achieved to begin with. Our goal is to show you that the problem-solving lens is one of many possible views you can have on the world. You can treat this view like a pair of glasses, one that ought to be regularly removed and replaced with a more reflective, contemplative, and critical lens.
·mollymielke.com·
Love letter to the liberal arts · Molly Mielke
r/threebodyproblem - Currently reading the first book, question for fans
r/threebodyproblem - Currently reading the first book, question for fans
It’s criticised a lot for lacking character depth and not focusing on the characters. I’d agree somewhat, but there are a few characters and one in particular which I felt a real connection with as the world unfolds in the later books. What it lacks in character charm, though, it makes up for in mind-bending sci-fi. Scale. The possibilities that could lay ahead. It focuses on mass psychology, how civilisations react, different ages etc. It’s about a much much much bigger picture and almost sacrifices character development to focus on that other stuff. I wouldn’t change a thing despite finding book one quite difficult too. My appreciation for it warped beyond recognition as I made my way through books 2 and 3.
Chinese is a utilitarian language, yes. It's also a language that heavily relies on context for impact and meaning. Cixin Liu's writing is no exception and is similar throughout his entire bibliography. It's very recognizable when he was truly inspired, e.g. the hairs on Ye Wenjie's cheeks standing up when she first stepped on Radar Peak, the making of the first sophon, etc. These moments increase in number progressively through the series, and Death's End is mostly one super inspired moment he obviously dreamed about writing for a long time after another, to the point where the final chapters are a mind-boggling rush through new concepts, eons and the coming together of numerous old concepts and plot threads. In short, this was written by a practicing engineer.
·reddit.com·
r/threebodyproblem - Currently reading the first book, question for fans
Privacy Fundamentalism
Privacy Fundamentalism
my critique of Manjoo’s article specifically and the ongoing privacy hysteria broadly is not simply about definitions or philosophy. It’s about fundamental assumptions. The default state of the Internet is the endless propagation and collection of data: you have to do work to not collect data on one hand, or leave a data trail on the other. This is the exact opposite of how things work in the physical world: there data collection is an explicit positive action, and anonymity the default.
I believe the privacy debate needs to be reset around these three assumptions:
1. Accept that privacy online entails trade-offs; the corollary is that an absolutist approach to privacy is a surefire way to get policy wrong.
2. Keep in mind that the widespread creation and spread of data is inherent to computers and the Internet, and that these qualities have positive as well as negative implications; be wary of what good ideas and positive outcomes are extinguished in the pursuit to stomp out the negative ones.
3. Focus policy on the physical and digital divide. Our behavior online is one thing: we both benefit from the spread of data and should in turn be more wary of those implications. Making what is offline online is quite another.
·stratechery.com·
Privacy Fundamentalism