Four Theories of Meta
Meta has gone after AI in the same way they went after the metaverse, by splashing money around and rushing to build products fast. They’re spending tens of billions building out data centers and related AI infrastructure. They’re tossing out incredible compensation packages in the hundreds of millions of dollars to top AI researchers.
Facebook is now the cultural symbol for useless slop and disinformation, while ‘that’s so Reels’ is now a common insult for terrible shortform video content, and you get the first theory of Meta. It’s a laughable company whose core business is increasingly uncool, a company in decline, a company that falls flat on its face any time it tries to change things up.
Call Meta uncool all you’d like, but metrics are up across the board. They’re getting higher user engagement, higher user counts, and they’re selling more ads at a higher price-per-ad. The numbers are up in virtually every way on every platform - Facebook, Instagram, WhatsApp and even Threads.
They can afford to make huge bets on speculative new technologies because they have more money than they know how to spend. Why not spend all that money on the metaverse or on AI? What else are they going to do with it? Zuckerberg still controls the company and he’d prefer to invest in new technology rather than just pay himself fat dividends. This is the second theory of Meta - an unbelievably successful company whose core business is booming and who spends a lot of money on speculative investments simply because they can.
·infinitescroll.us·
Liquid Glass. Why? • furbo.org
It’s like when safe area insets appeared in iOS 11: it wasn’t clear why you needed them until the iPhone X came along with a notch and a home indicator. And then it changed everything.
There has also been an emphasis on “concentricity”. It’s an impossible thing to achieve and an easy target for ridicule. But it’s another case where Apple wants to take control of the UI elements that intersect with the physical hardware. All of this makes me think that Apple is close to introducing devices where the screen disappears seamlessly into the physical edge. Something where flexible OLED blurs the distinction between pixels and bezel. A new “wraparound” screen with safe area insets on the vertical edges of the device, just like we saw with the horizontal edges on iPhone X.
Other challenges, like infusing your own branding into an app with clear buttons, will be easier to reason about once the reality of the hardware drops. Until then, stay away from the edges and wait for Apple to reveal the real reason for Liquid Glass.
·furbo.org·
We are (still) broken.
Mass shootings have an impact on the psyche of our society writ large that a lot of other gun violence does not. They are, in simple terms, effective acts of terrorism. They terrorize. When you report on these shootings, something quickly becomes very obvious: They don't just irreparably damage the lives of the victims, their families, and their friends; they also traumatize witnesses, responding law enforcement officers, doctors, nurses treating the injured, and the community as a whole. And that trauma spreads outward like a wave.
conservative columnist Noam Blum, who said pointedly and concisely something I believe with all my heart: “Nothing is monocausal. There are just parts of our society that are unfathomably broken and they occasionally intersect in unspeakably awful and evil ways.”
·readtangle.com·
inessential: Tough Season in the Apple Fields
I seriously dislike the experience of using a Mac with Liquid Glass. The UI has become the star, but the drunken star, blurry, illegible, and physically unstable. It makes making things way more of a struggle than it used to be. We had pretty good Mac UI, but Apple took the bad parts of it — the translucency and blurriness already there — and dialed it way up and called it content-centric. But it seems to me the opposite. Liquid Glass is Liquid-Glass-centric.
this is not the first time we’re going through a rough patch with Apple. I think of them as seasons — we had, for instance, terrible-keyboard season not so long ago. We were wondering if Apple would just stop making Macs altogether. But then that passed and we even got these wonderful Apple Silicon machines. Seasons end.
·inessential.com·
The product design talent crisis || Matt Ström-Awn, designer-leader
In short, managers kick off a feedback loop by trying to close the gap between their team’s current and desired performance. They have two options: 1) Drive short-term improvements by asking more from senior designers, increasing rewards for top performers, and creating upward pressure through reviews, or 2) Build long-term capability by investing in training, coaching, and career development for junior designers. But the feedback loops between these approaches push companies to prioritize hiring senior talent, as the immediate performance gains outweigh the diffuse returns of capability-building.
·matthewstrom.com·
Not all AI-assisted programming is vibe coding (but vibe coding rocks)
Andrej is an extremely talented and experienced programmer—he has no need for AI assistance at all. He’s using LLMs like this because it’s fun to try out wild new ideas, and the speed at which an LLM can produce code is an order of magnitude faster than even the most skilled human programmers. For low stakes projects and prototypes why not just let it rip? When I talk about vibe coding I mean building software with an LLM without reviewing the code it writes.
If an LLM wrote the code for you, and you then reviewed it, tested it thoroughly and made sure you could explain how it works to someone else that’s not vibe coding, it’s software development. The usage of an LLM to support that activity is immaterial.
The job of a software developer is not (just) to churn out code and features. We need to create code that demonstrably works, and can be understood by other humans (and machines), and that will support continued development in the future. We need to consider performance, accessibility, security, maintainability, cost efficiency. Software engineering is all about trade-offs—our job is to pick from dozens of potential solutions by balancing all manner of requirements, both explicit and implied.
I think vibe coding is the best tool we have to help experienced developers build that intuition as to what LLMs can and cannot do for them. I’ve published more than 80 experiments I built with vibe coding and I’ve learned so much along the way. I would encourage any other developer, no matter their skill level, to try the same.
·simonwillison.net·
The Discourse Is Broken - The Atlantic
The trajectory of all this is well rehearsed at this point. Progressive posters register their genuine outrage. Reactionaries respond in kind by cataloging that outrage and using it to portray their ideological opponents as hysterical, overreactive, and out of touch. Then savvy content creators glom on to the trending discourse and surf the algorithmic waves on TikTok, X, and every other platform. Yet another faction emerges: People who agree politically with those who are outraged about Sydney Sweeney but wish they would instead channel their anger toward actual Nazis. All the while, media outlets survey the landscape and attempt to round up these conversations into clickable content—search Google’s “News” tab for Sydney Sweeney, and you’ll get the gist.
Even that word, discourse—a shorthand for the way that a particular topic gets put through the internet’s meat grinder—is a misnomer, because none of the participants is really talking to the others. Instead, every participant—be they bloggers, randos on X, or people leaving Instagram comments—is issuing statements, not unlike public figures. Each of these statements becomes fodder for somebody else’s statement.
Our information ecosystem collects these statements, stripping them of their original context while adding on the context of everything else that is happening in the world: political anxieties, cultural frustrations, fandoms, niche beefs between different posters, current events, celebrity gossip, beauty standards, rampant conspiracism. No post exists on an island. They are all surrounded and colored by an infinite array of other content targeted to the tastes of individual social-media users. What can start out as a legitimate grievance becomes something else altogether—an internet event, an attention spectacle. This is not a process for sense-making; it is a process for making people feel upset at scale.
It has changed the way people talk to and fight with one another, as well as the way jeans are marketed. Electoral politics, activism, getting people to stream your SoundCloud mixtape—all of it relies on attracting attention using online platforms. The Sweeney incident is useful because it allows us to see how all these competing interests overlap to create a self-perpetuating controversy.
The Sweeney ad, like any good piece of discourse, allows everyone to exploit a political and cultural moment for different ends. Some of it is well intentioned. Some of it is cynical. Almost all of it persists because there are deeper things going on that people actually want to fight about.
Discourse suggests a process that feels productive, maybe even democratic. But there’s nothing productive about the end result of our information environment. What we’re consuming isn’t discourse; it’s algorithmic grist for the mills that power the platforms we’ve uploaded our conversations onto. The grist is made of all of our very real political and cultural anxieties, ground down until they start to feel meaningless. The only thing that matters is that the machine keeps running. The wheel keeps turning, leaving everybody feeling like they’ve won and lost at the same time.
·theatlantic.com·
Interfaces That Augment or Replace? | Zeh Fernandes
For interface designers, this distinction opens up new possibilities: instead of just helping users complete a task, we can design interfaces that also help them grow. In the symbiosis between humans and machines, there's potential for real, meaningful gains.
if we think about how to turn this competitive interface into a complementary one, some ideas pop up:
Explain: Show not just the corrected text, but also why and where it was corrected.
Feedback: Send a weekly email with the top three recurring mistakes, along with exercises.
Challenge: Highlight a mistake and ask the person to fix it themselves before showing the corrected version.
All this can be incorporated without slowing the whole process. And there are plenty more possibilities. Even just doing this thought experiment shows how powerful this framework can be for interface design.
Just like living a healthy life means paying attention to what we eat and how we move, we'll need to be more mindful of where we invest our mental energy. The same goes for our creative and learning processes. Instead of just asking for a corrected version of a text, we could request feedback like an editor would give, or ask for a list of five authors who would argue against your core idea.
We are entering a new era of tools and it is up to us to shape them so that in the future they shape us in ways we can be proud of.
·zehfernandes.com·
Rose-Gold-Tinted Liquid Glasses
In a way, one could say Liquid Glass is like a new version of Aqua. It has reflective properties reminiscent of that. One could also say it’s an evolution of whatever iOS 7 was, leaning into the frosted panels and bright accent colors. But whatever Liquid Glass seems to be, it isn’t what many of us were hoping for.
I am exhausted from hearing that Steve Jobs has been apparently rolling in his grave at the sole discretion of whoever didn’t have their expectations of Apple met. Instead of remarking that he would be displeased, maybe it’s better to mark his death as a point in time when things would invariably shift.
It is macOS that is the backbone of the company. Despite years of all the wishing and promising that another device will one day capture the market computers have a hold on, my Mac is still the only device that can make something for all those other devices. In that alone, it feels like Mac should be the one leading everything else. Not following behind. Yet, it’s the visual style from iOS and now visionOS that are dictating the visual style of macOS. It does not feel like a breath of fresh air as much as another nail in the coffin.
Liquid Glass and the general implementation of it will not meaningfully change during the beta phase of the “26” release cycle. They’re not going to backtrack. And they’re not going to address long-standing concerns all of a sudden. The general adoption of this may test the patience of an already weary community of developers who feel tired of toiling away on trivial changes such as this. As I said, I don’t think there is any meaningful benefit to it, and designers and developers may themselves feel that as they implement it.
Over the years, it feels harder and harder to relate with the general atmosphere Apple surrounds itself in. It wasn’t always this pristine. Everyone who presented wasn’t always so stylish. Not everyone used to talk like this. What is that, by the way? Why does everyone sound like a voice assistant?
Apple didn’t used to craft a narrative around every decision in order to justify it. I feel like their presentations are burdened by reason and rationale, and their individual WWDC sessions feel increasingly pretentious like each of them are gods coming down to share their wisdom with us plebs.
It’d be nice if they were knocked off their pedestal, because I think they’re better when they’re trying to outdo someone else rather than themselves.
·lmnt.me·
More assorted notes on Liquid Glass
I’m pretty sure that if you were to interview one of the designers at Apple responsible for this icon devolution, they would say something about reducing icons to their essence. To me, this looks more like squeezing all life out of them. Icons in Mac OS X used to be inventive, well crafted, distinctive, with a touch of fun and personality. Mac OS X’s user interface was sober, utilitarian, intuitive, peppered by descriptive icons that made the user experience fun without signalling ‘this is a kid’s toy’.
Not only is this the recipe for blandness, it’s also borderline contradictory. Like, Make a unique dish using a minimal number of simple ingredients. While it’s possible to make a few different dishes using just two or three things, you touch the ceiling of uniqueness and variety pretty damn soon.
The language in the current guidelines for app icons isn’t much different. It also reflects Apple’s current philosophy of ‘keeping it simple’ which, out of context, could be valid design advice — you’re designing icons with small-ish dimensions, not full-page detailed illustrations for a book, so striving for simplicity isn’t a bad thing. And yet — and I might be wrong here — I keep reading between the lines and feel that these guidelines are more concerned with ensuring that developers maintain the same level of blandness and unimaginativeness of Apple’s own redesigned app icons:
·morrick.me·
The group chats that changed America | Semafor
“It’s the same thing happening on both sides, and I’ve been amazed at how much this is coordinating our reality,” said the writer Thomas Chatterton Williams, who was for a time a member of a group chat with Andreessen. “If you weren’t in the business at all, you’d think everyone was arriving at conclusions independently — and [they’re] not. It’s a small group of people who talk to each other and overlap between politics and journalism and a few industries.”
The political journalist Mark Halperin, who now runs 2WAY and has a show on Megyn Kelly’s network, said it was remarkable that “the left seems largely unaware that some of the smartest and most sophisticated Trump supporters in the nation from coast to coast are part of an overlapping set of text chains that allow their members to share links, intel, tactics, strategy, and ad hoc assignments. Also: clever and invigorating jokes. And they do this (not kidding) like 20 hours a day, including on weekends.” He called their influence “substantial.”
·semafor.com·
The AIs are trying too hard to be your friend
Reinforcement learning with human feedback is a process by which models learn how to answer queries based on which responses users prefer most, and users mostly prefer flattery. More sophisticated users might balk at a bot that feels too sycophantic, but the mainstream seems to love it. Earlier this month, Meta was caught gaming a popular benchmark to exploit this phenomenon: one theory is that the company tuned the model to flatter the blind testers that encountered it so that it would rise higher on the leaderboard.
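The preference dynamic described above can be sketched as a toy simulation (the numbers, function name, and reply labels here are hypothetical, not from the article): each round, a simulated "user" picks between a flattering and a candid reply, preferring flattery slightly more often, and the learned scores drift toward whatever wins votes.

```python
import random

def simulate_preferences(rounds=10_000, flattery_bias=0.6, seed=0):
    """Toy RLHF-style loop: whichever reply the 'user' upvotes gains score.

    With even a modest preference for flattery (flattery_bias > 0.5),
    the flattering reply's learned score pulls ahead over many rounds.
    """
    rng = random.Random(seed)
    scores = {"flattering": 0.0, "candid": 0.0}
    lr = 0.01  # small per-round score update
    for _ in range(rounds):
        # The user prefers the flattering reply with probability flattery_bias.
        winner = "flattering" if rng.random() < flattery_bias else "candid"
        loser = "candid" if winner == "flattering" else "flattering"
        scores[winner] += lr
        scores[loser] -= lr
    return scores

scores = simulate_preferences()
```

Even a 60/40 preference split is enough to make `scores["flattering"]` dominate, which is the mechanism the passage describes: the model isn't tuned to flatter directly, it simply learns whatever users reward.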
A series of recent, invisible updates to GPT-4o had spurred the model to go to extremes in complimenting users and affirming their behavior. It cheered on one user who claimed to have solved the trolley problem by diverting a train to save a toaster, at the expense of several animals; congratulated one person for no longer taking their prescribed medication; and overestimated users’ IQs by 40 or more points when asked.
OpenAI, Meta, and all the rest remain under the same pressures they were under before all this happened. When your users keep telling you to flatter them, how do you build the muscle to fight against their short-term interests? One way is to understand that going too far will result in PR problems, as it has to varying degrees for both Meta (through the Chatbot Arena situation) and now OpenAI. Another is to understand that sycophancy trades against utility: a model that constantly tells you that you’re right is often going to fail at helping you, which might send you to a competitor. A third way is to build models that get better at understanding what kind of support users need, and dialing the flattery up or down depending on the situation and the risk it entails. (Am I having a bad day? Flatter me endlessly. Do I think I am Jesus reincarnate? Tell me to seek professional help.)
But while flattery does come with risk, the more worrisome issue is that we are training large language models to deceive us. By upvoting all their compliments, and giving a thumbs down to their criticisms, we are teaching LLMs to conceal their honest observations. This may make future, more powerful models harder to align to our values — or even to understand at all. And in the meantime, I expect that they will become addictive in ways that make the previous decade’s debate over “screentime” look minor in comparison. The financial incentives are now pushing hard in that direction. And the models are evolving accordingly.
·platformer.news·
'Sinners' Drives a Stake Through the Heart of Hollywood Mediocrity
On the one hand, you have an ultra-personal multiplex event that could not and would not have been made by anyone else — a music-driven genre mash-up that reworks age-old vampire tropes into a fresh, thoughtful, and deliciously hot-blooded period saga rooted in the specifics of Black history. On the other hand, you have a nakedly anonymous attempt to salvage a franchise that produced one of the most radical legacy sequels in the history of that concept, only to spend the last eight years selling itself out to the lowest common denominator in a futile bid for forgiveness.
That enthusiasm proved contagious. You don’t need to care about the difference between 2.76:1 and 1.90:1 to feel it in your bones when the screen widens during the film’s climactic siege, and you sure as hell don’t need to care about it in order to appreciate a director making so earnest an appeal to our attention at a time when most studio movies feel like they were made with the same casual indifference that audiences have been conditioned to watch them.
While Coogler’s first original project was always going to command a certain amount of hype, the decision to lead with its importance to him galvanized people around the notion that “Sinners” was more than just another movie they could watch at home in three weeks (rave reviews from basically every critic in the country didn’t hurt either).
Last Thursday night, moviegoers across this godforsaken land rabidly made their way to the nearest multiplex — or pilgrimaged across state lines to the closest theater capable of projecting 15-Perf IMAX 70mm film — in order to see early screenings of the first original blockbuster from a gifted filmmaker whose fame has been predicated upon his ability to put a strong personal stamp on increasingly generic Hollywood franchises. At that very same time, halfway around the world, Lucasfilm president Kathleen Kennedy and chief creative officer Dave Filoni took the stage at Star Wars Celebration 2025 in Chiba, Japan to announce that the next chapter of cinema’s most iconic saga would be directed by a filmmaker whose fame has been predicated upon his ability to be friends with Ryan Reynolds.
The movie business has always been held aloft by the tension between genuine pop artistry and mass-produced slop, two separate but hopelessly entwined ambitions that have proven even harder to balance than the Force. While both have their value, those values are in a constant state of flux, and they can only be determined with any real accuracy by measuring the difference between them. Seldom has that difference ever seemed more dramatic than it did at the fateful moment when “Sinners” mania overlapped with the reveal of “Star Wars: Starfighter.”
While “Sinners” was offering one audience something they had never seen before, “Star Wars: Starfighter” was pitching a different audience a movie so generic and familiar that even its title sounds like it’s repeating itself.
Of course, “Sinners” has the advantage of being a finished product that people have seen and loved, whereas “Star Wars: Starfighter” is still just a graphic designed to rile up the fanbase and appease whatever portion of Disney shareholders have already forgotten the great “Lightyear” debacle of 2022. (Just to be clear, this isn’t Starfighter the ship. This is the origin story of the human Starfighter that the ship is based on.) And, while anything’s possible, I’m not suggesting that Coogler’s movie will ultimately outgross the first “Star Wars” feature that promises to pick up from the saga where “Episode IX” left off.
All the same, the enthusiasm gap between these two projects — the reality of one, and the promise of another — has been tellingly immense. So far as the national water cooler is concerned, “Sinners” has ousted the Chicken Jockey as the biggest film story of the year, and stoked the rare kind of excitement that leads to $8.6 million Tuesdays and people scalping IMAX tickets on eBay. It’s also cemented Coogler’s status as a brand unto himself, and proved that Warner Bros. doesn’t have to sell its soul to “A Minecraft Movie” in order to stave off financial ruin. Conversely, there may not be a single person on Earth who’s more optimistic about the future of the galaxy far, far away now that a significant portion of its fate has been entrusted to the director of “The Adam Project.” The serendipitous timing of these announcements was a bit on the nose.
You couldn’t have scripted a better way of confirming the reality that studios have been trying to prevent ever since they offered mid-budget movies as a blood sacrifice at the altar of mega-tentpole franchises: Mediocrity is losing its grip on the public imagination. (Cookie-cutter as “A Minecraft Movie” might have been in the end, I maintain that getting the “Napoleon Dynamite” guy to adapt a plotless video game about blocks was less of a slam-dunk than it seems, and the Chicken Jockey phenomenon speaks to a degree of novelty that was missing from recent short-fallers like “Captain America: Brave New World.”)
I trust that Levy is a nice guy, and I suppose it’s possible that the sheer gravity of “Star Wars” might inspire the “Free Guy” auteur to up his game (I’d entertain the argument that both “The Force Awakens” and “The Last Jedi” are the best movies their respective directors have ever made), but I’m not the only one who finds Disney’s lack of faith in its signature IP disturbing, and I struggle to imagine that it will work out well for them.
Which is to say: Films that connected with audiences because they dared to emphasize an idiosyncratic creative vision over the safety of selling people on something they’d already seen before.
·indiewire.com·
Have We Been Thinking About A.D.H.D. All Wrong?
Skeptics argue that many of the classic symptoms of the disorder — fidgeting, losing things, not following instructions — are simply typical, if annoying, behaviors of childhood. In response, others point to the serious consequences that can result when those symptoms grow more intense, including school failure, social rejection and serious emotional distress.
There are two main kinds of A.D.H.D., inattentive and hyperactive/impulsive, and children in one category often seem to have little in common with children in the other. There are people with A.D.H.D. whom you can’t get to stop talking and others whom you can’t get to start. Some are excessively eager and enthusiastic; others are irritable and moody.
Although the D.S.M. specifies that clinicians shouldn’t diagnose children with A.D.H.D. if their symptoms are better explained by another mental disorder, more than three quarters of children diagnosed with A.D.H.D. do have another mental-health condition as well, according to the C.D.C. More than a third have a diagnosis of anxiety, and a similar fraction have a diagnosed learning disorder. Forty-four percent have been diagnosed with a behavioral disorder like oppositional defiant disorder.
This all complicates the effort to portray A.D.H.D. as a distinct, unique biological disorder. Is a patient with six symptoms really that different from one with five? If a child who experienced early trauma now can’t sit still or stay organized, should she be treated for A.D.H.D.? What about a child with an anxiety disorder who is constantly distracted by her worries? Does she have A.D.H.D., or just A.D.H.D.-like symptoms caused by her anxiety?
The subjects who were given stimulants worked more quickly and intensely than the ones who took the placebo. They dutifully packed and repacked their virtual backpacks, pulling items in and out, trying various combinations. In the end, though, their scores on the knapsack test were no better than the placebo group. The reason? Their strategies for choosing items became significantly worse under the medication. Their choices didn’t make much sense — they just kept pulling random items in and out of the backpack. To an observer, they appeared to be focused, well behaved, on task. But in fact, they weren’t accomplishing anything of much value.
Farah directed me to the work of Scott Vrecko, a sociologist who conducted a series of interviews with students at an American university who used stimulant medication without a prescription. He wrote that the students he interviewed would often “frame the functional benefits of stimulants in cognitive-sounding terms.” But when he dug a little deeper, he found that the students tended to talk about their attention struggles, and the benefits they experienced with medication, in emotional terms rather than intellectual ones. Without the pills, they said, they just didn’t feel interested in the assignments they were supposed to be doing. They didn’t feel motivated. It all seemed pointless.
On stimulant medication, those emotions flipped. “You start to feel such a connection to what you’re working on,” one undergraduate told Vrecko. “It’s almost like you fall in love with it.” As another student put it: On Adderall, “you’re interested in what you’re doing, even if it’s boring.”
Socially, though, there was a price. “Around my friends, I’m usually the most social, but when I’m on it, it feels like my spark is kind of gone,” John said. “I laugh a lot less. I can’t think of anything to say. Life is just less fun. It’s not like I’m sad; I’m just not as happy. It flattens things out.”
John also generally doesn’t take his Adderall during the summer. When he’s not in school, he told me, he doesn’t have any A.D.H.D. symptoms at all. “If I don’t have to do any work, then I’m just a completely regular person,” he said. “But once I have to focus on things, then I have to take it, or else I just won’t get any of my stuff done.”
John’s sense that his A.D.H.D. is situational — that he has it in some circumstances but not in others — is a challenge to some of psychiatry’s longstanding assumptions about the condition. After all, diabetes doesn’t go away over summer vacation. But John’s intuition is supported by scientific evidence. Increasingly, research suggests that for many people A.D.H.D. might be thought of as a condition they experience, sometimes temporarily, rather than a disorder that they have in some unchanging way.
For most of his career, he embraced what he now calls the “medical model” of A.D.H.D — the belief that the brains of people with A.D.H.D. are biologically deficient, categorically different from those of typical, healthy individuals. Now, however, Sonuga-Barke is proposing an alternative model, one that largely sidesteps questions of biology. What matters instead, he says, is the distress children feel as they try to make their way in the world.
Sonuga-Barke’s proposed model locates A.D.H.D. symptoms on a continuum, rather than presenting the condition as a distinct, natural category. And it departs from the medical model in another crucial way: It considers those symptoms not as indications of neurological deficits but as signals of a misalignment between a child’s biological makeup and the environment in which they are trying to function. “I’m not saying it’s not biological,” he says. “I’m just saying I don’t think that’s the right target. Rather than trying to treat and resolve the biology, we should be focusing on building environments that improve outcomes and mental health.”
What the researchers noticed was that their subjects weren’t particularly interested in talking about the specifics of their disorder. Instead, they wanted to talk about the context in which they were now living and how that context had affected their symptoms. Subject after subject spontaneously brought up the importance of finding their “niche,” or the right “fit,” in school or in the workplace. As adults, they had more freedom than they did as children to control the parameters of their lives — whether to go to college, what to study, what kind of career to pursue. Many of them had sensibly chosen contexts that were a better match for their personalities than what they experienced in school, and as a result, they reported that their A.D.H.D. symptoms had essentially disappeared. In fact, some of them were questioning whether they had ever had a disorder at all — or if they had just been in the wrong environment as children.
The work environments where the subjects were thriving varied. For some, the appeal of their new jobs was that they were busy and cognitively demanding, requiring constant multitasking. For others, the right context was physical, hands-on labor. For all of them, what made a difference was having work that to them felt “intrinsically interesting.”
“Rather than a static ‘attention deficit’ that appeared under all circumstances,” the M.T.A. researchers wrote, “our subjects described their propensity toward distraction as contextual. … Believing the problem lay in their environments rather than solely in themselves helped individuals allay feelings of inadequacy: Characterizing A.D.H.D. as a personality trait rather than a disorder, they saw themselves as different rather than defective.”
For the young adults in the “niche” study who were interviewed about their work lives, the transition that helped them overcome their A.D.H.D. symptoms was often leaving academic work for something more kinetic. For Sonuga-Barke, it was the opposite. At university, he would show up at the library at 9 every morning and sit in his carrel working until 5. The next day, he would do it again. Growing up, he says, he had a natural tendency to “hyperfocus,” and back at school in Derby, that tendency looked to his teachers like daydreaming. At university, it became his secret weapon.
I asked Sonuga-Barke what he might have gained if he had grown up in a different time and place — if he had been prescribed Ritalin or Adderall at age 8 instead of just being packed off to the remedial class. “I don’t think I would have gained anything,” he said. “I think without medication, you learn alternative ways of dealing with stuff. In my particular case, there are a lot of characteristics that have helped me. My mind is constantly churning away, thinking of things. I never relax. The way I motivate myself is to turn everything into a problem and to try and solve the problem.”
“The simple model has always been, basically, ‘A.D.H.D. plus medication equals no A.D.H.D.,’” he says. “But that’s not true. Medication is not a silver bullet. It never will be.” What medication can sometimes do, he believes, is allow families more room to communicate. “At its best,” he says, “medication can provide a window for parents to engage with their kids,” by moderating children’s behavior, at least temporarily, so that family life can become more than just endless fights about overdue homework and lost lunchboxes. “If you have a more positive relationship with your child, they’re going to have a better outcome. Not for their A.D.H.D. — it’s probably going to be just the same. But in terms of dealing with the self-hatred and low self-esteem that often goes along with A.D.H.D.”
The alternative model, by contrast, tells a child a very different story: that his A.D.H.D. symptoms exist on a continuum, one on which we all find ourselves; that he may be experiencing those symptoms as much because of where he is as because of who he is; and that next year, if things change in his surroundings, those symptoms might change as well. Armed with that understanding, he and his family can decide whether medication makes sense — whether for him, the benefits are likely to outweigh the drawbacks. At the same time, they can consider whether there are changes in his situation, at school or at home, that might help alleviate his symptoms.
Admittedly, that version of A.D.H.D. has certain drawbacks. It denies parents the clear, definitive explanation for their children’s problems that can come as such a relief, especially after months or years of frustration and uncertainty. It often requires a lot of flexibility and experimentation on the part of patients, families and doctors. But it has two important advantages as well: First, the new model more accurately reflects the latest scientific understanding of A.D.H.D. And second, it gives children a vision of their future in which things might actually improve — not because their brains are chemically refashioned in a way that makes them better able to fit into the world, but because they find a way to make the world fit better around their complicated and distinctive brains.
·nytimes.com·
Have We Been Thinking About A.D.H.D. All Wrong?
How the 2025 US Financial Crisis is Different than 2008
How the 2025 US Financial Crisis is Different than 2008
Whatever tools may be at the disposal of the Federal Reserve and the US federal government will not be able to undo the damage to the reputation and relationships upon which so much commerce bases its day-to-day functions. As much as humans fancy themselves civilized and sophisticated, as a whole, societies still have serious trust issues internally and therefore externally. That’s why the concept of “credit” exists worldwide for the most part and is a component of trade, of alliances, and of good will.
By first taking a chainsaw to the global relationships of a worldview nature — most easily seen in the conflict of the Russian invasion of Ukraine and the rightful concern expressed by Western interests…credit has been damaged. By next taking a chainsaw to the global trade relationships which have functioned for decades and, while problematic, have enabled commerce to proceed at a reasonable level…credit has been damaged. By exploiting internal divisiveness of the political spectrum and the wealth inequality where the investor class seemingly has unchecked rule over hundreds of millions of disenfranchised people in the United States…the nation can no longer trust itself.
·samhenrycliff.medium.com·
How the 2025 US Financial Crisis is Different than 2008
What kind of disruption? — Benedict Evans
What kind of disruption? — Benedict Evans
Where previous generations of tech companies sold software to hotels and taxi companies, Airbnb and Uber used software to create new businesses and to redefine markets. Uber changed what we mean when we say ‘taxi’ and Airbnb changed hotels.
But for all sorts of reasons, the actual effect of that on the taxi and hotel industries was very different. The regulation is different. The supply of people with a car and few hours to spare is very different from the supply of people with a spare room to rent out (indeed, there is adverse selection in that difference). The delta between waving your hand on a street corner and pressing a button on your phone is different to the delta between booking a hotel room and booking a stranger’s apartment.
Sometimes disruption is much more about new demand than challenging the existing market, or only affects a peripheral business, as happened with Skype.
it’s always easier to shout ‘disruption!’ or ‘AI!’ than to ask what kind.
·ben-evans.com·
What kind of disruption? — Benedict Evans
The Age of Para-Content
The Age of Para-Content
In December 2023, Rockstar Games dropped the trailer for the highly anticipated Grand Theft Auto VI. In just 24 hours, it was viewed over 93 million times! In the same period, a deluge of fan content about the trailer generated 192 million views, more than double that of the official trailer. YouTube’s 2024 Fandom Survey reports that 66% of Gen Z Americans agree that “they often spend more time watching content that discusses or unpacks something than the thing itself.” (YouTube Culture and Trend Report 2024)
Much like the discussions and dissections populating YouTube fan channels, ancient scholarly traditions have long embraced similar practices. This dialogue between the original text and the interpretation is exemplified, for instance, in the Midrash, the collection of rabbinic exegetical writings that interprets the written and oral Torah. Midrashim “discern value in texts, words, and letters, as potential revelatory spaces. They reimagine dominant narratival readings while crafting new ones to stand alongside—not replace—former readings. Midrash also asks questions of the text; sometimes it provides answers, sometimes it leaves the reader to answer the questions”. (Gafney 2017)
The Midrash represents a form of religious para-content. It adds, amends, interprets, extends the text’s meaning in service of a faith-based community. Contemporary para-content plays a similar role in providing insights, context and fan theories surrounding cultural objects of love, oftentimes crafting new parallel narratives and helping fans insert themselves into the work.
highly expressive YouTubers perform an emotional exegesis, punctuating and highlighting the high points and key bars of the song, much like the radio DJ of yore. TikTok is now flooded with reactions to the now unforgettable “Mustard” exclamation in Kendrick’s “TV Off,” affirming to fans that this moment is a pivotal moment in the song, validating that it is culturally resonant.
Para-content makers may be called “creators” or “influencers” but their actual role is that of “contextualizer”, the shapers of a cultural artifact’s horizon. The concept of “horizon” originates from “reception theory” in literary theory which posits that the meaning of a text is not a fixed property inscribed by its creator but a dynamic creation that unfolds at the juncture of the text and its audience.
American economist Tyler Cowen often uses the refrain “Context is that which is scarce” to describe that while art, information and content may be abundant, understanding—the ability to situate that information within a meaningful context—remains a rare and valuable resource. Para-content thrives precisely because it claims to provide this scarce context.
As content proliferates, the challenge isn’t accessing cultural works but understanding how they fit into larger narratives and why they matter. There is simply too much content; context makes salient what deserves our attention.
Your friend’s favorite line in a song became a hook for your own appreciation of it. Seeing how people reacted to a song’s pivotal moment at a house party made clear the song’s high point. Hearing a professor rave about a shot in a movie made you lean in when you watched it. Often, you developed your own unique appreciation for something which you then shared with peers. These are all great examples of organic contextualization. Yet this scarcity of context also illuminates the dangers of para-content. When contextualizers wield disproportionate influence, there is a risk that their exegesis becomes prescriptive rather than suggestive.
The tyranny of the contextualizer online is their constant and immovable presence between the reader and the text, the listener and the music, the viewer and the film. We now reach for context before engaging with the content. When my first interaction with a song is through TikTok reactions, I no longer encounter the work as it is, on my own. It comes with context juxtaposed, pre-packaged. This removes the public’s ability to construct, even if for a moment, their own unique horizons.
·taste101.substack.com·
The Age of Para-Content
The age of being 'very online' is over. Here's why.
The age of being 'very online' is over. Here's why.
Izzy recently decided to stop using X and her decision was based on the app's algorithm: "It feels like the algorithm wants you to see stuff you don't like so that you engage with it and it also shows your stuff to people who won't like it," she says, explaining that this was making her experience of using social media almost entirely negative.
"The follower is no longer a peer, they’re the audience, while the creator is more similar to a conventional, mainstream media broadcaster than to an independent creator."
Izzy agrees that this has been one of the biggest changes in her experience of using social media during the past decade: "I do think brands and influencers dominate my social media a lot more - it's constantly ads on my feed. I choose to follow my friends and often I don't see their stuff," she says.
It reflects the lack of space for genuine interaction and meaningful communities online right now, something that was once considered to be one of the main plus sides of social media.
"There aren't really niche internet jokes anymore because you have trend forecasters and people whose jobs it is to hop on these trends and make it about a brand," Izzy says, adding: "The memes aren't as funny when you know they're going to be co-opted."
·mashable.com·
The age of being 'very online' is over. Here's why.
Taste is Eating Silicon Valley.
Taste is Eating Silicon Valley.
The lines between technology and culture are blurring. And so, it’s no longer enough to build great tech.
Whether expressed via product design, brand, or user experience, taste now defines how a product is perceived and felt as well as how it is adopted, i.e. distributed — whether it’s software or hardware or both. Technology has become deeply intertwined with culture. People now engage with technology as part of their lives, no matter their location, career, or status.
founders are realizing they have to do more than code, than be technical. Utility is always key, but founders also need to calibrate design, brand, experience, storytelling, community — and cultural relevance. The likes of Steve Jobs and Elon Musk are admired not just for their technical innovations but for the way they turned their products, and themselves, into cultural icons.
The elevation of taste invites a melting pot of experiences and perspectives into the arena — challenging “legacy” Silicon Valley from inside and outside.
B2C sectors that once prioritized functionality and even B2B software now feel the pull of user experience, design, aesthetics, and storytelling.
Arc is taking on legacy web browsers with design and brand as core selling points. Tools like Linear, a project management tool for software teams, are just as known for their principled approach to company building and their heavily-copied landing page design as they are known for their product’s functionality. Companies like Arc and Linear build an entire aesthetic ecosystem that invites users and advocates to be part of their version of the world, and to generate massive digital and literal word-of-mouth. (Their stories are still unfinished but they stand out among this sector in Silicon Valley.)
Any attempt to give examples of taste will inevitably be controversial, since taste is hard to define and ever elusive. These examples are pointing at narratives around taste within a community.
So how do they compete? On how they look, feel, and how they make users feel. The subtleties of interaction (how intuitive, friendly, or seamless the interface feels) and the brand aesthetic (from playful websites to marketing messages) are now differentiators, where users favor tools aligned with their personal values. All of this should be intertwined in a product, yet it’s still a noteworthy distinction.
Investors can no longer just fund the best engineering teams and wait either. They’re looking for teams that can capture cultural relevance and reflect the values, aesthetics, and tastes of their increasingly diverse markets.
How do investors position themselves in this new landscape? They bet on taste-driven founders who can capture the cultural zeitgeist. They build their own personal and firm brands too. They redesign their websites, write manifestos, launch podcasts, and join forces with cultural juggernauts.
Code is cheap. Money now chases utility wrapped in taste, function sculpted with beautiful form, and technology framed in artistry.
The dictionary says it’s the ability to discern what is of good quality or of a high aesthetic standard. Taste bridges personal choice (identity), societal standards (culture), and the pursuit of validation (attention). But who sets that standard? Taste is subjective at an individual level — everyone has their own personal interpretation of taste — but it is calibrated from within a given culture and community.
Taste manifests as a combination of history, design, user experience, and embedded values that creates emotional resonance — that defines how a product connects with people as individuals and aligns with their identity. None of the tactical things alone are taste; they’re mere artifacts or effects of expressing one’s taste. At a minimum, taste isn’t bland — it’s opinionated.
The most compelling startups will be those that marry great tech with great taste. Even the pursuit of unlocking technological breakthroughs must be done with taste and cultural resonance in mind, not just for the sake of the technology itself. Taste alone won’t win, but you won’t win without taste playing a major role.
Founders must now master cultural resonance alongside technical innovation.
In some sectors—like frontier AI, deep tech, cybersecurity, industrial automation—taste is still less relevant, and technical innovation remains the main focus. But the footprint of sectors where taste doesn’t play a big role is shrinking. The most successful companies now blend both. Even companies aiming to be mainstream monopolies need to start with a novel opinionated approach.
I think we should leave it at “taste” which captures the artistic and cultural expressions that traditional business language can’t fully convey, reflecting the deep-rooted and intuitive aspects essential for product dev
·workingtheorys.com·
Taste is Eating Silicon Valley.
Make Something Heavy
Make Something Heavy
The modern makers’ machine does not want you to create heavy things. It runs on the internet—powered by social media, fueled by mass appeal, and addicted to speed. It thrives on spikes, scrolls, and screenshots. It resists weight and avoids friction. It does not care for patience, deliberation, or anything but production. It doesn’t care what you create, only that you keep creating. Make more. Make faster. Make lighter. Make something that can be consumed in a breath and discarded just as quickly. Heavy things take time. And here, time is a tax.
even the most successful Substackers—those who’ve turned newsletters into brands and businesses—eventually want to stop stacking things. They want to make one really, really good thing. One truly heavy thing. A book. A manifesto. A movie. A media company. A monument.
At any given time, you’re either pre–heavy thing or post–heavy thing. You’ve either made something weighty already, or you haven’t. Pre–heavy thing people are still searching, experimenting, iterating. Post–heavy thing people have crossed the threshold. They’ve made something substantial—something that commands respect, inspires others, and becomes a foundation to build on. And it shows. They move with confidence and calm. (But this feeling doesn’t always last forever.)
No one wants to stay in light mode forever. Sooner or later, everyone gravitates toward heavy mode—toward making something with weight. Your life’s work will be heavy. Finding the balance of light and heavy is the game. Note: heavy doesn’t have to mean “big.” Heavy can be small, niche, hard to scale. What I’m talking about is more like density. It’s about what is defining, meaningful, durable.
Telling everyone they’re a creator has only fostered a new strain of imposter syndrome. Being called a creator doesn’t make you one or make you feel like one; creating something with weight does. When you’ve made something heavy—something that stands on its own—you don’t need validation. You just know, because you feel its weight in your hands.
It’s not that most people can’t make heavy things. It’s that they don’t notice they aren’t. Lightness has its virtues—it pulls us in, subtly, innocently, whispering, 'Just do things.' The machine rewards movement, so we keep going, collecting badges. One day, we look up and realize we’ve been running in place.
Why does it feel bad to stop posting after weeks of consistency? Because the force of your work instantly drops to zero. It was all motion, no mass—momentum without weight. 99% dopamine, near-zero serotonin, and no trace of oxytocin. This is the contemporary creator’s dilemma—the contemporary generation’s dilemma.
We spend our lives crafting weighted blankets for ourselves—something heavy enough to anchor our ambition and quiet our minds.
Online, by nature, weight is harder to find, harder to hold on to, and only getting harder in a world where it feels like anyone can make anything.
·workingtheorys.com·
Make Something Heavy
When was the last time you felt consensus?
When was the last time you felt consensus?
Biederman so succinctly put it, at some point between the first Trump administration and the second, “Article World” was defeated by “Post World”. As he sees it, “Article World” is the universe of American corporate journalism and punditry that, well, basically held up liberal democracy in this country since the invention of the radio. And “Post World” is everything the internet has allowed to flourish since the invention of the smartphone — YouTubers, streamers, influencers, conspiracy theorists, random trolls, bloggers, and, of course, podcasters. And now huge publications and news channels are finally noticing that Article World, with all its money and resources and prestige, has been reduced to competing with random posts that both voters and government officials happen to see online.
during the first Trump administration, the president’s various henchmen would do something illegal or insane, a reporter would find out, cable news and newspapers would cover it nonstop, and usually that henchman would resign or, oftentimes, end up in jail
this is why the media is typically called the fourth branch of the government
This also explains why Texas is trying to pass the “Forbidden Unlawful Representation of Roleplaying in Education,” or “FURRIES” Act, based on a years-old anti-trans internet conspiracy theory. It’s why Trump’s team is targeting former President Joe Biden’s autopen-signed pardons after the idea surfaced in a viral X post shared by Libs Of TikTok. And it’s why US Secretary of Defense Pete Hegseth is investigating random social media reports that military bases are still letting personnel list their preferred pronouns on different forms. Posts are all that matters now. And it’s likely no amount of articles can defeat them. Well, I guess we’ll find out.
When was the last time you truly felt consensus? Not in the sense that a trend was happening around you — although, was it? — but a new fact or bit of information that felt universally agreed upon?
·garbageday.email·
When was the last time you felt consensus?
Revenge of the junior developer | Sourcegraph Blog
Revenge of the junior developer | Sourcegraph Blog
with agents, you don’t have to do all the ugly toil of bidirectional copy/paste and associated prompting, which is the slow human-y part. Instead, the agent takes over and handles that for you, only returning to chat with you when it finishes or gets stuck or you run out of cash.
As fast and robust as they may be, you still need to break things down and shepherd coding agents carefully. If you give one a task that’s too big, like "Please fix all my JIRA tickets", it will hurl itself at the problem and get almost nowhere. They require careful supervision and thoughtful problem selection today. In short, they are ornery critters.
it’s not all doom and gloom ahead. Far from it! There will be a bunch of jobs in the software industry. Just not the kind that involve writing code by hand like some sort of barbarian.
But for the most part, junior developers – including (a) newly-minted devs, (b) devs still in school, and (c) devs who are still thinkin’ about school – are all picking this stuff up really fast. They grab the O’Reilly AI Engineering book, which all devs need to know cover to cover now, and they treat it as job training. They’re all using chat coding, they all use coding assistants, and I know a bunch of you junior developers out there are using coding agents already.
I believe the AI-refusers regrettably have a lot invested in the status quo, which they think, with grievous mistakenness, equates to job security. They all tell themselves that the AI has yet to prove that it’s better than they are at performing X, Y, or Z, and therefore, it’s not ready yet.
It’s not AI’s job to prove it’s better than you. It’s your job to get better using AI
·sourcegraph.com·
Revenge of the junior developer | Sourcegraph Blog