Why Storytelling by Tony Fadell
Steve didn’t just read a script for the presentation. He’d been telling a version of that same story every single day for months and months during development—to us, to his friends, his family. He was constantly working on it, refining it. Every time he’d get a puzzled look or a request for clarification from his unwitting early audience, he’d sand it down, tweak it slightly, until it was perfectly polished.
He talked for a while about regular mobile phones and smartphones and the problems of each before he dove into the features of the new iPhone. He used a technique I later came to call the virus of doubt. It’s a way to get into people’s heads, remind them about a daily frustration, get them annoyed about it all over again. If you can infect them with the virus of doubt—“maybe my experience isn’t as good as I thought, maybe it could be better”—then you prime them for your solution. You get them angry about how it works now so they can get excited about a new way of doing things.
when I say “story,” I don’t just mean words. Your product’s story is its design, its features, images and videos, quotes from customers, tips from reviewers, conversations with support agents. It’s the sum of what people see and feel about this thing that you’ve created.
When you get wrapped up in the “what,” you get ahead of people. You think everyone can see what you see. But they don’t. They haven’t been working on it for weeks, months, years. So you need to pause and clearly articulate the “why” before you can convince anyone to care about the “what.”
That’s the case no matter what you make—even if you sell B2B payments software. Even if you build deep-tech solutions for customers who don’t exist yet. Even if you sell lubricants to a factory that’s been buying the same thing for twenty years.
If your competitors are telling better stories than you, if they’re playing the game and you’re not, then it doesn’t matter if their product is worse. They will get the attention. To any customers, investors, partners, or talent doing a cursory search, they will appear to be the leaders in the category. The more people talk about them, the greater their mind share, and the more people will talk about them.
A good story is an act of empathy. It recognizes the needs of its audience. And it blends facts and feelings so the customer gets enough of both. First you need enough instincts and concrete information that your argument doesn’t feel too floaty and insubstantial. It doesn’t have to be definitive data, but there has to be enough to feel meaty, to convince people that you’re anchored in real facts. But you can overdo it—if your story is only informational, then it’s entirely possible that people will agree with you but decide it’s not compelling enough to act on just yet. Maybe next month. Maybe next year.
So you have to appeal to their emotions—connect with something they care about. Their worries, their fears. Or show them a compelling vision of the future: give a human example. Walk through how a real person will experience this product—their day, their family, their work, the change they’ll experience. Just don’t lean so far into the emotional connection that what you’re arguing for feels novel, but not necessary.
And always remember that your customers’ brains don’t always work like yours. Sometimes your rational argument will make an emotional connection. Sometimes your emotional story will give people the rational ammunition to buy your product. Certain Nest customers looked at the beautiful thermostat that we lovingly crafted to appeal to their heart and soul and said, “Sure, okay. It’s pretty” and then had a thrilled, emotional reaction to the potential of saving twenty-three dollars on their energy bill.
everyone will read your story differently. That’s why analogies can be such a useful tool in storytelling. They create a shorthand for complicated concepts—a bridge directly to a common experience.
That’s another thing I learned from Steve Jobs. He’d always say that analogies give customers superpowers. A great analogy allows a customer to instantly grasp a difficult feature and then describe that feature to others. That’s why “1,000 songs in your pocket” was so powerful. Everyone had CDs and tapes in bulky players that only let you listen to 10-15 songs, one album at a time. So “1,000 songs in your pocket” was an incredible contrast—it let people visualize this intangible thing—all the music they loved all together in one place, easy to find, easy to hold—and gave them a way to tell their friends and family why this new iPod thing was so cool.
Because to truly understand many of the features of our products, you’d need a deep well of knowledge about HVAC systems and power grids and the way smoke refracts through a laser to detect fire—knowledge almost nobody had. So we cheated. We didn’t try to explain everything. We just used an analogy. I remember there was one complex feature that was designed to lighten the load on power plants on the hottest or coldest days of the year when everyone cranked up the heat or AC at once. It usually came down to just a few hours in the afternoon, a few days a year—one or more coal power plants would be brought on line to avoid blackouts. So we designed a feature that predicted when these moments would come, then the Nest Thermostat would crank the AC or heat up extra before the crucial peak hours and turn it down when everyone else was turning it up. Anyone who signed up for the program got a credit on their energy bill. As more and more people joined the program, the result was a win-win—people stayed comfortable, they saved money, and the energy companies didn’t have to turn on their dirtiest plants. And that is all well and good, but it just took me 150 words to explain. So after countless hours of thinking about it and trying all the possible solutions, we settled on doing it in three: Rush Hour Rewards.
Everyone understands the concept of rush hour—the moment when way too many people get on the road together and traffic slows to a creep. Same thing happens with energy. We didn’t need to explain much more than that—rush hours are a problem, but when there’s an energy rush hour, you can get something out of it. You can get a reward. You can actually save money rather than getting stuck with everyone else.
Quick stories are easy to remember. And, more importantly, easy to repeat. Someone else telling your story will always reach more people and do more to convince them to buy your product than any amount of talking you do about yourself on your own platforms. You should always be striving to tell a story so good that it stops being yours—so your customer learns it, loves it, internalizes it, owns it. And tells it to everyone they know.
A good product story has three elements: It appeals to people’s rational and emotional sides. It takes complicated concepts and makes them simple. It reminds people of the problem that’s being solved—it focuses on the “why.”
·founderstribune.org·
Why Storytelling by Tony Fadell
Just How Queer Is Luca Guadagnino’s Queer Anyway?
Guadagnino reminded me that as we come of age, we decide for ourselves what informs us, and spoke to the first time he read Burroughs. “You enter into the language of Burroughs and you understand, at 17 years old, that there are ways we can express ourselves that are so wide, sophisticated, complicated, and that you never have to adapt to a logic that is preordained.”
Burroughs in fact traveled there in 1952; The Yage Letters chronicles his experiments in his letters to Ginsberg. He was obsessed with the idea that yage could enhance telepathy. In the hallucinatory new scenes, the connection between Lee and Allerton goes to places the earthbound book could never take it.
When the screenplay is his own, firmly in Guadagnino’s hands, it’s actually fabulous — and a relief after the earlier conflict between the director and his material. At the same time, it makes no sense. That’s the most Burroughsian nod in this film: the sheer randomness and trippy outrageousness of the end. It’s very Naked Lunch — both the book and David Cronenberg’s 1991 film inspired by Burroughs, which was clearly on Guadagnino’s mind.
It’s paying more of a tribute to an adaptation of a different Burroughs book, a film that feels genuinely Burroughsian but has less of a basis in the underlying text than his own. Something is off, the essential is missing, and this may be why I didn’t feel Burroughs’s spirit.
still, I wept through scenes of Guadagnino’s film — including a hallucinatory reference to Joan’s death in which Lee does the same failed William Tell routine with Allerton — but it wasn’t for Joan or Burroughs; it was for James’s lover Michael Emerton, who killed himself with a gun. I wept as this beautifully designed movie, with gorgeous men in well-cut suits, gave me time to think about the karmic connections that both blessed and cursed me. I wept for Billy Jr., whose mother Burroughs had killed. Then I wept for Burroughs, and I wept for Joan.
I wept for the portrayal of transactional sex that was the “romance” the director referred to. I wept as I questioned notions of intent and integrity in transactional relationships: mine with younger, troubled men who lived on the fringes of gay culture; Burroughs’s with James; and James’s with me. Those relationships, for better or worse, follow the karmic path laid down for me 40-plus years ago. That karma, at least for me, as I flew through the past making sense of it, was neutralized by the acceptance of its very existence, its painful impact on me and those affected by it, and, finally, by releasing it. That was Guadagnino’s gift to me.
Most poignantly, I wept for James, who lives alone, unable to walk, with a brain injury that was inflicted during a gay bashing and made worse by falls at home in which he sustained further concussions. But there has been some nice news for him, as a double LP of his work as a singer-songwriter is being released on Lotuspool Records. And he told me he liked Guadagnino’s Queer — though he quibbled with the casting and look of Allerton — and that’s even better news. Guadagnino liked hearing that.
On the Zoom with Guadagnino and Anderson, I wanted to ask about legacy. Are there responsibilities we who make art or work in the arts have to our elders, to the radical spirits who pushed open the doors? I mentioned the affluent gay men, usually heteronormatively married, who “rent a womb” and maybe buy an egg to drop in it so their children have their genes — all of which seems to me to be the furthest thing from queer. In response, some signifiers were mentioned. Anderson speaks to the look of the film, citing George Platt Lynes’s influence; they both chimed in about Powell and Pressburger (the Archers), of The Red Shoes; I mentioned Rainer Werner Fassbinder’s adaptation of Jean Genet’s Querelle, which Guadagnino said, indeed, influenced him. The point has been missed, and the clock is ticking. I move on, disappointed.
Will this film ignite a radical spark in younger viewers — be they queer or not? That’s what Burroughs did for me and for many, many of his readers.
The craftsmanship of the film is sterling on many levels. But it is not the book I know by the writer I knew so well. It is stylish in the modality of fashion — having a “look”; it is beautiful in its entirety as a complete visual construction. It is, essentially, a gay location film. It is romantic, something of a travelogue — you might want to go where it is set, eat at the restaurants, while wearing the clothing, certainly in the company of some of the flawless boys cast. But it is not the world that the book conjures for most readers, certainly not me. This is the work of the director — as any film should be.
Still, a bad match of director and material renders confusion at best, emptiness at worst; I worried that this film could potentially misconstrue the importance of Burroughs’s role as a visionary queer writer for future generations. I was incapable of explaining this to Guadagnino and Anderson, in our 20-minute Zoom, not to mention it might have stopped the interview. But I tried.
It wasn’t just the peculiar casting of a beefy daddy like Daniel Craig as the Burroughs character, William Lee, or pretty Drew Starkey as the aloof, younger love interest, Eugene Allerton, who spends the film looking great in fabulous knitwear by Jonathan Anderson, Guadagnino’s friend and the film’s costume designer, but nothing like the image of the character I had in my head.
·vulture.com·
Just How Queer Is Luca Guadagnino’s Queer Anyway?
The Fury
Tracking Esther down at an after-hours club and marvelling at her artistry, he resolves to propel her into pictures. The number she performs at the club, “The Man That Got Away,” is one of the most astonishing, emotionally draining musical productions in Hollywood history, both for Garland’s electric, spontaneous performance and for Cukor’s realization of it. The song itself, by Harold Arlen and Ira Gershwin, is the apotheosis of the torch song, and Garland kicks its drama up to frenzied intensity early on, as much with the searing pathos of her voice as with convulsive, angular gestures that look like an Expressionist painting come to life. (Her fury prefigures the psychodramatic forces unleashed by Gena Rowlands in the films of her husband, John Cassavetes.) Cukor, who had first worked wonders with Garland in the early days of “The Wizard of Oz” (among other things, he removed her makeup, a gesture repeated here by Maine), captures her performance in a single, exquisitely choreographed shot, with the camera dollying back to reveal the band, in shadow, with spotlights gleaming off the bells of brass instruments and the chrome keys of woodwinds.
·newyorker.com·
The Fury
David Shreve: The irony of American political economics
Summary: Shreve analyzes the paradox between economic performance under Democratic versus Republican administrations and public perception of economic competence. He presents substantial statistical evidence showing Democratic administrations consistently outperforming Republican ones across multiple economic metrics, while explaining how Republicans have successfully maintained a reputation for superior economic stewardship through specific messaging strategies and tax policies.
Since 1949, job growth under Democratic presidencies has been more than twice as large as that during Republican administrations (2.47% to 1.07%). Excluding public sector jobs, the advantage is even greater (2.55% to 0.97%). Other key averages reveal a similar distinction during this period: Real business investment growth advanced 6.58% under Democratic presidents and 2.98% under their Republican counterparts; real personal income — excluding government transfers — increased 2.66% and real economic growth per capita (net domestic product) advanced 2.6% under Democratic chief executives, but only by 1.41% and 1.28%, respectively, under Republican leaders. Inflation has also been much more modest under Democratic presidents (2.91% compared to 3.28% under their Republican counterparts), with an even more decided advantage when volatile energy and food markets are excluded (2.87% compared to 3.59%).
Of the 11 U.S. recessions we’ve endured over the past 75 years, 10 began in Republican presidential administrations; only Jimmy Carter — embracing Republican-style fiscal, monetary and regulatory policy much more completely than any other recent Democratic president — presided over a “Democratic” recession. The two “double-dip” recessions of 1980 and 1981-82, straddling the late Carter and early Reagan administrations, are almost indistinguishable in their policy origins.
We are reminded consistently by pundits, journalists and scholars that tax cuts represent what may be our most readily available and useful tool for economic stimulus. Flat, or flatter, taxes, we are told, are the only means to the achievement of tax simplicity and tax compliance.
Even on the question of who tends to favor lower or higher taxes, it is easy to be deceived. When income taxes are reduced (at the federal and state level) and the entire tax code is rendered less progressive as a result, two things happen almost automatically: other, much more regressive taxes rise to fill the vacuum created by universally demanded (if not readily acknowledged) public services, and consumer demand falters as higher taxes begin to fall on those compelled to spend all that they earn. Overall economic activity and prospective revenue growth, in turn, begin to stagnate, triggering a vicious cycle of tax rate increases (among the untouched regressive tax vehicles), just to maintain public services and economic activity.
Republican politicians have stumbled upon a remarkably effective political strategy: preach tax cuts as the be-all and end-all of successful economic policy; ignore the ways in which federal income tax cuts often lead to increased tolls, fees and property, sales, and excise tax increases; relinquish all but rhetorical opposition to the federal deficits created by federal tax cuts; and cap it off by hinting repeatedly that more could be done — allegedly to great effect — by reducing government spending directed at “undeserving” and “unambitious” poor people of color.
Republican political leaders have their cake and eat it too, riding a diffuse anti-tax sentiment to political victory. Actual results in this game don’t often matter, at least as long as their Democratic opponents succeed in staving off the most precipitous decline with safety nets and the preservation of some progressive fiscal policy elements.
Begun quietly with what Republican activist and Wall Street Journal editor Jude Wanniski called the “Two Santa Claus Theory” — under which Republicans could counter the Democratic social spending Santa Claus with their own tax-cutting Kris Kringle — this approach promised political “success” even amid policy failure, for opponents could be pinned with the deficits and damage it produced.
Exploiting normal psychological tendencies to imagine that “more money in my pocket” and “less money in theirs” simply must be good policy, the widespread ignorance of actual public spending and significant intergovernmental fiscal policies (where federal change forces state and local change, or vice versa), and the compelling notion that personal economic opportunity or success must be derived from personal talent and initiative (rather than significant public policy reform), the “Two Santa Claus” strategy has buoyed a Republican Party that has consistently delivered sub-par results.
·dailyprogress.com·
David Shreve: The irony of American political economics
Bernie Would Have Won

AI summary: This article argues that Trump's 2024 victory represents the triumph of right-wing populism over neoliberalism, enabled by Democratic Party leadership's deliberate suppression of Bernie Sanders' left-wing populist movement. The piece contends that by rejecting class-focused politics in favor of identity politics and neoliberal policies, Democrats created a vacuum that Trump's authoritarian populism filled.

Here’s a warning and an admonition written in January 2019 by author and organizer Jonathan Smucker: “If the Dem Party establishment succeeds in beating down the fresh leadership and bold vision that's stepping up, it will effectively enable the continued rise of authoritarianism. But they will not wake up and suddenly grasp this. It's on us to outmaneuver them and win.”
There are a million surface-level reasons for Kamala Harris’s loss and systematic underperformance in pretty much every county and among nearly every demographic group. She is part of a deeply unpopular administration. Voters believe the economy is bad and that the country is on the wrong track. She is a woman and we still have some work to do as a nation to overcome long-held biases.  But the real problems for the Democrats go much deeper and require a dramatic course correction of a sort that, I suspect, Democrats are unlikely to embark upon. The bottom line is this: Democrats are still trying to run a neoliberal campaign in a post-neoliberal era. In other words, 2016 Bernie was right.
The lie that fueled the Iraq war destroyed confidence in the institutions that were the bedrock of this neoliberal order and in the idea that the U.S. could or should remake the world in our image. Even more devastating, the financial crisis left home owners destitute while banks were bailed out, revealing that there was something deeply unjust in a system that placed capital over people.
These events sparked social movements on both the right and the left. The Tea Party churned out populist-sounding politicians like Sarah Palin and birtherist conspiracies about Barack Obama, paving the way for the rise of Donald Trump. The Tea Party and Trumpism are not identical, of course, but they share a cast of villains: The corrupt bureaucrats or deep state. The immigrants supposedly changing your community. The cultural elites telling you your beliefs are toxic. Trump’s version of this program is also explicitly authoritarian. This authoritarianism is a feature not a bug for some portion of the Trump coalition which has been persuaded that democracy left to its own devices could pose an existential threat to their way of life.
On the left, the organic response to the financial crisis was Occupy Wall Street, which directly fueled the Bernie Sanders movement. Here, too, the villains were clear. In the language of Occupy it was the 1%, or, as Bernie put it, the millionaires and billionaires. It was the economic elite and unfettered capitalism that had made it so hard to get by. Turning homes into assets of financial speculation. Wildly profiteering off of every element of our healthcare system. Busting unions so that working people had no collective power. This movement, in contrast to the right, was explicitly pro-democracy, with a foundational view that in a contest between the 99% and the 1%, the 99% would prevail. And that a win would lead to universal programs like Medicare for All, free college, workplace democracy, and a significant hike in the minimum wage.
On the Republican side, Donald Trump emerged as a political juggernaut at a time when the party was devastated and rudderless, having lost to Obama twice in a row. This weakened state—and the fact that the Trump alternatives were uncharismatic drips like Jeb Bush—created a path for Trump to successfully execute a hostile takeover of the party.
Plus, right-wing populism embraces capital, and so it posed no real threat to the monied interests that are so influential within the party structures.
The Republican donor class was not thrilled with Trump’s chaos and lack of decorum but they did not view him as an existential threat to their class interests
The difference was that Bernie’s party takeover did pose an existential threat—both to party elites who he openly antagonized and to the party’s big money backers. The bottom line of the Wall Street financiers and corporate titans was explicitly threatened. His rise would simply not be allowed. Not in 2016 and not in 2020.
What’s more, Hillary Clinton and her allies launched a propaganda campaign to posture as if they were actually to the left of Bernie by labeling him and his supporters sexist and racist for centering class politics over identity politics. This in turn spawned a hell cycle of woke word-policing and demographic slicing and dicing and antagonism towards working class whites that only made the Democratic party more repugnant to basically everyone.
The path not taken in 2016 looms larger than ever. Bernie’s coalition was filled with the exact type of voters who are now flocking to Donald Trump: Working class voters of all races, young people, and, critically, the much-derided bros. The top contributors to Bernie’s campaign often held jobs at places like Amazon and Walmart. The unions loved him. And—never forget—he earned the coveted Joe Rogan endorsement that Trump also received the day before the election this year. It turns out, the Bernie-to-Trump pipeline is real! While that has always been used as an epithet to smear Bernie and his movement, with the implication that social democracy is just a cover for or gateway drug to right wing authoritarianism, the truth is that this pipeline speaks to the power and appeal of Bernie’s vision as an effective antidote to Trumpism. When these voters had a choice between Trump and Bernie, they chose Bernie. For many of them now that the choice is between Trump and the dried out husk of neoliberalism, they’re going Trump.
Maybe I will be just as wrong as I was about the election but it is my sense that with this Trump victory, authoritarian right politics have won the ideological battle for what will replace the neoliberal order in America. And yes, I think it will be ugly, mean, and harmful—because it already is.
·dropsitenews.com·
Bernie Would Have Won
The Only Reason to Explore Space

Claude summary: This article argues that the only enduring justification for space exploration is its potential to fundamentally transform human civilization and our understanding of ourselves. The author traces the history of space exploration, from the mystical beliefs of early rocket pioneers to the geopolitical motivations of the Space Race, highlighting how current economic, scientific, and military rationales fall short of sustaining long-term commitment. The author contends that achieving interstellar civilization will require unprecedented organizational efforts and societal commitment, likely necessitating institutions akin to governments or religions. Ultimately, the piece suggests that only a society that embraces the pursuit of interstellar civilization as its central legitimating project may succeed in this monumental endeavor, framing space exploration not as an inevitable outcome of progress, but as a deliberate choice to follow a "golden path to a destiny among the stars."

·palladiummag.com·
The Only Reason to Explore Space
Shop Class as Soulcraft

Summary: Skilled manual labor entails a systematic encounter with the material world that can enrich one's intellectual and spiritual life. The degradation of work in both blue-collar and white-collar professions is driven not just by technological progress, but by the separation of thinking from doing according to the dictates of capital. To realize the full potential of human flourishing, we must reckon with the appeal of skilled manual work and question the assumptions that shape our educational priorities and notions of a good life.

an engineering culture has developed in recent years in which the object is to “hide the works,” rendering the artifacts we use unintelligible to direct inspection. Lift the hood on some cars now (especially German ones), and the engine appears a bit like the shimmering, featureless obelisk that so enthralled the cavemen in the opening scene of the movie 2001: A Space Odyssey. Essentially, there is another hood under the hood.
What ordinary people once made, they buy; and what they once fixed for themselves, they replace entirely or hire an expert to repair, whose expert fix often involves installing a pre-made replacement part.
So perhaps the time is ripe for reconsideration of an ideal that has fallen out of favor: manual competence, and the stance it entails toward the built, material world. Neither as workers nor as consumers are we much called upon to exercise such competence, most of us anyway, and merely to recommend its cultivation is to risk the scorn of those who take themselves to be the most hard-headed: the hard-headed economist will point out the opportunity costs of making what can be bought, and the hard-headed educator will say that it is irresponsible to educate the young for the trades, which are somehow identified as the jobs of the past.
It was an experience of agency and competence. The effects of my work were visible for all to see, so my competence was real for others as well; it had a social currency. The well-founded pride of the tradesman is far from the gratuitous “self-esteem” that educators would impart to students, as though by magic.
Skilled manual labor entails a systematic encounter with the material world, precisely the kind of encounter that gives rise to natural science. From its earliest practice, craft knowledge has entailed knowledge of the “ways” of one’s materials — that is, knowledge of their nature, acquired through disciplined perception and a systematic approach to problems.
Because craftsmanship refers to objective standards that do not issue from the self and its desires, it poses a challenge to the ethic of consumerism, as the sociologist Richard Sennett has recently argued. The craftsman is proud of what he has made, and cherishes it, while the consumer discards things that are perfectly serviceable in his restless pursuit of the new.
The central culprit in Braverman’s account is “scientific management,” which “enters the workplace not as the representative of science, but as the representative of management masquerading in the trappings of science.” The tenets of scientific management were given their first and frankest articulation by Frederick Winslow Taylor
Scattered craft knowledge is concentrated in the hands of the employer, then doled out again to workers in the form of minute instructions needed to perform some part of what is now a work process. This process replaces what was previously an integral activity, rooted in craft tradition and experience, animated by the worker’s own mental image of, and intention toward, the finished product. Thus, according to Taylor, “All possible brain work should be removed from the shop and centered in the planning or lay-out department.” It is a mistake to suppose that the primary purpose of this partition is to render the work process more efficient. It may or may not result in extracting more value from a given unit of labor time. The concern is rather with labor cost. Once the cognitive aspects of the job are located in a separate management class, or better yet in a process that, once designed, requires no ongoing judgment or deliberation, skilled workers can be replaced with unskilled workers at a lower rate of pay.
the “jobs of the future” rhetoric surrounding the eagerness to end shop class and get every warm body into college, thence into a cubicle, implicitly assumes that we are heading to a “post-industrial” economy in which everyone will deal only in abstractions. Yet trafficking in abstractions is not the same as thinking. White collar professions, too, are subject to routinization and degradation, proceeding by the same process as befell manual fabrication a hundred years ago: the cognitive elements of the job are appropriated from professionals, instantiated in a system or process, and then handed back to a new class of workers — clerks — who replace the professionals. If genuine knowledge work is not growing but actually shrinking, because it is coming to be concentrated in an ever-smaller elite, this has implications for the vocational advice that students ought to receive.
The trades are then a natural home for anyone who would live by his own powers, free not only of deadening abstraction, but also of the insidious hopes and rising insecurities that seem to be endemic in our current economic life. This is the stoic ideal.
·thenewatlantis.com·
Shop Class as Soulcraft
Berger’s Books
The cover immediately sets Ways of Seeing apart from its contemporaries: the book itself begins on the cover. Rather than creating a conventionally appealing cover, Hollis chose to bypass this tradition entirely, instead placing the text and an image from the start of the first chapter straight onto the front, just beneath the title and author’s name. This directness has a link with the television series, mimicking how the first episode began with no preamble or title sequence: Berger got started immediately, drawing the audience in with his message rather than any distractions.
Another link to Berger’s presenting style is Hollis’ choice of typeface: bold Univers 65 is used for the body copy throughout, in an attempt to achieve something of the captivating quality of Berger’s voice.
The layout also employs large indents rather than paragraph breaks, something of a Hollis trademark. But this mirrors how Berger had presented on television: there was little time wasted with atmospheric filler shots or long gaps in speech; the message was key and continuous.
The key reason that Ways of Seeing has become iconic as a piece of book design is how it dealt with text and image: the two are integrated, so where an image is mentioned in the text it also appears there. Captions are avoided where possible. When unavoidable they are in a lighter weight of type and run horizontally, so as not to disrupt the text. Images are often set at the same width as the lines of text, or indented by the same amount, which democratises the text and image relationship. Occasionally works of art are cropped to show only the pertinent details. All of these features are a big departure from the art books of the time, which usually featured glorified full page colour images, often in a glossy ‘colour plate’ section in the middle, completely distanced from where the text refers to them.
Design is not used for prettifying, or to create appeal, rather it is used for elucidating, to spread his message or get his point across as clearly as possible. Be it a point about art and politics, art and gender, the ethics of advertising, the human experiences of a rural GP, or economic migrants in Germany — the design is always appropriate to what Berger wants to say, but does so economically without redundancy.
Even in Portraits: John Berger on Artists published by Verso in 2015, Berger insisted on black and white reproductions, arguing that: “glossy colour reproductions in the consumerist world of today tend to reduce what they show to items in a luxury brochure for millionaires. Whereas black-and-white reproductions are simple memoranda.”
the images in the book “illustrate the essentially dialectical relationship between text and image in Berger’s work: the pattern in which an image shapes a text, which then goes on to shape how we understand that image.”
·theo-inglis.medium.com·
Berger’s Books
My Last Five Years of Work
Copywriting, tax preparation, customer service, and many other tasks are or will soon be heavily automated. I can see the beginnings in areas like software development and contract law. Generally, tasks that involve reading, analyzing, and synthesizing information, and then generating content based on it, seem ripe for replacement by language models.
Anyone who makes a living through delicate and varied movements guided by situation-specific know-how can expect to work for much longer than five more years. Thus, electricians, gardeners, plumbers, jewelry makers, hair stylists, as well as those who repair ironwork or make stained glass might find their handiwork contributing to our society for many more years to come.
Finally, I expect there to be jobs where humans are preferred to AIs even if the AIs can do the job equally well, or perhaps even if they can do it better. This will apply to jobs where something is gained from the very fact that a human is doing it—likely because it involves the consumer feeling like they have a relationship with the human worker as a human. Jobs that might fall into this category include counselors, doulas, caretakers for the elderly, babysitters, preschool teachers, priests and religious leaders, even sex workers—much has been made of AI girlfriends, but I still expect that a large percentage of buyers of in-person sexual services will have a strong preference for humans. Some have called these jobs “nostalgic jobs.”
It does seem that, overall, unemployment makes people sadder, sicker, and more anxious. But it isn’t clear if this is an inherent fact of unemployment, or a contingent one. It is difficult to isolate the pure psychological effects of being unemployed, because at present these are confounded with the financial effects—if you lose your job, you have less money—which produce stress that would not exist in the context of, say, universal basic income. It is also confounded with the “shame” aspect of being fired or laid off—of not working when you really feel you should be working—as opposed to the context where essentially all workers have been displaced.
One study that gets around the “shame” confounder of unemployment is “A Forced Vacation? The Stress of Being Temporarily Laid Off During a Pandemic” by Scott Schieman, Quan Mai, and Ryu Won Kang. This study looked at Canadian workers who were temporarily laid off several months into the COVID-19 pandemic. They first assumed that such a disruption would increase psychological distress, but instead found that the self-reported wellbeing was more in line with the “forced vacation hypothesis,” suggesting that temporarily laid-off workers might initially experience lower distress due to the unique circumstances of the pandemic.
By May 2020, the distress gap observed in April had vanished, indicating that being temporarily laid off was not associated with higher distress during these months. The interviews revealed that many workers viewed being left without work as a “forced vacation,” appreciating the break from work-related stress and valuing the time for self-care and family. The widespread nature of layoffs normalized the experience, reducing personal blame and fostering a sense of shared experience. Financial strain was mitigated by government support, personal savings, and reduced spending, which buffered against potential distress.
The study suggests that the context and available support systems can significantly alter the psychological outcomes of unemployment—which seems promising for AGI-induced unemployment.
From the studies on plant closures and pandemic layoffs, it seems that shame plays a role in making people unhappy after unemployment, which implies that they might be happier in full automation-induced unemployment, since it would be near-universal and not signify any personal failing.
A final piece that reveals a societal-psychological aspect to how much work is deemed necessary is that the amount has changed over time! The number of hours that people have worked has declined over the past 150 years. Work hours tend to decline as a country gets richer. It seems odd to assume that the current accepted amount of work of roughly 40 hours a week is the optimal amount. The 8-hour work day, weekends, time off—hard-fought and won by the labor movement!—seem to have been triumphs for human health and well-being. Why should we assume that stopping here is right? Why should we assume that less work was better in the past, but less work now would be worse?
Removing the shame that accompanies unemployment by removing the sense that one ought to be working seems one way to make people happier during unemployment. Another is what they do with their free time. Regardless of how one enters unemployment, one still confronts empty and often unstructured time.
One paper, titled “Having Too Little or Too Much Time Is Linked to Lower Subjective Well-Being” by Marissa A. Sharif, Cassie Mogilner, and Hal E. Hershfield tried to explore whether it was possible to have “too much” leisure time.
The paper concluded that it is possible to have too little discretionary time, but also possible to have too much, and that moderate amounts of discretionary time seemed best for subjective well-being. More time could be better, or at least not meaningfully worse, provided it was spent on “social” or “productive” leisure activities. This suggests that how people fare psychologically with their post-AGI unemployment will depend heavily on how they use their time, not how much of it there is.
Automation-induced unemployment could feel like retiring depending on how total it is. If essentially no one is working, and no one feels like they should be working, it might be more akin to retirement, in that it would lack the shameful element of feeling set apart from one’s peers.
Women provide another view on whether formal work is good for happiness. Women are, for the most part, relatively recent entrants to the formal labor market. In the U.S., 18% of women were in the formal labor force in 1890. In 2016, 57% were. Has labor force participation made them happier? By some accounts: no. A paper that looked at subjective well-being for U.S. women from the General Social Survey between the 1970s and 2000s—a time when labor force participation was climbing—found both relative and absolute declines in female happiness.
I think women’s work and AI is a relatively optimistic story. Women have been able to automate unpleasant tasks via technological advances, while the more meaningful aspects of their work seem less likely to be automated away.  When not participating in the formal labor market, women overwhelmingly fill their time with childcare and housework. The time needed to do housework has declined over time due to tools like washing machines, dryers, and dishwashers. These tools might serve as early analogous examples of the future effects of AI: reducing unwanted and burdensome work to free up time for other tasks deemed more necessary or enjoyable.
it seems less likely that AIs will so thoroughly automate childcare and child-rearing because this “work” is so much more about the relationship between the parties involved. Like therapy, childcare and teaching seem likely to be among the forms of work where a preference for a human worker will persist the longest.
In the early modern era, landed gentry and similar were essentially unemployed. Perhaps they did some minor administration of their tenants, some dabbled in politics or were dragged into military projects, but compared to most formal workers they seem to have worked relatively few hours. They filled the remainder of their time with intricate social rituals like balls and parties, hobbies like hunting, studying literature, and philosophy, producing and consuming art, writing letters, and spending time with friends and family. We don’t have much real well-being survey data from this group, but, hedonically, they seem to have been fine. Perhaps they suffered from some ennui, but if we were informed that the great mass of humanity was going to enter their position, I don’t think people would be particularly worried.
I sometimes wonder if there is some implicit classism in people’s worries about unemployment: the rich will know how to use their time well, but the poor will need to be kept busy.
Although a trained therapist might be able to counsel my friends or family through their troubles better, I still do it, because there is value in me being the one to do so. We can think of this as the relational reason for doing something others can do better. I write because sometimes I enjoy it, and sometimes I think it betters me. I know others do so better, but I don’t care—at least not all the time. The reasons for this are part hedonic and part virtue or morality. A renowned AI researcher once told me that he is practicing for post-AGI by taking up activities that he is not particularly good at: jiu-jitsu, surfing, and so on, and savoring the doing even without excellence. This is how we can prepare for our future where we will have to do things from joy rather than need, where we will no longer be the best at them, but will still have to choose how to fill our days.
·palladiummag.com·
My Last Five Years of Work
The Californian Ideology
Summary: The Californian Ideology is a mix of cybernetics, free market economics, and counter-culture libertarianism that originated in California and has become a global orthodoxy. It asserts that technological progress will inevitably lead to a future of Jeffersonian democracy and unrestrained free markets. However, this ideology ignores the critical role of government intervention in technological development and the social inequalities perpetuated by free market capitalism.
·metamute.org·
The Californian Ideology
The Life and Death of Hollywood, by Daniel Bessner
now the streaming gold rush—the era that made Dickinson—is over. In the spring of 2022, the Federal Reserve began raising interest rates after years of nearly free credit, and at roughly the same time, Wall Street began calling in the streamers’ bets. The stock prices of nearly all the major companies with streaming platforms took precipitous falls, and none have rebounded to their prior valuation.
Thanks to decades of deregulation and a gush of speculative cash that first hit the industry in the late Aughts, while prestige TV was climbing the rungs of the culture, massive entertainment and media corporations had been swallowing what few smaller companies remained, and financial firms had been infiltrating the business, moving to reduce risk and maximize efficiency at all costs, exhausting writers in evermore unstable conditions.
The new effective bosses of the industry—colossal conglomerates, asset-management companies, and private-equity firms—had not been simply pushing workers too hard and grabbing more than their fair share of the profits. They had been stripping value from the production system like copper pipes from a house—threatening the sustainability of the studios themselves. Today’s business side does not have a necessary vested interest in “the business”—in the health of what we think of as Hollywood, a place and system in which creativity is exchanged for capital. The union wins did not begin to address this fundamental problem.
To the new bosses, the quantity of money that studios had been spending on developing screenplays—many of which would never be made—was obvious fat to be cut, and in the late Aughts, executives increasingly began offering one-step deals, guaranteeing only one round of pay for one round of work. Writers, hoping to make it past Go, began doing much more labor—multiple steps of development—for what was ostensibly one step of the process. In separate interviews, Dana Stevens, writer of The Woman King, and Robin Swicord described the change using exactly the same words: “Free work was encoded.” So was safe material. In an effort to anticipate what a studio would green-light, writers incorporated feedback from producers and junior executives, constructing what became known as producer’s drafts. As Rodman explained it: “Your producer says to you, ‘I love your script. It’s a great first draft. But I know what the studio wants. This isn’t it. So I need you to just make this protagonist more likable, and blah, blah, blah.’ And you do it.”
By 2019, the major Hollywood agencies had been consolidated into an oligopoly of four companies that controlled more than 75 percent of WGA writers’ earnings. And in the 2010s, high finance reached the agencies: by 2014, private equity had acquired Creative Artists Agency and William Morris Endeavor, and the latter had purchased IMG. Meeting benchmarks legible to the new bosses—deals actually made, projects off the ground—pushed agents to function more like producers, and writers began hearing that their asking prices were too high.
Executives, meanwhile, increasingly believed that they’d found their best bet in “IP”: preexisting intellectual property—familiar stories, characters, and products—that could be milled for scripts. As an associate producer of a successful Aughts IP-driven franchise told me, IP is “sort of a hedge.” There’s some knowledge of the consumer’s interest, he said. “There’s a sort of dry run for the story.” Screenwriter Zack Stentz, who co-wrote the 2011 movies Thor and X-Men: First Class, told me, “It’s a way to take risk out of the equation as much as possible.”
Multiple writers I spoke with said that selecting preexisting characters and cinematic worlds gave executives a type of psychic edge, allowing them to claim a degree of creative credit. And as IP took over, the perceived authority of writers diminished. Julie Bush, a writer-producer for the Apple TV+ limited series Manhunt, told me, “Executives get to feel like the author of the work, even though they have a screenwriter, like me, basically create a story out of whole cloth.” At the same time, the biggest IP success story, the Marvel Cinematic Universe, by far the highest-earning franchise of all time, pioneered a production apparatus in which writers were often separated from the conception and creation of a movie’s overall story.
Joanna Robinson, co-author of the book MCU: The Reign of Marvel Studios, told me that the writers for WandaVision, a Marvel show for Disney+, had to craft almost the entirety of the series’ single season without knowing where their work was ultimately supposed to arrive: the ending remained undetermined, because executives had not yet decided what other stories they might spin off from the show.
The streaming ecosystem was built on a wager: high subscriber numbers would translate to large market shares, and eventually, profit. Under this strategy, an enormous amount of money could be spent on shows that might or might not work: more shows meant more opportunities to catch new subscribers. Producers and writers for streamers were able to put ratings aside, which at first seemed to be a luxury. Netflix paid writers large fees up front, and guaranteed that an entire season of a show would be produced. By the mid-2010s, the sheer quantity of series across the new platforms—what’s known as “Peak TV”—opened opportunities for unusually offbeat projects (see BoJack Horseman, a cartoon for adults about an equine has-been sitcom star), and substantially more shows created by women and writers of color. In 2009, across cable, broadcast, and streaming, 189 original scripted shows aired or released new episodes; in 2016, that number was 496. In 2022, it was 849.
supply soon overshot demand. For those who beat out the competition, the work became much less steady than it had been in the pre-streaming era. According to insiders, in the past, writers for a series had usually been employed for around eight months, crafting long seasons and staying on board through a show’s production. Junior writers often went to the sets where their shows were made and learned how to take a story from the page to the screen—how to talk to actors, how to stay within budget, how to take a studio’s notes—setting them up to become showrunners. Now, in an innovation called mini-rooms, reportedly first ventured by cable channels such as AMC and Starz, fewer writers were employed for each series and for much shorter periods—usually eight to ten weeks but as little as four.
Writers in the new mini-room system were often dismissed before their series went to production, which meant that they rarely got the opportunity to go to set and weren’t getting the skills they needed to advance. Showrunners were left responsible for all writing-related tasks when these rooms shut down. “It broke a lot of showrunners,” the A-list film and TV writer told me. “Physically, mentally, financially. It also ruined a lot of shows.”
The price of entry for working in Hollywood had been high for a long time: unpaid internships, low-paid assistant jobs. But now the path beyond the entry level was increasingly unclear. Jason Grote, who was a staff writer on Mad Men and who came to TV from playwriting, told me, “It became like a hobby for people, or something more like theater—you had your other day jobs or you had a trust fund.” Brenden Gallagher, a TV writer a decade in, said, “There are periods of time where I work at the Apple Store. I’ve worked doing data entry, I’ve worked doing research, I’ve worked doing copywriting.” Since he’d started in the business in 2014, in his mid-twenties, he’d never had more than eight months at a time when he didn’t need a source of income from outside the industry.
“There was this feeling,” the head of the midsize studio told me that day at Soho House, “during the last ten years or so, of, ‘Oh, we need to get more people of color in writers’ rooms.’ ” But what you get now, he said, is the black or Latino person who went to Harvard. “They’re getting the shot, but you don’t actually see a widening of the aperture to include people who grew up poor, maybe went to a state school or not even, and are just really talented. That has not happened at all.”
“The Sopranos does not exist without David Chase having worked in television for almost thirty years,” Blake Masters, a writer-producer and creator of the Showtime series Brotherhood, told me. “Because The Sopranos really could not be written by somebody unless they understood everything about television, and hated all of it.” Grote said much the same thing: “Prestige TV wasn’t new blood coming into Hollywood as much as it was a lot of veterans that were never able to tell these types of stories, who were suddenly able to cut through.”
The threshold for receiving the viewership-based streaming residuals is also incredibly high: a show must be viewed by at least 20 percent of a platform’s domestic subscribers “in the first 90 days of release, or in the first 90 days in any subsequent exhibition year.” As Bloomberg reported in November, fewer than 5 percent of the original shows that streamed on Netflix in 2022 would have met this benchmark. “I am not impressed,” the A-list writer told me in January. Entry-level TV staffing, where more and more writers are getting stuck, “is still a subsistence-level job,” he said. “It’s a job for rich kids.”
Brenden Gallagher, who echoed Conover’s belief that the union was well-positioned to gain more in 2026, put it this way: “My view is that there was a lot of wishful thinking about achieving this new middle class, based around, to paraphrase 30 Rock, making it 1997 again through science or magic. Will there be as big a working television-writer cohort that is making six figures a year consistently living in Los Angeles as there was from 1992 to 2021? No. That’s never going to come back.”
As for what types of TV and movies can get made by those who stick around, Kelvin Yu, creator and showrunner of the Disney+ series American Born Chinese, told me: “I think that there will be an industry move to the middle in terms of safer, four-quadrant TV.” (In L.A., a “four-quadrant” project is one that aims to appeal to all demographics.) “I think a lot of people,” he said, “who were disenfranchised or marginalized—their drink tickets are up.” Indeed, multiple writers and executives told me that following the strike, studio choices have skewed even more conservative than before. “It seems like buyers are much less adventurous,” one writer said. “Buyers are looking for Friends.”
The film and TV industry is now controlled by only four major companies, and it is shot through with incentives to devalue the actual production of film and television.
The entertainment and finance industries spend enormous sums lobbying both parties to maintain deregulation and prioritize the private sector. Writers will have to fight the studios again, but for more sweeping reforms. One change in particular has the potential to flip the power structure of the industry on its head: writers could demand to own complete copyright for the stories they create. They currently have something called “separated rights,” which allow a writer to use a script and its characters for limited purposes. But if they were to retain complete copyright, they would have vastly more leverage. Nearly every writer I spoke with seemed to believe that this would present a conflict with the way the union functions. This point is complicated and debatable, but Shawna Kidman and the legal expert Catherine Fisk—both preeminent scholars of copyright and media—told me that the greater challenge is Hollywood’s structure. The business is currently built around studio ownership. While Kidman found the idea of writer ownership infeasible, Fisk said it was possible, though it would be extremely difficult. Pushing for copyright would essentially mean going to war with the studios. But if things continue on their current path, writers may have to weigh such hazards against the prospect of the end of their profession. Or, they could leave it all behind.
·harpers.org·
The Life and Death of Hollywood, by Daniel Bessner
The most hated workplace software on the planet
The most hated workplace software on the planet
LinkedIn, Reddit, and Blind abound with enraged job applicants and employees sharing tales of how difficult it is to book paid leave, how Kafkaesque it is to file an expense, how nerve-racking it is to close out a project. "I simply hate Workday. Fuck them and those who insist on using it for recruitment," one Reddit user wrote. "Everything is non-intuitive, so even the simplest tasks leave me scratching my head," wrote another. "Keeping notes on index cards would be more effective." Every HR professional and hiring manager I spoke with — whose lives are supposedly made easier by Workday — described Workday with a sense of cosmic exasperation.
If candidates hate Workday, if employees hate Workday, if HR people and managers processing and assessing those candidates and employees through Workday hate Workday — if Workday is the most annoying part of so many workers' workdays — how is Workday everywhere? How did a software provider so widely loathed become a mainstay of the modern workplace?
There is a saying in systems thinking: the purpose of a system is what it does (POSIWID), not what it fails to do. And the reality is that what Workday — and its many despised competitors — does for organizations is far more important than the anguish it causes everyone else.
In 1988, PeopleSoft, backed by IBM, built the first fully fledged Human Resources Information System. In 2004, Oracle acquired PeopleSoft for $10.3 billion. One of its founders, David Duffield, then started a new company that upgraded PeopleSoft's model to near limitless cloud-based storage — giving birth to Workday, the intractable nepo baby of HR software.
Workday is indifferent to our suffering in a job hunt, because we aren't Workday's clients, companies are. And these companies — from AT&T to Bank of America to Teladoc — have little incentive to care about your application experience, because if you didn't get the job, you're not their responsibility. For a company hiring and onboarding on a global scale, it is simply easier to screen fewer candidates if the result is still a single hire.
A search on a job board can return hundreds of listings for in-house Workday consultants: IT and engineering professionals hired to fix the software promising to fix processes.
For recruiters, Workday also lacks basic user-interface flexibility. When you promise ease-of-use and simplicity, you must deliver on the most basic user interactions. And yet: Sometimes searching for a candidate, or locating a candidate's status feels impossible. This happens outside of recruiting, too, where locating or attaching a boss's email to approve an expense sheet is complicated by the process, not streamlined. Bureaucratic hell is always about one person's ease coming at the cost of someone else's frustration, time wasted, and busy work. Workday makes no exceptions.
Workday touts its ability to track employee performance by collecting data and marking results, but it is employees who must spend time inputting this data. A creative director at a Fortune 500 company told me how in less than two years his company went "from annual reviews to twice-annual reviews to quarterly reviews to quarterly reviews plus separate twice-annual reviews." At each interval higher-ups pressed HR for more data, because they wanted what they'd paid for with Workday: more work product. With a press of a button, HR could provide that, but the entire company suffered thousands more hours of busy work. Automation made it too easy to do too much. (Workday's "customers choose the frequency at which they conduct reviews, not Workday," said the spokesperson.)
At the scale of a large company, this is simply too much work to expect a few people to do and far too user-specific to expect automation to handle well. It's why Workday can be the worst while still allowing that Paychex is the worst, Paycom is the worst, Paycor is the worst, and Dayforce is the worst. "HR software sucking" is a big tent.
Workday finds itself between enshittification steps two and three. The platform once made things faster and simpler for workers. But today it abuses workers by cutting corners on job-application and reimbursement procedures. In the process, it provides the value of a one-stop HR shop to its paying customers. It seems it's only a matter of time before Workday and its competitors try to split the difference and cut those same corners with the accounts that pay their bills.
Workday reveals what's important to the people who run Fortune 500 companies: easily and conveniently distributing busy work across large workforces. This is done with the arbitrary and perfunctory performance of work tasks (like excessive reviews) and with the throttling of momentum by making finance and HR tasks difficult. If your expenses and reimbursements are difficult to file, that's OK, because the people above you don't actually care if you get reimbursed. If it takes applicants 128% longer to apply, the people who implemented Workday don't really care. Throttling applicants is perhaps not intentional, but it's good for the company.
·businessinsider.com·
The most hated workplace software on the planet
Western Music Isn't What You Think
Western Music Isn't What You Think
Western culture and music have been heavily influenced by outside, non-Western sources, contrary to common perceptions. The author argues that diversity and cross-cultural exchange are key strengths of Western culture.
·honest-broker.com·
Western Music Isn't What You Think
Looking for AI use-cases — Benedict Evans
Looking for AI use-cases — Benedict Evans
  • LLMs have impressive capabilities, but many people struggle to find immediate use-cases that match their own needs and workflows.
  • Realizing the potential of LLMs requires not just technical advancements, but also identifying specific problems that can be automated and building dedicated applications around them.
  • The adoption of new technologies often follows a pattern of initially trying to fit them into existing workflows, before eventually changing workflows to better leverage the new tools.
if you had shown VisiCalc to a lawyer or a graphic designer, their response might well have been ‘that’s amazing, and maybe my book-keeper should see this, but I don’t do that’. Lawyers needed a word processor, and graphic designers needed (say) Postscript, Pagemaker and Photoshop, and that took longer.
I’ve been thinking about this problem a lot in the last 18 months, as I’ve experimented with ChatGPT, Gemini, Claude and all the other chatbots that have sprouted up: ‘this is amazing, but I don’t have that use-case’.
A spreadsheet can’t do word processing or graphic design, and a PC can do all of those but someone needs to write those applications for you first, one use-case at a time.
no matter how good the tech is, you have to think of the use-case. You have to see it. You have to notice something you spend a lot of time doing and realise that it could be automated with a tool like this.
Some of this is about imagination, and familiarity. It reminds me a little of the early days of Google, when we were so used to hand-crafting our solutions to problems that it took time to realise that you could ‘just Google that’.
This is also, perhaps, matching a classic pattern for the adoption of new technology: you start by making it fit the things you already do, where it’s easy and obvious to see that this is a use-case, if you have one, and then later, over time, you change the way you work to fit the new tool.
The concept of product-market fit is that normally you have to iterate your idea of the product and your idea of the use-case and customer towards each other - and then you need sales.
Meanwhile, spreadsheets were both a use-case for a PC and a general-purpose substrate in their own right, just as email or SQL might be, and yet all of those have been unbundled. The typical big company today uses hundreds of different SaaS apps, all of them, so to speak, unbundling something out of Excel, Oracle or Outlook. All of them, at their core, are an idea for a problem and an idea for a workflow to solve that problem, that is easier to grasp and deploy than saying ‘you could do that in Excel!’ Rather, you instantiate the problem and the solution in software - ‘wrap it’, indeed - and sell that to a CIO. You sell them a problem.
there’s a ‘Cambrian Explosion’ of startups using OpenAI or Anthropic APIs to build single-purpose dedicated apps that aim at one problem and wrap it in hand-built UI, tooling and enterprise sales, much as a previous generation did with SQL.
Back in 1982, my father had one (1) electric drill, but since then tool companies have turned that into a whole constellation of battery-powered electric hole-makers. Once upon a time every startup had SQL inside, but that wasn’t the product, and now every startup will have LLMs inside.
people are still creating companies based on realising that X or Y is a problem, realising that it can be turned into pattern recognition, and then going out and selling that problem.
A GUI tells the users what they can do, but it also tells the computer everything we already know about the problem, and with a general-purpose, open-ended prompt, the user has to think of all of that themselves, every single time, or hope it’s already in the training data. So, can the GUI itself be generative? Or do we need another whole generation of Dan Bricklins to see the problem, and then turn it into apps, thousands of them, one at a time, each of them with some LLM somewhere under the hood?
The change would be that these new use-cases would be things that are still automated one-at-a-time, but that could not have been automated before, or that would have needed far more software (and capital) to automate. That would make LLMs the new SQL, not the new HAL9000.
·ben-evans.com·
Looking for AI use-cases — Benedict Evans
How McKinsey Destroyed the Middle Class - The Atlantic
How McKinsey Destroyed the Middle Class - The Atlantic

The rise of management consulting firms like McKinsey played a pivotal role in disempowering the American middle class by promoting corporate restructuring that concentrated power and wealth in the hands of elite managers while stripping middle managers and workers of their decision-making roles, job security, and opportunities for career advancement.

Key topics:

  • Management consulting's role in reshaping corporate America
  • The decline of the middle class and the rise of corporate elitism
  • McKinsey's influence on corporate restructuring and inequality
  • The shift from lifetime employment to precarious jobs
  • The erosion of corporate social responsibility
  • The role of management consulting in perpetuating economic inequality
what consequences has the rise of management consulting had for the organization of American business and the lives of American workers? The answers to these questions put management consultants at the epicenter of economic inequality and the destruction of the American middle class.
Managers do not produce goods or deliver services. Instead, they plan what goods and services a company will provide, and they coordinate the production workers who make the output. Because complex goods and services require much planning and coordination, management (even though it is only indirectly productive) adds a great deal of value. And managers as a class capture much of this value as pay. This makes the question of who gets to be a manager extremely consequential.
In the middle of the last century, management saturated American corporations. Every worker, from the CEO down to production personnel, served partly as a manager, participating in planning and coordination along an unbroken continuum in which each job closely resembled its nearest neighbor.
Even production workers became, on account of lifetime employment and workplace training, functionally the lowest-level managers. They were charged with planning and coordinating the development of their own skills to serve the long-run interests of their employers.
At McDonald’s, Ed Rensi worked his way up from flipping burgers in the 1960s to become CEO. More broadly, a 1952 report by Fortune magazine found that two-thirds of senior executives had more than 20 years’ service at their current companies.
Top executives enjoyed commensurately less control and captured lower incomes. This democratic approach to management compressed the distribution of income and status. In fact, a mid-century study of General Motors published in the Harvard Business Review—completed, in a portent of what was to come, by McKinsey’s Arch Patton—found that from 1939 to 1950, hourly workers’ wages rose roughly three times faster than elite executives’ pay. The management function’s wide diffusion throughout the workforce substantially built the mid-century middle class.
The earliest consultants were engineers who advised factory owners on measuring and improving efficiency at the complex factories required for industrial production. The then-leading firm, Booz Allen, did not achieve annual revenues of $2 million until after the Second World War. McKinsey, which didn’t hire its first Harvard M.B.A. until 1953, retained a diffident and traditional ethos
A new ideal of shareholder primacy, powerfully championed by Milton Friedman in a 1970 New York Times Magazine article entitled “The Social Responsibility of Business is to Increase its Profits,” gave the newly ambitious management consultants a guiding purpose. According to this ideal, in language eventually adopted by the Business Roundtable, “the paramount duty of management and of boards of directors is to the corporation’s stockholders.” During the 1970s, and accelerating into the ’80s and ’90s, the upgraded management consultants pursued this duty by expressly and relentlessly taking aim at the middle managers who had dominated mid-century firms, and whose wages weighed down the bottom line.
Management consultants thus implemented and rationalized a transformation in the American corporation. Companies that had long affirmed express “no layoff” policies now took aim at what the corporate raider Carl Icahn, writing in The New York Times in the late 1980s, called “corporate bureaucracies” run by “incompetent” and “inbred” middle managers. They downsized in response not to particular business problems but rather to a new managerial ethos and methods; they downsized when profitable as well as when struggling, and during booms as well as busts.
Downsizing was indeed wrenching. When IBM abandoned lifetime employment in the 1990s, local officials asked gun-shop owners around its headquarters to close their stores while employees absorbed the shock.
In some cases, downsized employees have been hired back as subcontractors, with no long-term claim on the companies and no role in running them. When IBM laid off masses of workers in the 1990s, for example, it hired back one in five as consultants. Other corporations were built from scratch on a subcontracting model. The clothing brand United Colors of Benetton has only 1,500 employees but uses 25,000 workers through subcontractors.
Shift from lifetime employment to reliance on outsourced labor; decline in unions
The shift from permanent to precarious jobs continues apace. Buttigieg’s work at McKinsey included an engagement for Blue Cross Blue Shield of Michigan, during a period when it considered cutting up to 1,000 jobs (or 10 percent of its workforce). And the gig economy is just a high-tech generalization of the sub-contractor model. Uber is a more extreme Benetton; it deprives drivers of any role in planning and coordination, and it has literally no corporate hierarchy through which drivers can rise up to join management.
In effect, management consulting is a tool that allows corporations to replace lifetime employees with short-term, part-time, and even subcontracted workers, hired under ever more tightly controlled arrangements, who sell particular skills and even specified outputs, and who manage nothing at all.
the managerial control stripped from middle managers and production workers has been concentrated in a narrow cadre of executives who monopolize planning and coordination. Mid-century, democratic management empowered ordinary workers and disempowered elite executives, so that a bad CEO could do little to harm a company and a good one little to help it.
Whereas at mid-century a typical large-company CEO made 20 times a production worker’s income, today’s CEOs make nearly 300 times as much. In a recent year, the five highest-paid employees of the S&P 1500 (7,500 elite executives overall), obtained income equal to about 10 percent of the total profits of the entire S&P 1500.
as Kiechel put it dryly, “we are not all in this together; some pigs are smarter than other pigs and deserve more money.” Consultants seek, in this way, to legitimate both the job cuts and the explosion of elite pay. Properly understood, the corporate reorganizations were, then, not merely technocratic but ideological.
corporate reorganizations have deprived companies of an internal supply of managerial workers. When restructurings eradicated workplace training and purged the middle rungs of the corporate ladder, they also forced companies to look beyond their walls for managerial talent—to elite colleges, business schools, and (of course) to management-consulting firms. That is to say: The administrative techniques that management consultants invented created a huge demand for precisely the services that the consultants supply.
Consulting, like law school, is an all-purpose status giver—“low in risk and high in reward,” according to the Harvard Crimson. McKinsey also hopes that its meritocratic excellence will legitimate its activities in the eyes of the broader world. Management consulting, Kiechel observed, acquired its power and authority not from “silver-haired industry experience but rather from the brilliance of its ideas and the obvious candlepower of the people explaining them, even if those people were twenty-eight years old.”
A deeper objection to Buttigieg’s association with McKinsey concerns not whom the firm represents but the central role the consulting revolution has played in fueling the enormous economic inequalities that now threaten to turn the United States into a caste society.
Meritocrats like Buttigieg changed not just corporate strategies but also corporate values.
GM may aspire to build good cars; IBM, to make typewriters, computers, and other business machines; and AT&T, to improve communications. Executives who rose up through these companies, on the mid-century model, were embedded in their firms and embraced these values, so that they might even have come to view profits as a salutary side effect of running their businesses well.
When management consulting untethered executives from particular industries or firms and tied them instead to management in general, it also led them to embrace the one thing common to all corporations: making money for shareholders. Executives raised on the new, untethered model of management aim exclusively and directly at profit: their education, their career arc, and their professional role conspire to isolate them from other workers and train them single-mindedly on the bottom line.
American democracy, the left believes, cannot be rejuvenated by persuading elites to deploy their excessive power somehow more benevolently. Instead, it requires breaking the stranglehold that elites have on our economics and politics, and reempowering everyone else.
·archive.is·
How McKinsey Destroyed the Middle Class - The Atlantic
From Tech Critique to Ways of Living — The New Atlantis
From Tech Critique to Ways of Living — The New Atlantis
Yuk Hui's concept of "cosmotechnics" combines technology with morality and cosmology. Inspired by Daoism, it envisions a world where advanced tech exists but cultures favor simpler, purposeful tools that guide people towards contentment by focusing on local, relational, and ironic elements. A Daoist cosmotechnics points to alternative practices and priorities: learning how to live from nature rather than treating it as a resource to be exploited, and valuing embodied relation over abstract information.
We might think of the shifting relationship of human beings to the natural world in the terms offered by German sociologist Gerd-Günter Voß, who has traced our movement through three different models of the “conduct of life.”
The first, and for much of human history the only conduct of life, is what he calls the traditional. Your actions within the traditional conduct of life proceed from social and familial circumstances, from what is thus handed down to you. In such a world it is reasonable for family names to be associated with trades, trades that will be passed down from father to son: Smith, Carpenter, Miller.
But the rise of the various forces that we call “modernity” led to the emergence of the strategic conduct of life: a life with a plan, with certain goals — to get into law school, to become a cosmetologist, to get a corner office.
thanks largely to totalizing technology’s formation of a world in which, to borrow a phrase from Marx and Engels, “all that is solid melts into air,” the strategic model of conduct is replaced by the situational. Instead of being systematic planners, we become agile improvisers: If the job market is bad for your college major, you turn a side hustle into a business. But because you know that your business may get disrupted by the tech industry, you don’t bother thinking long-term; your current gig might disappear at any time, but another will surely present itself, which you will assess upon its arrival.
The movement through these three forms of conduct, whatever benefits it might have, makes our relations with nature increasingly instrumental. We can see this shift more clearly when looking at our changing experience of time
Within the traditional conduct of life, it is necessary to take stewardly care of the resources required for the exercise of a craft or a profession, as these get passed on from generation to generation.
But in the progression from the traditional to the strategic to the situational conduct of life, continuity of preservation becomes less valuable than immediacy of appropriation: We need more lithium today, and merely hope to find greater reserves — or a suitable replacement — tomorrow. This revaluation has the effect of shifting the place of the natural order from something intrinsic to our practices to something extrinsic. The whole of nature becomes what economists tellingly call an externality.
The basic argument of the SCT (the Standard Critique of Technology) goes like this. We live in a technopoly, a society in which powerful technologies come to dominate the people they are supposed to serve, and reshape us in their image. These technologies, therefore, might be called prescriptive (to use Franklin’s term) or manipulatory (to use Illich’s). For example, social networks promise to forge connections — but they also encourage mob rule.
all things increasingly present themselves to us as technological: we see them and treat them as what Heidegger calls a “standing reserve,” supplies in a storeroom, as it were, pieces of inventory to be ordered and conscripted, assembled and disassembled, set up and set aside
In his exceptionally ambitious book The Question Concerning Technology in China (2016) and in a series of related essays and interviews, Hui argues, as the title of his book suggests, that we go wrong when we assume that there is one question concerning technology, the question, that is universal in scope and uniform in shape. Perhaps the questions are different in Hong Kong than in the Black Forest. Similarly, the distinction Heidegger draws between ancient and modern technology — where with modern technology everything becomes a mere resource — may not universally hold.
Thesis: Technology is an anthropological universal, understood as an exteriorization of memory and the liberation of organs, as some anthropologists and philosophers of technology have formulated it; Antithesis: Technology is not anthropologically universal; it is enabled and constrained by particular cosmologies, which go beyond mere functionality or utility. Therefore, there is no one single technology, but rather multiple cosmotechnics.
Cosmotechnics is the integration of a culture's worldview and ethical framework with its technological practices, illustrating that technology is not just about functionality but also embodies a way of life realized through making.
I think Hui’s cosmotechnics, generously leavened with the ironic humor intrinsic to Daoism, provides a genuine Way — pun intended — beyond the limitations of the Standard Critique of Technology. I say this even though I am not a Daoist; I am, rather, a Christian. But it should be noted that Daoism is both daojiao, an organized religion, and daojia, a philosophical tradition. It is daojia that Hui advocates, which makes the wisdom of Daoism accessible and attractive to a Christian like me. Indeed, I believe that elements of daojia are profoundly consonant with Christianity, and yet underdeveloped in the Christian tradition, except in certain modes of Franciscan spirituality, for reasons too complex to get into here.
this technological Daoism, as an embodiment of daojia, is accessible to people of any religious tradition or none. It provides a comprehensive and positive account of the world and one’s place in it that makes a different approach to technology more plausible and compelling. The SCT tends only to gesture in the direction of a model of human flourishing, evokes it mainly by implication, whereas Yuk Hui’s Daoist model gives an explicit and quite beautiful account.
The application of Daoist principles is most obvious, as the above exposition suggests, for “users” who would like to graduate to the status of “non-users”: those who quietly turn their attention to more holistic and convivial technologies, or who simply sit or walk contemplatively. But in the interview I quoted from earlier, Hui says, “Some have quipped that what I am speaking about is Daoist robots or organic AI” — and this needs to be more than a quip. Peter Thiel’s longstanding attempt to make everyone a disciple of René Girard is a dead end. What we need is a Daoist culture of coders, and people devoted to “action without acting” making decisions about lithium mining.
Tools that do not contribute to the Way will neither be worshipped nor despised. They will simply be left to gather dust as the people choose the tools that will guide them in the path of contentment and joy: utensils to cook food, devices to make clothes. Of course, the food of one village will differ from that of another, as will the clothing. Those who follow the Way will dwell among the “ten thousand things” of this world — what we call nature — in a certain manner that cannot be specified legally: Verse 18 of the Tao says that when virtue arises only from rules, that is a sure sign that the Way is not present and active. A cosmotechnics is a living thing, always local in the specifics of its emergence in ways that cannot be specified in advance.
It is from the ten thousand things that we learn how to live among the ten thousand things; and our choice of tools will be guided by what we have learned from that prior and foundational set of relations. This is cosmotechnics.
Multiplicity avoids the universalizing, totalizing character of technopoly. The adherents of technopoly, Hui writes, “wishfully believ[e] that the world process will stamp out differences and diversities” and thereby achieve a kind of techno-secular “theodicy,” a justification of the ways of technopoly to its human subjects. But the idea of multiple cosmotechnics is also necessary, Hui believes, in order to avoid the simply delusional attempt to find “a way out of modernity” by focusing on the indigenous or biological “Other.” An aggressive hostility to modernity and a fetishizing of pre-modernity is not the Daoist way.
“I believe that to overcome modernity without falling back into war and fascism, it is necessary to reappropriate modern technology through the renewed framework of a cosmotechnics.” His project “doesn’t refuse modern technology, but rather looks into the possibility of different technological futures.”
“Thinking rooted in the earthy virtue of place is the motor of cosmotechnics. However, for me, this discourse on locality doesn’t mean a refusal of change and of progress, or any kind of homecoming or return to traditionalism; rather, it aims at a re-appropriation of technology from the perspective of the local and a new understanding of history.”
Always Coming Home illustrates cosmotechnics in a hundred ways. Consider, for instance, information storage and retrieval. At one point we meet the archivist of the Library of the Madrone Lodge in the village of Wakwaha-na. A visitor from our world is horrified to learn that while the library gives certain texts and recordings to the City of Mind, some of their documents they simply destroy. “But that’s the point of information storage and retrieval systems! The material is kept for anyone who wants or needs it. Information is passed on — the central act of human culture.” But that is not how the librarian thinks about it. “Tangible or intangible, either you keep a thing or you give it. We find it safer to give it” — to practice “unhoarding.”
It is not information, but relation. This too is cosmotechnics.
The modern technological view treats information as a resource to be stored and optimized. But the archivist in Le Guin's Daoist-inspired society takes a different approach, one where documents can be freely discarded because what matters is not the hoarding of information but the living of life in sustainable relation
a cosmotechnics is the point at which a way of life is realized through making. The point may be illustrated with reference to an ancient tale Hui offers, about an excellent butcher who explains to a duke what he calls the Dao, or “way,” of butchering. The reason he is a good butcher, he says, is not his mastery of a skill, or his reliance on superior tools. He is a good butcher because he understands the Dao: Through experience he has come to rely on his intuition to thrust the knife precisely where it does not cut through tendons or bones, and so his knife always stays sharp. The duke replies: “Now I know how to live.” Hui explains that “it is thus the question of ‘living,’ rather than that of technics, that is at the center of the story.”
·thenewatlantis.com·
From Tech Critique to Ways of Living — The New Atlantis
Hate is the New Sex
Hate is the New Sex
These days hate has roughly the same role in popular culture that original sin has in traditional Christian theology. If you want to slap the worst imaginable label on an organization, you call it a hate group. If you want to push a category of discourse straight into the realm of the utterly unacceptable, you call it hate speech. If you’re speaking in public and you want to be sure that everyone in the crowd will beam approval at you, all you have to do is denounce hate.
At the far end of this sort of rhetoric, you get the meretricious slogan used by Hillary Clinton’s unsuccessful presidential campaign last year: LOVE TRUMPS HATE. I hope that none of my readers are under the illusion that Clinton’s partisans were primarily motivated by love, except in the sense of Clinton’s love for power and the Democrats’ love for the privileges and payouts they could expect from four more years of control of the White House; and of course Trump and the Republicans were head over heels in love with the same things. The fact that Clinton’s marketing flacks and focus groups thought that the slogan just quoted would have an impact on the election, though, shows just how pervasive the assumption I’m discussing has become in our culture.
what happens when people decide that some common human emotion is evil and harmful and wrong, and decide that the way to make a better world is to get rid of it?
The example I have in mind is the attitude, prevalent in the English-speaking world from the middle of the nineteenth century to the middle of the twentieth, that sex was the root of all evil.
I know that comparing current attitudes toward hate with Victorian attitudes toward sex will inspire instant pushback from a good many of my readers. After all, sexual desire is natural and normal and healthy, while hate is evil and harmful and wrong, right? Here again, it’s easy to lose track of the fact that people a century and a quarter ago—most likely including your ancestors, dear reader, if they happened to live in the English-speaking world—saw things the other way around. To them, hate was an ordinary emotion that most people had under certain circumstances, but sexual desire was beyond the pale: beastly, horrid, filthy, and so on through an impressive litany of unpleasant adjectives.
Make something forbidden and you make it desirable. Take a normal human emotional state, one that everyone experiences, and make it forbidden, and you guarantee that the desire to violate the taboo will take on overwhelming power. That’s why, after spending their days subject to the pervasive tone policing of contemporary life, in which every utterance gets scrutinized for the least trace of anything that anyone anywhere could conceivably interpret as hateful, so many people in today’s world don internet aliases and go to online forums where they can blurt out absolutely anything
The opposite of one bad idea, after all, is usually another bad idea; the fact that dying of thirst is bad for you doesn’t make drowning good for you; whether we’re talking about sex or anything else, there’s a space somewhere between “not enough” and “too much,” between pathological repression and equally pathological expression, that’s considerably healthier than either of the extremes. I’m going to risk causing my more sensitive readers to clutch their smelling salts and faint on the nearest sofa, in true Victorian style, by suggesting that the same thing’s true of hate.
Hate is like sex; there are certain times, places, and contexts where it’s appropriate, but there are many, many others where it’s not. You can recognize its place in life without having to act it out on every occasion—and in fact, the more conscious you are of its place in life, the more completely you acknowledge it and give it its due, the less likely you are to get blindsided by it. That’s true of sex, and it’s true of hate: what you refuse to acknowledge controls you; what you acknowledge, you can learn to control.
the blind faith that goodness requires amputation is so unquestioned in our time.
Human beings are never going to be perfect, not if perfection means the amputation of some part of human experience, whether the limb that’s being hacked off is our sexual instincts, our aggressive instincts, or any other part of who and what we are.
We can accept our sexuality, whatever that happens to be, and weave it into the pattern of our individual lives and our relationships with other people in ways that uphold the values we cherish and yield as much joy and as little unnecessary pain for as many people as possible. That doesn’t mean always acting out our desires—in some cases, it can mean never acting them out at all
·ecosophia.net·
Hate is the New Sex