Why Storytelling by Tony Fadell
Steve didn’t just read a script for the presentation. He’d been telling a version of that same story every single day for months and months during development—to us, to his friends, his family. He was constantly working on it, refining it. Every time he’d get a puzzled look or a request for clarification from his unwitting early audience, he’d sand it down, tweak it slightly, until it was perfectly polished.
He talked for a while about regular mobile phones and smartphones and the problems of each before he dove into the features of the new iPhone. He used a technique I later came to call the virus of doubt. It’s a way to get into people’s heads, remind them about a daily frustration, get them annoyed about it all over again. If you can infect them with the virus of doubt—“maybe my experience isn’t as good as I thought, maybe it could be better”—then you prime them for your solution. You get them angry about how it works now so they can get excited about a new way of doing things.
when I say “story,” I don’t just mean words. Your product’s story is its design, its features, images and videos, quotes from customers, tips from reviewers, conversations with support agents. It’s the sum of what people see and feel about this thing that you’ve created.
When you get wrapped up in the “what,” you get ahead of people. You think everyone can see what you see. But they don’t. They haven’t been working on it for weeks, months, years. So you need to pause and clearly articulate the “why” before you can convince anyone to care about the “what.”
That’s the case no matter what you make—even if you sell B2B payments software. Even if you build deep-tech solutions for customers who don’t exist yet. Even if you sell lubricants to a factory that’s been buying the same thing for twenty years.
If your competitors are telling better stories than you, if they’re playing the game and you’re not, then it doesn’t matter if their product is worse. They will get the attention. To any customers, investors, partners, or talent doing a cursory search, they will appear to be the leaders in the category. The more people talk about them, the greater their mind share, and the more people will talk about them.
A good story is an act of empathy. It recognizes the needs of its audience. And it blends facts and feelings so the customer gets enough of both. First you need enough instincts and concrete information that your argument doesn’t feel too floaty and insubstantial. It doesn’t have to be definitive data, but there has to be enough to feel meaty, to convince people that you’re anchored in real facts. But you can overdo it—if your story is only informational, then it’s entirely possible that people will agree with you but decide it’s not compelling enough to act on just yet. Maybe next month. Maybe next year.
So you have to appeal to their emotions—connect with something they care about. Their worries, their fears. Or show them a compelling vision of the future: give a human example. Walk through how a real person will experience this product—their day, their family, their work, the change they’ll experience. Just don’t lean so far into the emotional connection that what you’re arguing for feels novel, but not necessary.
And always remember that your customers’ brains don’t always work like yours. Sometimes your rational argument will make an emotional connection. Sometimes your emotional story will give people the rational ammunition to buy your product. Certain Nest customers looked at the beautiful thermostat that we lovingly crafted to appeal to their heart and soul and said, “Sure, okay. It’s pretty” and then had a thrilled, emotional reaction to the potential of saving twenty-three dollars on their energy bill.
everyone will read your story differently. That’s why analogies can be such a useful tool in storytelling. They create a shorthand for complicated concepts—a bridge directly to a common experience.
That’s another thing I learned from Steve Jobs. He’d always say that analogies give customers superpowers. A great analogy allows a customer to instantly grasp a difficult feature and then describe that feature to others. That’s why “1,000 songs in your pocket” was so powerful. Everyone had CDs and tapes in bulky players that only let you listen to 10-15 songs, one album at a time. So “1,000 songs in your pocket” was an incredible contrast—it let people visualize this intangible thing—all the music they loved all together in one place, easy to find, easy to hold—and gave them a way to tell their friends and family why this new iPod thing was so cool.
Because to truly understand many of the features of our products, you’d need a deep well of knowledge about HVAC systems and power grids and the way smoke refracts through a laser to detect fire—knowledge almost nobody had. So we cheated. We didn’t try to explain everything. We just used an analogy. I remember there was one complex feature that was designed to lighten the load on power plants on the hottest or coldest days of the year when everyone cranked up the heat or AC at once. It usually came down to just a few hours in the afternoon, a few days a year—one or more coal power plants would be brought on line to avoid blackouts. So we designed a feature that predicted when these moments would come; the Nest Thermostat would then crank the AC or heat up extra before the crucial peak hours and turn it down when everyone else was turning it up. Anyone who signed up for the program got a credit on their energy bill. As more and more people joined the program, the result was a win-win—people stayed comfortable, they saved money, and the energy companies didn’t have to turn on their dirtiest plants. And that is all well and good, but it just took me 150 words to explain. So after countless hours of thinking about it and trying all the possible solutions, we settled on doing it in three: Rush Hour Rewards.
Everyone understands the concept of rush hour—the moment when way too many people get on the road together and traffic slows to a creep. Same thing happens with energy. We didn’t need to explain much more than that—rush hours are a problem, but when there’s an energy rush hour, you can get something out of it. You can get a reward. You can actually save money rather than getting stuck with everyone else.
Quick stories are easy to remember. And, more importantly, easy to repeat. Someone else telling your story will always reach more people and do more to convince them to buy your product than any amount of talking you do about yourself on your own platforms. You should always be striving to tell a story so good that it stops being yours—so your customer learns it, loves it, internalizes it, owns it. And tells it to everyone they know.
A good product story has three elements: It appeals to people’s rational and emotional sides. It takes complicated concepts and makes them simple. It reminds people of the problem that’s being solved—it focuses on the “why.”
·founderstribune.org·
Just How Queer Is Luca Guadagnino’s Queer Anyway?
Guadagnino reminded me that as we come of age, we decide for ourselves what informs us, and spoke to the first time he read Burroughs. “You enter into the language of Burroughs and you understand, at 17 years old, that there are ways we can express ourselves that are so wide, sophisticated, complicated, and that you never have to adapt to a logic that is preordained.”
Burroughs in fact traveled there in 1952; The Yage Letters chronicles his experiments in his letters to Ginsberg. He was obsessed with the idea that yage could enhance telepathy. In the hallucinatory new scenes, the connection between Lee and Allerton goes to places the earthbound book could never take it.
When the screenplay is his own, firmly in Guadagnino’s hands, it’s actually fabulous — and a relief after the earlier conflict between the director and his material. At the same time, it makes no sense. That’s the most Burroughsian nod in this film: the sheer randomness and trippy outrageousness of the end. It’s very Naked Lunch — both the book and David Cronenberg’s 1991 film inspired by Burroughs, which was clearly on Guadagnino’s mind.
It’s paying more of a tribute to an adaptation of a different Burroughs book, a film that feels genuinely Burroughsian but has less of a basis in the underlying text than his own. Something is off, the essential is missing, and this may be why I didn’t feel Burroughs’s spirit.
still, I wept through scenes of Guadagnino’s film — including a hallucinatory reference to Joan’s death in which Lee does the same failed William Tell routine with Allerton — but it wasn’t for Joan or Burroughs; it was for James’s lover Michael Emerton, who killed himself with a gun. I wept as this beautifully designed movie, with gorgeous men in well-cut suits, gave me time to think about the karmic connections that both blessed and cursed me. I wept for Billy Jr., whose mother Burroughs had killed. Then I wept for Burroughs, and I wept for Joan.
I wept for the portrayal of transactional sex that was the “romance” the director referred to. I wept as I questioned notions of intent and integrity in transactional relationships: mine with younger, troubled men who lived on the fringes of gay culture; Burroughs’s with James; and James’s with me. Those relationships, for better or worse, follow the karmic path laid down for me 40-plus years ago. That karma, at least for me, as I flew through the past making sense of it, was neutralized by the acceptance of its very existence, its painful impact on me and those affected by it, and, finally, by releasing it. That was Guadagnino’s gift to me.
Most poignantly, I wept for James, who lives alone, unable to walk, with a brain injury that was inflicted during a gay bashing and made worse by falls at home and further concussions. But there has been some nice news for him, as a double LP of his work as a singer-songwriter is being released on Lotuspool Records. And he told me he liked Guadagnino’s Queer — though he quibbled with the casting and look of Allerton — and that’s even better news. Guadagnino liked hearing that.
On the Zoom with Guadagnino and Anderson, I wanted to ask about legacy. Are there responsibilities we who make art or work in the arts have to our elders, to the radical spirits who pushed open the doors? I mentioned the affluent gay men, usually heteronormatively married, who “rent a womb” and maybe buy an egg to drop in it so their children have their genes — all of which seems to me to be the furthest thing from queer. In response, some signifiers were mentioned. Anderson spoke to the look of the film, citing George Platt Lynes’s influence; they both chimed in about Powell and Pressburger (the Archers), makers of The Red Shoes; I mentioned Rainer Werner Fassbinder’s adaptation of Jean Genet’s Querelle, which Guadagnino said, indeed, influenced him. The point has been missed, and the clock is ticking. I move on, disappointed.
Will this film ignite a radical spark in younger viewers — be they queer or not? That’s what Burroughs did for me and for many, many of his readers.
The craftsmanship of the film is sterling on many levels. But it is not the book I know by the writer I knew so well. It is stylish in the modality of fashion — having a “look”; it is beautiful in its entirety as a complete visual construction. It is, essentially, a gay location film. It is romantic, something of a travelogue — you might want to go where it is set, eat at the restaurants, while wearing the clothing, certainly in the company of some of the flawless boys cast. But it is not the world that the book conjures for most readers, certainly not me. This is the work of the director — as any film should be.
Still, a bad match of director and material renders confusion at best, emptiness at worst; I worried that this film could potentially misconstrue the importance of Burroughs’s role as a visionary queer writer for future generations. I was incapable of explaining this to Guadagnino and Anderson, in our 20-minute Zoom, not to mention it might have stopped the interview. But I tried.
It wasn’t just the peculiar casting of a beefy daddy like Daniel Craig as the Burroughs character, William Lee, or pretty Drew Starkey as the aloof, younger love interest, Eugene Allerton, who spends the film looking great in fabulous knitwear by Jonathan Anderson, Guadagnino’s friend and the film’s costume designer, but looking nothing like the image of the character I had in my head.
·vulture.com·
The Fury
Tracking Esther down at an after-hours club and marvelling at her artistry, he resolves to propel her into pictures. The number she performs at the club, “The Man That Got Away,” is one of the most astonishing, emotionally draining musical productions in Hollywood history, both for Garland’s electric, spontaneous performance and for Cukor’s realization of it. The song itself, by Harold Arlen and Ira Gershwin, is the apotheosis of the torch song, and Garland kicks its drama up to frenzied intensity early on, as much with the searing pathos of her voice as with convulsive, angular gestures that look like an Expressionist painting come to life. (Her fury prefigures the psychodramatic forces unleashed by Gena Rowlands in the films of her husband, John Cassavetes.) Cukor, who had first worked wonders with Garland in the early days of “The Wizard of Oz” (among other things, he removed her makeup, a gesture repeated here by Maine), captures her performance in a single, exquisitely choreographed shot, with the camera dollying back to reveal the band, in shadow, with spotlights gleaming off the bells of brass instruments and the chrome keys of woodwinds.
·newyorker.com·
Bernie Would Have Won

AI summary: This article argues that Trump's 2024 victory represents the triumph of right-wing populism over neoliberalism, enabled by Democratic Party leadership's deliberate suppression of Bernie Sanders' left-wing populist movement. The piece contends that by rejecting class-focused politics in favor of identity politics and neoliberal policies, Democrats created a vacuum that Trump's authoritarian populism filled.

Here’s a warning and an admonition written in January 2019 by author and organizer Jonathan Smucker: “If the Dem Party establishment succeeds in beating down the fresh leadership and bold vision that's stepping up, it will effectively enable the continued rise of authoritarianism. But they will not wake up and suddenly grasp this. It's on us to outmaneuver them and win.”
There are a million surface-level reasons for Kamala Harris’s loss and systematic underperformance in pretty much every county and among nearly every demographic group. She is part of a deeply unpopular administration. Voters believe the economy is bad and that the country is on the wrong track. She is a woman and we still have some work to do as a nation to overcome long-held biases. But the real problems for the Democrats go much deeper and require a dramatic course correction of a sort that, I suspect, Democrats are unlikely to embark upon. The bottom line is this: Democrats are still trying to run a neoliberal campaign in a post-neoliberal era. In other words, 2016 Bernie was right.
The lie that fueled the Iraq war destroyed confidence in the institutions that were the bedrock of this neoliberal order and in the idea that the U.S. could or should remake the world in our image. Even more devastating, the financial crisis left home owners destitute while banks were bailed out, revealing that there was something deeply unjust in a system that placed capital over people.
These events sparked social movements on both the right and the left. The Tea Party churned out populist-sounding politicians like Sarah Palin and birtherist conspiracies about Barack Obama, paving the way for the rise of Donald Trump. The Tea Party and Trumpism are not identical, of course, but they share a cast of villains: The corrupt bureaucrats or deep state. The immigrants supposedly changing your community. The cultural elites telling you your beliefs are toxic. Trump’s version of this program is also explicitly authoritarian. This authoritarianism is a feature not a bug for some portion of the Trump coalition which has been persuaded that democracy left to its own devices could pose an existential threat to their way of life.
On the left, the organic response to the financial crisis was Occupy Wall Street, which directly fueled the Bernie Sanders movement. Here, too, the villains were clear. In the language of Occupy it was the 1%, or as Bernie put it, the millionaires and billionaires. It was the economic elite and unfettered capitalism that had made it so hard to get by. Turning homes into assets of financial speculation. Wildly profiteering off of every element of our healthcare system. Busting unions so that working people had no collective power. This movement, in contrast to the right, was explicitly pro-democracy, with a foundational view that in a contest between the 99% and the 1%, the 99% would prevail. And that a win would lead to universal programs like Medicare for All, free college, workplace democracy, and a significant hike in the minimum wage.
On the Republican side, Donald Trump emerged as a political juggernaut at a time when the party was devastated and rudderless, having lost to Obama twice in a row. This weakened state—and the fact that the Trump alternatives were uncharismatic drips like Jeb Bush—created a path for Trump to successfully execute a hostile takeover of the party.
Plus, right-wing populism embraces capital, and so it posed no real threat to the monied interests that are so influential within the party structures.
The Republican donor class was not thrilled with Trump’s chaos and lack of decorum but they did not view him as an existential threat to their class interests.
The difference was that Bernie’s party takeover did pose an existential threat—both to party elites who he openly antagonized and to the party’s big money backers. The bottom line of the Wall Street financiers and corporate titans was explicitly threatened. His rise would simply not be allowed. Not in 2016 and not in 2020.
What’s more, Hillary Clinton and her allies launched a propaganda campaign to posture as if they were actually to the left of Bernie by labeling him and his supporters sexist and racist for centering class politics over identity politics. This in turn spawned a hell cycle of woke word-policing and demographic slicing and dicing and antagonism towards working class whites that only made the Democratic party more repugnant to basically everyone.
The path not taken in 2016 looms larger than ever. Bernie’s coalition was filled with the exact type of voters who are now flocking to Donald Trump: Working class voters of all races, young people, and, critically, the much-derided bros. The top contributors to Bernie’s campaign often held jobs at places like Amazon and Walmart. The unions loved him. And—never forget—he earned the coveted Joe Rogan endorsement that Trump also received the day before the election this year. It turns out, the Bernie-to-Trump pipeline is real! While that has always been used as an epithet to smear Bernie and his movement, with the implication that social democracy is just a cover for or gateway drug to right wing authoritarianism, the truth is that this pipeline speaks to the power and appeal of Bernie’s vision as an effective antidote to Trumpism. When these voters had a choice between Trump and Bernie, they chose Bernie. For many of them now that the choice is between Trump and the dried out husk of neoliberalism, they’re going Trump.
Maybe I will be just as wrong as I was about the election but it is my sense that with this Trump victory, authoritarian right politics have won the ideological battle for what will replace the neoliberal order in America. And yes, I think it will be ugly, mean, and harmful—because it already is.
·dropsitenews.com·
Shop Class as Soulcraft

Summary: Skilled manual labor entails a systematic encounter with the material world that can enrich one's intellectual and spiritual life. The degradation of work in both blue-collar and white-collar professions is driven not just by technological progress, but by the separation of thinking from doing according to the dictates of capital. To realize the full potential of human flourishing, we must reckon with the appeal of skilled manual work and question the assumptions that shape our educational priorities and notions of a good life.

an engineering culture has developed in recent years in which the object is to “hide the works,” rendering the artifacts we use unintelligible to direct inspection. Lift the hood on some cars now (especially German ones), and the engine appears a bit like the shimmering, featureless obelisk that so enthralled the cavemen in the opening scene of the movie 2001: A Space Odyssey. Essentially, there is another hood under the hood.
What ordinary people once made, they buy; and what they once fixed for themselves, they replace entirely or hire an expert to repair, whose expert fix often involves installing a pre-made replacement part.
So perhaps the time is ripe for reconsideration of an ideal that has fallen out of favor: manual competence, and the stance it entails toward the built, material world. Neither as workers nor as consumers are we much called upon to exercise such competence, most of us anyway, and merely to recommend its cultivation is to risk the scorn of those who take themselves to be the most hard-headed: the hard-headed economist will point out the opportunity costs of making what can be bought, and the hard-headed educator will say that it is irresponsible to educate the young for the trades, which are somehow identified as the jobs of the past.
It was an experience of agency and competence. The effects of my work were visible for all to see, so my competence was real for others as well; it had a social currency. The well-founded pride of the tradesman is far from the gratuitous “self-esteem” that educators would impart to students, as though by magic.
Skilled manual labor entails a systematic encounter with the material world, precisely the kind of encounter that gives rise to natural science. From its earliest practice, craft knowledge has entailed knowledge of the “ways” of one’s materials — that is, knowledge of their nature, acquired through disciplined perception and a systematic approach to problems.
Because craftsmanship refers to objective standards that do not issue from the self and its desires, it poses a challenge to the ethic of consumerism, as the sociologist Richard Sennett has recently argued. The craftsman is proud of what he has made, and cherishes it, while the consumer discards things that are perfectly serviceable in his restless pursuit of the new.
The central culprit in Braverman’s account is “scientific management,” which “enters the workplace not as the representative of science, but as the representative of management masquerading in the trappings of science.” The tenets of scientific management were given their first and frankest articulation by Frederick Winslow Taylor.
Scattered craft knowledge is concentrated in the hands of the employer, then doled out again to workers in the form of minute instructions needed to perform some part of what is now a work process. This process replaces what was previously an integral activity, rooted in craft tradition and experience, animated by the worker’s own mental image of, and intention toward, the finished product. Thus, according to Taylor, “All possible brain work should be removed from the shop and centered in the planning or lay-out department.” It is a mistake to suppose that the primary purpose of this partition is to render the work process more efficient. It may or may not result in extracting more value from a given unit of labor time. The concern is rather with labor cost. Once the cognitive aspects of the job are located in a separate management class, or better yet in a process that, once designed, requires no ongoing judgment or deliberation, skilled workers can be replaced with unskilled workers at a lower rate of pay.
the “jobs of the future” rhetoric surrounding the eagerness to end shop class and get every warm body into college, thence into a cubicle, implicitly assumes that we are heading to a “post-industrial” economy in which everyone will deal only in abstractions. Yet trafficking in abstractions is not the same as thinking. White collar professions, too, are subject to routinization and degradation, proceeding by the same process as befell manual fabrication a hundred years ago: the cognitive elements of the job are appropriated from professionals, instantiated in a system or process, and then handed back to a new class of workers — clerks — who replace the professionals. If genuine knowledge work is not growing but actually shrinking, because it is coming to be concentrated in an ever-smaller elite, this has implications for the vocational advice that students ought to receive.
The trades are then a natural home for anyone who would live by his own powers, free not only of deadening abstraction, but also of the insidious hopes and rising insecurities that seem to be endemic in our current economic life. This is the stoic ideal.
·thenewatlantis.com·
Berger’s Books
The cover immediately sets Ways of Seeing apart from its contemporaries: the book itself begins on the cover. Rather than creating a conventionally appealing cover, Hollis chose to bypass this tradition entirely, instead placing the text and an image from the start of the first chapter straight onto the front, just beneath the title and author’s name. This directness has a link with the television series, mimicking how the first episode began with no preamble or title sequence: Berger got started immediately, drawing the audience in with his message rather than any distractions.
Another link to Berger’s presenting style is Hollis’ choice of typeface: bold Univers 65 is used for the body copy throughout, in an attempt to achieve something of the captivating quality of Berger’s voice.
The layout also employs large indents rather than paragraph breaks, something of a Hollis trademark. But this mirrors how Berger had presented on television: there was little time wasted with atmospheric filler shots or long gaps in speech; the message was key and continuous.
The key reason that Ways of Seeing has become iconic as a piece of book design is how it dealt with text and image: the two are integrated, so where an image is mentioned in the text it also appears there. Captions are avoided where possible. When unavoidable, they are in a lighter weight of type and run horizontally, so as not to disrupt the text. Images are often set at the same width as the lines of text, or indented by the same amount, which democratises the text and image relationship. Occasionally works of art are cropped to show only the pertinent details. All of these features are a big departure from the art books of the time, which usually featured glorified full-page colour images, often in a glossy ‘colour plate’ section in the middle, completely distanced from where the text refers to them.
Design is not used for prettifying, or to create appeal, rather it is used for elucidating, to spread his message or get his point across as clearly as possible. Be it a point about art and politics, art and gender, the ethics of advertising, the human experiences of a rural GP, or economic migrants in Germany — the design is always appropriate to what Berger wants to say, but does so economically without redundancy.
Even in Portraits: John Berger on Artists published by Verso in 2015, Berger insisted on black and white reproductions, arguing that: “glossy colour reproductions in the consumerist world of today tend to reduce what they show to items in a luxury brochure for millionaires. Whereas black-and-white reproductions are simple memoranda.”
the images in the book “illustrate the essentially dialectical relationship between text and image in Berger’s work: the pattern in which an image shapes a text, which then goes on to shape how we understand that image.”
·theo-inglis.medium.com·
My Last Five Years of Work
Copywriting, tax preparation, customer service, and many other tasks are or will soon be heavily automated. I can see the beginnings in areas like software development and contract law. Generally, tasks that involve reading, analyzing, and synthesizing information, and then generating content based on it, seem ripe for replacement by language models.
Anyone who makes a living through delicate and varied movements guided by situation-specific know-how can expect to work for much longer than five more years. Thus, electricians, gardeners, plumbers, jewelry makers, hair stylists, as well as those who repair ironwork or make stained glass might find their handiwork contributing to our society for many more years to come.
Finally, I expect there to be jobs where humans are preferred to AIs even if the AIs can do the job equally well, or perhaps even if they can do it better. This will apply to jobs where something is gained from the very fact that a human is doing it—likely because it involves the consumer feeling like they have a relationship with the human worker as a human. Jobs that might fall into this category include counselors, doulas, caretakers for the elderly, babysitters, preschool teachers, priests and religious leaders, even sex workers—much has been made of AI girlfriends, but I still expect that a large percentage of buyers of in-person sexual services will have a strong preference for humans. Some have called these jobs “nostalgic jobs.”
It does seem that, overall, unemployment makes people sadder, sicker, and more anxious. But it isn’t clear if this is an inherent fact of unemployment, or a contingent one. It is difficult to isolate the pure psychological effects of being unemployed, because at present these are confounded with the financial effects—if you lose your job, you have less money—which produce stress that would not exist in the context of, say, universal basic income. It is also confounded with the “shame” aspect of being fired or laid off—of not working when you really feel you should be working—as opposed to the context where essentially all workers have been displaced.
One study that gets around the “shame” confounder of unemployment is “A Forced Vacation? The Stress of Being Temporarily Laid Off During a Pandemic” by Scott Schieman, Quan Mai, and Ryu Won Kang. This study looked at Canadian workers who were temporarily laid off several months into the COVID-19 pandemic. They first assumed that such a disruption would increase psychological distress, but instead found that the self-reported wellbeing was more in line with the “forced vacation hypothesis,” suggesting that temporarily laid-off workers might initially experience lower distress due to the unique circumstances of the pandemic.
By May 2020, the distress gap observed in April had vanished, indicating that being temporarily laid off was not associated with higher distress during these months. The interviews revealed that many workers viewed being left without work as a “forced vacation,” appreciating the break from work-related stress and valuing the time for self-care and family. The widespread nature of layoffs normalized the experience, reducing personal blame and fostering a sense of shared experience. Financial strain was mitigated by government support, personal savings, and reduced spending, which buffered against potential distress.
The study suggests that the context and available support systems can significantly alter the psychological outcomes of unemployment—which seems promising for AGI-induced unemployment.
From the studies on plant closures and pandemic layoffs, it seems that shame plays a role in making people unhappy after unemployment, which implies that they might be happier in full automation-induced unemployment, since it would be near-universal and not signify any personal failing.
A final piece that reveals a societal-psychological aspect to how much work is deemed necessary is that the amount has changed over time! The number of hours that people have worked has declined over the past 150 years. Work hours tend to decline as a country gets richer. It seems odd to assume that the current accepted amount of work of roughly 40 hours a week is the optimal amount. The 8-hour work day, weekends, time off—hard-fought and won by the labor movement!—seem to have been triumphs for human health and well-being. Why should we assume that stopping here is right? Why should we assume that less work was better in the past, but less work now would be worse?
Removing the shame that accompanies unemployment by removing the sense that one ought to be working seems one way to make people happier during unemployment. Another is what they do with their free time. Regardless of how one enters unemployment, one still confronts empty and often unstructured time.
One paper, titled “Having Too Little or Too Much Time Is Linked to Lower Subjective Well-Being” by Marissa A. Sharif, Cassie Mogilner, and Hal E. Hershfield tried to explore whether it was possible to have “too much” leisure time.
The paper concluded that it is possible to have too little discretionary time, but also possible to have too much, and that moderate amounts of discretionary time seemed best for subjective well-being. More time could be better, or at least not meaningfully worse, provided it was spent on “social” or “productive” leisure activities. This suggests that how people fare psychologically with their post-AGI unemployment will depend heavily on how they use their time, not how much of it there is.
Automation-induced unemployment could feel like retiring depending on how total it is. If essentially no one is working, and no one feels like they should be working, it might be more akin to retirement, in that it would lack the shameful element of feeling set apart from one’s peers.
Women provide another view on whether formal work is good for happiness. Women are, for the most part, relatively recent entrants to the formal labor market. In the U.S., 18% of women were in the formal labor force in 1890. In 2016, 57% were. Has labor force participation made them happier? By some accounts: no. A paper that looked at subjective well-being for U.S. women from the General Social Survey between the 1970s and 2000s—a time when labor force participation was climbing—found both relative and absolute declines in female happiness.
I think women’s work and AI is a relatively optimistic story. Women have been able to automate unpleasant tasks via technological advances, while the more meaningful aspects of their work seem less likely to be automated away.  When not participating in the formal labor market, women overwhelmingly fill their time with childcare and housework. The time needed to do housework has declined over time due to tools like washing machines, dryers, and dishwashers. These tools might serve as early analogous examples of the future effects of AI: reducing unwanted and burdensome work to free up time for other tasks deemed more necessary or enjoyable.
it seems less likely that AIs will so thoroughly automate childcare and child-rearing because this “work” is so much more about the relationship between the parties involved. Like therapy, childcare and teaching seems likely to be one of the forms of work where a preference for a human worker will persist the longest.
In the early modern era, landed gentry and similar were essentially unemployed. Perhaps they did some minor administration of their tenants, some dabbled in politics or were dragged into military projects, but compared to most formal workers they seem to have worked relatively few hours. They filled the remainder of their time with intricate social rituals like balls and parties, hobbies like hunting, studying literature, and philosophy, producing and consuming art, writing letters, and spending time with friends and family. We don’t have much real well-being survey data from this group, but, hedonically, they seem to have been fine. Perhaps they suffered from some ennui, but if we were informed that the great mass of humanity was going to enter their position, I don’t think people would be particularly worried.
I sometimes wonder if there is some implicit classism in people’s worries about unemployment: the rich will know how to use their time well, but the poor will need to be kept busy.
Although a trained therapist might be able to counsel my friends or family through their troubles better, I still do it, because there is value in me being the one to do so. We can think of this as the relational reason for doing something others can do better. I write because sometimes I enjoy it, and sometimes I think it betters me. I know others do so better, but I don’t care—at least not all the time. The reasons for this are part hedonic and part virtue or morality. A renowned AI researcher once told me that he is practicing for post-AGI by taking up activities that he is not particularly good at: jiu-jitsu, surfing, and so on, and savoring the doing even without excellence. This is how we can prepare for our future where we will have to do things from joy rather than need, where we will no longer be the best at them, but will still have to choose how to fill our days.
·palladiummag.com·
The Life and Death of Hollywood, by Daniel Bessner
now the streaming gold rush—the era that made Dickinson—is over. In the spring of 2022, the Federal Reserve began raising interest rates after years of nearly free credit, and at roughly the same time, Wall Street began calling in the streamers’ bets. The stock prices of nearly all the major companies with streaming platforms took precipitous falls, and none have rebounded to their prior valuation.
Thanks to decades of deregulation and a gush of speculative cash that first hit the industry in the late Aughts, while prestige TV was climbing the rungs of the culture, massive entertainment and media corporations had been swallowing what few smaller companies remained, and financial firms had been infiltrating the business, moving to reduce risk and maximize efficiency at all costs, exhausting writers in evermore unstable conditions.
The new effective bosses of the industry—colossal conglomerates, asset-management companies, and private-equity firms—had not been simply pushing workers too hard and grabbing more than their fair share of the profits. They had been stripping value from the production system like copper pipes from a house—threatening the sustainability of the studios themselves. Today’s business side does not have a necessary vested interest in “the business”—in the health of what we think of as Hollywood, a place and system in which creativity is exchanged for capital. The union wins did not begin to address this fundamental problem.
To the new bosses, the quantity of money that studios had been spending on developing screenplays—many of which would never be made—was obvious fat to be cut, and in the late Aughts, executives increasingly began offering one-step deals, guaranteeing only one round of pay for one round of work. Writers, hoping to make it past Go, began doing much more labor—multiple steps of development—for what was ostensibly one step of the process. In separate interviews, Dana Stevens, writer of The Woman King, and Robin Swicord described the change using exactly the same words: “Free work was encoded.” So was safe material. In an effort to anticipate what a studio would green-light, writers incorporated feedback from producers and junior executives, constructing what became known as producer’s drafts. As Rodman explained it: “Your producer says to you, ‘I love your script. It’s a great first draft. But I know what the studio wants. This isn’t it. So I need you to just make this protagonist more likable, and blah, blah, blah.’ And you do it.”
By 2019, the major Hollywood agencies had been consolidated into an oligopoly of four companies that controlled more than 75 percent of WGA writers’ earnings. And in the 2010s, high finance reached the agencies: by 2014, private equity had acquired Creative Artists Agency and William Morris Endeavor, and the latter had purchased IMG. Meeting benchmarks legible to the new bosses—deals actually made, projects off the ground—pushed agents to function more like producers, and writers began hearing that their asking prices were too high.
Executives, meanwhile, increasingly believed that they’d found their best bet in “IP”: preexisting intellectual property—familiar stories, characters, and products—that could be milled for scripts. As an associate producer of a successful Aughts IP-driven franchise told me, IP is “sort of a hedge.” There’s some knowledge of the consumer’s interest, he said. “There’s a sort of dry run for the story.” Screenwriter Zack Stentz, who co-wrote the 2011 movies Thor and X-Men: First Class, told me, “It’s a way to take risk out of the equation as much as possible.”
Multiple writers I spoke with said that selecting preexisting characters and cinematic worlds gave executives a type of psychic edge, allowing them to claim a degree of creative credit. And as IP took over, the perceived authority of writers diminished. Julie Bush, a writer-producer for the Apple TV+ limited series Manhunt, told me, “Executives get to feel like the author of the work, even though they have a screenwriter, like me, basically create a story out of whole cloth.” At the same time, the biggest IP success story, the Marvel Cinematic Universe, by far the highest-earning franchise of all time, pioneered a production apparatus in which writers were often separated from the conception and creation of a movie’s overall story.
Joanna Robinson, co-author of the book MCU: The Reign of Marvel Studios, told me that the writers for WandaVision, a Marvel show for Disney+, had to craft almost the entirety of the series’ single season without knowing where their work was ultimately supposed to arrive: the ending remained undetermined, because executives had not yet decided what other stories they might spin off from the show.
The streaming ecosystem was built on a wager: high subscriber numbers would translate to large market shares, and eventually, profit. Under this strategy, an enormous amount of money could be spent on shows that might or might not work: more shows meant more opportunities to catch new subscribers. Producers and writers for streamers were able to put ratings aside, which at first seemed to be a luxury. Netflix paid writers large fees up front, and guaranteed that an entire season of a show would be produced. By the mid-2010s, the sheer quantity of series across the new platforms—what’s known as “Peak TV”—opened opportunities for unusually offbeat projects (see BoJack Horseman, a cartoon for adults about an equine has-been sitcom star), and substantially more shows created by women and writers of color. In 2009, across cable, broadcast, and streaming, 189 original scripted shows aired or released new episodes; in 2016, that number was 496. In 2022, it was 849.
supply soon overshot demand. For those who beat out the competition, the work became much less steady than it had been in the pre-streaming era. According to insiders, in the past, writers for a series had usually been employed for around eight months, crafting long seasons and staying on board through a show’s production. Junior writers often went to the sets where their shows were made and learned how to take a story from the page to the screen—how to talk to actors, how to stay within budget, how to take a studio’s notes—setting them up to become showrunners. Now, in an innovation called mini-rooms, reportedly first ventured by cable channels such as AMC and Starz, fewer writers were employed for each series and for much shorter periods—usually eight to ten weeks but as little as four.
Writers in the new mini-room system were often dismissed before their series went to production, which meant that they rarely got the opportunity to go to set and weren’t getting the skills they needed to advance. Showrunners were left responsible for all writing-related tasks when these rooms shut down. “It broke a lot of showrunners,” the A-list film and TV writer told me. “Physically, mentally, financially. It also ruined a lot of shows.”
The price of entry for working in Hollywood had been high for a long time: unpaid internships, low-paid assistant jobs. But now the path beyond the entry level was increasingly unclear. Jason Grote, who was a staff writer on Mad Men and who came to TV from playwriting, told me, “It became like a hobby for people, or something more like theater—you had your other day jobs or you had a trust fund.” Brenden Gallagher, a TV writer a decade in, said, “There are periods of time where I work at the Apple Store. I’ve worked doing data entry, I’ve worked doing research, I’ve worked doing copywriting.” Since he’d started in the business in 2014, in his mid-twenties, he’d never had more than eight months at a time when he didn’t need a source of income from outside the industry.
“There was this feeling,” the head of the midsize studio told me that day at Soho House, “during the last ten years or so, of, ‘Oh, we need to get more people of color in writers’ rooms.’ ” But what you get now, he said, is the black or Latino person who went to Harvard. “They’re getting the shot, but you don’t actually see a widening of the aperture to include people who grew up poor, maybe went to a state school or not even, and are just really talented. That has not happened at all.”
“The Sopranos does not exist without David Chase having worked in television for almost thirty years,” Blake Masters, a writer-producer and creator of the Showtime series Brotherhood, told me. “Because The Sopranos really could not be written by somebody unless they understood everything about television, and hated all of it.” Grote said much the same thing: “Prestige TV wasn’t new blood coming into Hollywood as much as it was a lot of veterans that were never able to tell these types of stories, who were suddenly able to cut through.”
The threshold for receiving the viewership-based streaming residuals is also incredibly high: a show must be viewed by at least 20 percent of a platform’s domestic subscribers “in the first 90 days of release, or in the first 90 days in any subsequent exhibition year.” As Bloomberg reported in November, fewer than 5 percent of the original shows that streamed on Netflix in 2022 would have met this benchmark. “I am not impressed,” the A-list writer told me in January. Entry-level TV staffing, where more and more writers are getting stuck, “is still a subsistence-level job,” he said. “It’s a job for rich kids.”
Brenden Gallagher, who echoed Conover’s belief that the union was well-positioned to gain more in 2026, put it this way: “My view is that there was a lot of wishful thinking about achieving this new middle class, based around, to paraphrase 30 Rock, making it 1997 again through science or magic. Will there be as big a working television-writer cohort that is making six figures a year consistently living in Los Angeles as there was from 1992 to 2021? No. That’s never going to come back.”
As for what types of TV and movies can get made by those who stick around, Kelvin Yu, creator and showrunner of the Disney+ series American Born Chinese, told me: “I think that there will be an industry move to the middle in terms of safer, four-quadrant TV.” (In L.A., a “four-quadrant” project is one that aims to appeal to all demographics.) “I think a lot of people,” he said, “who were disenfranchised or marginalized—their drink tickets are up.” Indeed, multiple writers and executives told me that following the strike, studio choices have skewed even more conservative than before. “It seems like buyers are much less adventurous,” one writer said. “Buyers are looking for Friends.”
The film and TV industry is now controlled by only four major companies, and it is shot through with incentives to devalue the actual production of film and television.
The entertainment and finance industries spend enormous sums lobbying both parties to maintain deregulation and prioritize the private sector. Writers will have to fight the studios again, but for more sweeping reforms. One change in particular has the potential to flip the power structure of the industry on its head: writers could demand to own complete copyright for the stories they create. They currently have something called “separated rights,” which allow a writer to use a script and its characters for limited purposes. But if they were to retain complete copyright, they would have vastly more leverage. Nearly every writer I spoke with seemed to believe that this would present a conflict with the way the union functions. This point is complicated and debatable, but Shawna Kidman and the legal expert Catherine Fisk—both preeminent scholars of copyright and media—told me that the greater challenge is Hollywood’s structure. The business is currently built around studio ownership. While Kidman found the idea of writer ownership infeasible, Fisk said it was possible, though it would be extremely difficult. Pushing for copyright would essentially mean going to war with the studios. But if things continue on their current path, writers may have to weigh such hazards against the prospect of the end of their profession. Or, they could leave it all behind.
·harpers.org·
Western Music Isn't What You Think
Western culture and music have been heavily influenced by outside, non-Western sources, contrary to common perceptions. The author argues that diversity and cross-cultural exchange are key strengths of Western culture.
·honest-broker.com·
Looking for AI use-cases — Benedict Evans
  • LLMs have impressive capabilities, but many people struggle to find immediate use-cases that match their own needs and workflows.
  • Realizing the potential of LLMs requires not just technical advancements, but also identifying specific problems that can be automated and building dedicated applications around them.
  • The adoption of new technologies often follows a pattern of initially trying to fit them into existing workflows, before eventually changing workflows to better leverage the new tools.
if you had shown VisiCalc to a lawyer or a graphic designer, their response might well have been ‘that’s amazing, and maybe my book-keeper should see this, but I don’t do that’. Lawyers needed a word processor, and graphic designers needed (say) Postscript, Pagemaker and Photoshop, and that took longer.
I’ve been thinking about this problem a lot in the last 18 months, as I’ve experimented with ChatGPT, Gemini, Claude and all the other chatbots that have sprouted up: ‘this is amazing, but I don’t have that use-case’.
A spreadsheet can’t do word processing or graphic design, and a PC can do all of those but someone needs to write those applications for you first, one use-case at a time.
no matter how good the tech is, you have to think of the use-case. You have to see it. You have to notice something you spend a lot of time doing and realise that it could be automated with a tool like this.
Some of this is about imagination, and familiarity. It reminds me a little of the early days of Google, when we were so used to hand-crafting our solutions to problems that it took time to realise that you could ‘just Google that’.
This is also, perhaps, matching a classic pattern for the adoption of new technology: you start by making it fit the things you already do, where it’s easy and obvious to see that this is a use-case, if you have one, and then later, over time, you change the way you work to fit the new tool.
The concept of product-market fit is that normally you have to iterate your idea of the product and your idea of the use-case and customer towards each other - and then you need sales.
Meanwhile, spreadsheets were both a use-case for a PC and a general-purpose substrate in their own right, just as email or SQL might be, and yet all of those have been unbundled. The typical big company today uses hundreds of different SaaS apps, all of them, so to speak, unbundling something out of Excel, Oracle or Outlook. All of them, at their core, are an idea for a problem and an idea for a workflow to solve that problem that is easier to grasp and deploy than saying ‘you could do that in Excel!’ Rather, you instantiate the problem and the solution in software - ‘wrap it’, indeed - and sell that to a CIO. You sell them a problem.
there’s a ‘Cambrian Explosion’ of startups using OpenAI or Anthropic APIs to build single-purpose dedicated apps that aim at one problem and wrap it in hand-built UI, tooling and enterprise sales, much as a previous generation did with SQL.
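To make that “wrapping” concrete, here is a minimal sketch of the pattern described above, assuming the OpenAI Python SDK and an OPENAI_API_KEY in the environment; the use-case, prompt, and function name are illustrative assumptions, not taken from the essay.

```python
# A minimal sketch of "wrapping" an LLM API in a single-purpose app:
# one problem, one hard-coded workflow, instead of a general-purpose chatbot.
# Assumes the OpenAI Python SDK (pip install openai) and OPENAI_API_KEY set;
# the use-case, prompt, and function name are illustrative, not from the essay.
from openai import OpenAI

client = OpenAI()

def flag_risky_clauses(contract_text: str) -> str:
    """Hypothetical single-purpose tool: review a contract and list risky clauses."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "You review contracts and list clauses that shift risk onto the client."},
            {"role": "user", "content": contract_text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(flag_risky_clauses("The supplier may terminate this agreement at any time without notice."))
```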
Back in 1982, my father had one (1) electric drill, but since then tool companies have turned that into a whole constellation of battery-powered electric hole-makers. Once upon a time every startup had SQL inside, but that wasn’t the product, and now every startup will have LLMs inside.
people are still creating companies based on realising that X or Y is a problem, realising that it can be turned into pattern recognition, and then going out and selling that problem.
A GUI tells the users what they can do, but it also tells the computer everything we already know about the problem, and with a general-purpose, open-ended prompt, the user has to think of all of that themselves, every single time, or hope it’s already in the training data. So, can the GUI itself be generative? Or do we need another whole generation of Dan Bricklins to see the problem, and then turn it into apps, thousands of them, one at a time, each of them with some LLM somewhere under the hood?
The change would be that these new use-cases would be things that are still automated one-at-a-time, but that could not have been automated before, or that would have needed far more software (and capital) to automate. That would make LLMs the new SQL, not the new HAL9000.
·ben-evans.com·
How McKinsey Destroyed the Middle Class - The Atlantic

The rise of management consulting firms like McKinsey played a pivotal role in disempowering the American middle class by promoting corporate restructuring that concentrated power and wealth in the hands of elite managers while stripping middle managers and workers of their decision-making roles, job security, and opportunities for career advancement.

Key topics:

  • Management consulting's role in reshaping corporate America
  • The decline of the middle class and the rise of corporate elitism
  • McKinsey's influence on corporate restructuring and inequality
  • The shift from lifetime employment to precarious jobs
  • The erosion of corporate social responsibility
  • The role of management consulting in perpetuating economic inequality
what consequences has the rise of management consulting had for the organization of American business and the lives of American workers? The answers to these questions put management consultants at the epicenter of economic inequality and the destruction of the American middle class.
Managers do not produce goods or deliver services. Instead, they plan what goods and services a company will provide, and they coordinate the production workers who make the output. Because complex goods and services require much planning and coordination, management (even though it is only indirectly productive) adds a great deal of value. And managers as a class capture much of this value as pay. This makes the question of who gets to be a manager extremely consequential.
In the middle of the last century, management saturated American corporations. Every worker, from the CEO down to production personnel, served partly as a manager, participating in planning and coordination along an unbroken continuum in which each job closely resembled its nearest neighbor.
Even production workers became, on account of lifetime employment and workplace training, functionally the lowest-level managers. They were charged with planning and coordinating the development of their own skills to serve the long-run interests of their employers.
At McDonald’s, Ed Rensi worked his way up from flipping burgers in the 1960s to become CEO. More broadly, a 1952 report by Fortune magazine found that two-thirds of senior executives had more than 20 years’ service at their current companies.
Top executives enjoyed commensurately less control and captured lower incomes. This democratic approach to management compressed the distribution of income and status. In fact, a mid-century study of General Motors published in the Harvard Business Review—completed, in a portent of what was to come, by McKinsey’s Arch Patton—found that from 1939 to 1950, hourly workers’ wages rose roughly three times faster than elite executives’ pay. The management function’s wide diffusion throughout the workforce substantially built the mid-century middle class.
The earliest consultants were engineers who advised factory owners on measuring and improving efficiency at the complex factories required for industrial production. The then-leading firm, Booz Allen, did not achieve annual revenues of $2 million until after the Second World War. McKinsey, which didn’t hire its first Harvard M.B.A. until 1953, retained a diffident and traditional ethos
A new ideal of shareholder primacy, powerfully championed by Milton Friedman in a 1970 New York Times Magazine article entitled “The Social Responsibility of Business is to Increase its Profits,” gave the newly ambitious management consultants a guiding purpose. According to this ideal, in language eventually adopted by the Business Roundtable, “the paramount duty of management and of boards of directors is to the corporation’s stockholders.” During the 1970s, and accelerating into the ’80s and ’90s, the upgraded management consultants pursued this duty by expressly and relentlessly taking aim at the middle managers who had dominated mid-century firms, and whose wages weighed down the bottom line.
Management consultants thus implemented and rationalized a transformation in the American corporation. Companies that had long affirmed express “no layoff” policies now took aim at what the corporate raider Carl Icahn, writing in The New York Times in the late 1980s, called “corporate bureaucracies” run by “incompetent” and “inbred” middle managers. They downsized in response not to particular business problems but rather to a new managerial ethos and methods; they downsized when profitable as well as when struggling, and during booms as well as busts.
Downsizing was indeed wrenching. When IBM abandoned lifetime employment in the 1990s, local officials asked gun-shop owners around its headquarters to close their stores while employees absorbed the shock.
In some cases, downsized employees have been hired back as subcontractors, with no long-term claim on the companies and no role in running them. When IBM laid off masses of workers in the 1990s, for example, it hired back one in five as consultants. Other corporations were built from scratch on a subcontracting model. The clothing brand United Colors of Benetton has only 1,500 employees but uses 25,000 workers through subcontractors.
Shift from lifetime employment to reliance on outsourced labor; decline in unions
The shift from permanent to precarious jobs continues apace. Buttigieg’s work at McKinsey included an engagement for Blue Cross Blue Shield of Michigan, during a period when it considered cutting up to 1,000 jobs (or 10 percent of its workforce). And the gig economy is just a high-tech generalization of the sub-contractor model. Uber is a more extreme Benetton; it deprives drivers of any role in planning and coordination, and it has literally no corporate hierarchy through which drivers can rise up to join management.
In effect, management consulting is a tool that allows corporations to replace lifetime employees with short-term, part-time, and even subcontracted workers, hired under ever more tightly controlled arrangements, who sell particular skills and even specified outputs, and who manage nothing at all.
the managerial control stripped from middle managers and production workers has been concentrated in a narrow cadre of executives who monopolize planning and coordination. Mid-century, democratic management empowered ordinary workers and disempowered elite executives, so that a bad CEO could do little to harm a company and a good one little to help it.
Whereas at mid-century a typical large-company CEO made 20 times a production worker’s income, today’s CEOs make nearly 300 times as much. In a recent year, the five highest-paid employees of the S&P 1500 (7,500 elite executives overall) obtained income equal to about 10 percent of the total profits of the entire S&P 1500.
as Kiechel put it dryly, “we are not all in this together; some pigs are smarter than other pigs and deserve more money.” Consultants seek, in this way, to legitimate both the job cuts and the explosion of elite pay. Properly understood, the corporate reorganizations were, then, not merely technocratic but ideological.
corporate reorganizations have deprived companies of an internal supply of managerial workers. When restructurings eradicated workplace training and purged the middle rungs of the corporate ladder, they also forced companies to look beyond their walls for managerial talent—to elite colleges, business schools, and (of course) to management-consulting firms. That is to say: The administrative techniques that management consultants invented created a huge demand for precisely the services that the consultants supply.
Consulting, like law school, is an all-purpose status giver—“low in risk and high in reward,” according to the Harvard Crimson. McKinsey also hopes that its meritocratic excellence will legitimate its activities in the eyes of the broader world. Management consulting, Kiechel observed, acquired its power and authority not from “silver-haired industry experience but rather from the brilliance of its ideas and the obvious candlepower of the people explaining them, even if those people were twenty-eight years old.”
A deeper objection to Buttigieg’s association with McKinsey concerns not whom the firm represents but the central role the consulting revolution has played in fueling the enormous economic inequalities that now threaten to turn the United States into a caste society.
Meritocrats like Buttigieg changed not just corporate strategies but also corporate values.
GM may aspire to build good cars; IBM, to make typewriters, computers, and other business machines; and AT&T, to improve communications. Executives who rose up through these companies, on the mid-century model, were embedded in their firms and embraced these values, so that they might even have come to view profits as a salutary side effect of running their businesses well.
When management consulting untethered executives from particular industries or firms and tied them instead to management in general, it also led them to embrace the one thing common to all corporations: making money for shareholders. Executives raised on the new, untethered model of management aim exclusively and directly at profit: their education, their career arc, and their professional role conspire to isolate them from other workers and train them single-mindedly on the bottom line.
American democracy, the left believes, cannot be rejuvenated by persuading elites to deploy their excessive power somehow more benevolently. Instead, it requires breaking the stranglehold that elites have on our economics and politics, and reempowering everyone else.
·archive.is·
How McKinsey Destroyed the Middle Class - The Atlantic
From Tech Critique to Ways of Living — The New Atlantis
From Tech Critique to Ways of Living — The New Atlantis
Yuk Hui's concept of "cosmotechnics" combines technology with morality and cosmology. Inspired by Daoism, it envisions a world where advanced tech exists but cultures favor simpler, purposeful tools that guide people towards contentment by focusing on local, relational, and ironic elements. A Daoist cosmotechnics points to alternative practices and priorities - learning how to live from nature rather than treating it as a resource to be exploited, valuing embodied relation over abstract information
We might think of the shifting relationship of human beings to the natural world in the terms offered by German sociologist Gerd-Günter Voß, who has traced our movement through three different models of the “conduct of life.”
The first, and for much of human history the only conduct of life, is what he calls the traditional. Your actions within the traditional conduct of life proceed from social and familial circumstances, from what is thus handed down to you. In such a world it is reasonable for family names to be associated with trades, trades that will be passed down from father to son: Smith, Carpenter, Miller.
But the rise of the various forces that we call “modernity” led to the emergence of the strategic conduct of life: a life with a plan, with certain goals — to get into law school, to become a cosmetologist, to get a corner office.
thanks largely to totalizing technology’s formation of a world in which, to borrow a phrase from Marx and Engels, “all that is solid melts into air,” the strategic model of conduct is replaced by the situational. Instead of being systematic planners, we become agile improvisers: If the job market is bad for your college major, you turn a side hustle into a business. But because you know that your business may get disrupted by the tech industry, you don’t bother thinking long-term; your current gig might disappear at any time, but another will surely present itself, which you will assess upon its arrival.
The movement through these three forms of conduct, whatever benefits it might have, makes our relations with nature increasingly instrumental. We can see this shift more clearly when looking at our changing experience of time
Within the traditional conduct of life, it is necessary to take stewardly care of the resources required for the exercise of a craft or a profession, as these get passed on from generation to generation.
But in the progression from the traditional to the strategic to the situational conduct of life, continuity of preservation becomes less valuable than immediacy of appropriation: We need more lithium today, and merely hope to find greater reserves — or a suitable replacement — tomorrow. This revaluation has the effect of shifting the place of the natural order from something intrinsic to our practices to something extrinsic. The whole of nature becomes what economists tellingly call an externality.
The basic argument of the SCT goes like this. We live in a technopoly, a society in which powerful technologies come to dominate the people they are supposed to serve, and reshape us in their image. These technologies, therefore, might be called prescriptive (to use Franklin’s term) or manipulatory (to use Illich’s). For example, social networks promise to forge connections — but they also encourage mob rule.
all things increasingly present themselves to us as technological: we see them and treat them as what Heidegger calls a “standing reserve,” supplies in a storeroom, as it were, pieces of inventory to be ordered and conscripted, assembled and disassembled, set up and set aside
In his exceptionally ambitious book The Question Concerning Technology in China (2016) and in a series of related essays and interviews, Hui argues, as the title of his book suggests, that we go wrong when we assume that there is one question concerning technology, the question, that is universal in scope and uniform in shape. Perhaps the questions are different in Hong Kong than in the Black Forest. Similarly, the distinction Heidegger draws between ancient and modern technology — where with modern technology everything becomes a mere resource — may not universally hold.
Thesis: Technology is an anthropological universal, understood as an exteriorization of memory and the liberation of organs, as some anthropologists and philosophers of technology have formulated it; Antithesis: Technology is not anthropologically universal; it is enabled and constrained by particular cosmologies, which go beyond mere functionality or utility. Therefore, there is no one single technology, but rather multiple cosmotechnics.
Cosmotechnics is the integration of a culture's worldview and ethical framework with its technological practices, illustrating that technology is not just about functionality but also embodies a way of life realized through making.
I think Hui’s cosmotechnics, generously leavened with the ironic humor intrinsic to Daoism, provides a genuine Way — pun intended — beyond the limitations of the Standard Critique of Technology. I say this even though I am not a Daoist; I am, rather, a Christian. But it should be noted that Daoism is both daojiao, an organized religion, and daojia, a philosophical tradition. It is daojia that Hui advocates, which makes the wisdom of Daoism accessible and attractive to a Christian like me. Indeed, I believe that elements of daojia are profoundly consonant with Christianity, and yet underdeveloped in the Christian tradition, except in certain modes of Franciscan spirituality, for reasons too complex to get into here.
this technological Daoism as an embodiment of daojia, is accessible to people of any religious tradition or none. It provides a comprehensive and positive account of the world and one’s place in it that makes a different approach to technology more plausible and compelling. The SCT tends only to gesture in the direction of a model of human flourishing, evokes it mainly by implication, whereas Yuk Hui’s Daoist model gives an explicit and quite beautiful account.
The application of Daoist principles is most obvious, as the above exposition suggests, for “users” who would like to graduate to the status of “non-users”: those who quietly turn their attention to more holistic and convivial technologies, or who simply sit or walk contemplatively. But in the interview I quoted from earlier, Hui says, “Some have quipped that what I am speaking about is Daoist robots or organic AI” — and this needs to be more than a quip. Peter Thiel’s longstanding attempt to make everyone a disciple of René Girard is a dead end. What we need is a Daoist culture of coders, and people devoted to “action without acting” making decisions about lithium mining.
Tools that do not contribute to the Way will neither be worshipped nor despised. They will simply be left to gather dust as the people choose the tools that will guide them in the path of contentment and joy: utensils to cook food, devices to make clothes. Of course, the food of one village will differ from that of another, as will the clothing. Those who follow the Way will dwell among the “ten thousand things” of this world — what we call nature — in a certain manner that cannot be specified legally: Verse 18 of the Tao says that when virtue arises only from rules, that is a sure sign that the Way is not present and active. A cosmotechnics is a living thing, always local in the specifics of its emergence in ways that cannot be specified in advance.
It is from the ten thousand things that we learn how to live among the ten thousand things; and our choice of tools will be guided by what we have learned from that prior and foundational set of relations. This is cosmotechnics.
Multiplicity avoids the universalizing, totalizing character of technopoly. The adherents of technopoly, Hui writes, “wishfully believ[e] that the world process will stamp out differences and diversities” and thereby achieve a kind of techno-secular “theodicy,” a justification of the ways of technopoly to its human subjects. But the idea of multiple cosmotechnics is also necessary, Hui believes, in order to avoid the simply delusional attempt to find “a way out of modernity” by focusing on the indigenous or biological “Other.” An aggressive hostility to modernity and a fetishizing of pre-modernity is not the Daoist way.
“I believe that to overcome modernity without falling back into war and fascism, it is necessary to reappropriate modern technology through the renewed framework of a cosmotechnics.” His project “doesn’t refuse modern technology, but rather looks into the possibility of different technological futures.”
“Thinking rooted in the earthy virtue of place is the motor of cosmotechnics. However, for me, this discourse on locality doesn’t mean a refusal of change and of progress, or any kind of homecoming or return to traditionalism; rather, it aims at a re-appropriation of technology from the perspective of the local and a new understanding of history.”
Always Coming Home illustrates cosmotechnics in a hundred ways. Consider, for instance, information storage and retrieval. At one point we meet the archivist of the Library of the Madrone Lodge in the village of Wakwaha-na. A visitor from our world is horrified to learn that while the library gives certain texts and recordings to the City of Mind, some of their documents they simply destroy. “But that’s the point of information storage and retrieval systems! The material is kept for anyone who wants or needs it. Information is passed on — the central act of human culture.” But that is not how the librarian thinks about it. “Tangible or intangible, either you keep a thing or you give it. We find it safer to give it” — to practice “unhoarding.”
It is not information, but relation. This too is cosmotechnics.
The modern technological view treats information as a resource to be stored and optimized. But the archivist in Le Guin's Daoist-inspired society takes a different approach, one where documents can be freely discarded because what matters is not the hoarding of information but the living of life in sustainable relation
a cosmotechnics is the point at which a way of life is realized through making. The point may be illustrated with reference to an ancient tale Hui offers, about an excellent butcher who explains to a duke what he calls the Dao, or “way,” of butchering. The reason he is a good butcher, he says, is not his mastery of a skill, or his reliance on superior tools. He is a good butcher because he understands the Dao: Through experience he has come to rely on his intuition to thrust the knife precisely where it does not cut through tendons or bones, and so his knife always stays sharp. The duke replies: “Now I know how to live.” Hui explains that “it is thus the question of ‘living,’ rather than that of technics, that is at the center of the story.”
·thenewatlantis.com·
From Tech Critique to Ways of Living — The New Atlantis
Fandom's Great Divide
Fandom's Great Divide
The 1970s sitcom "All in the Family" sparked debates with its bigoted-yet-lovable Archie Bunker character, leaving audiences divided over whether the show was satirizing prejudice or inadvertently promoting it, and reflecting TV's power to shape societal attitudes.
This sort of audience divide, not between those who love a show and those who hate it but between those who love it in very different ways, has become a familiar schism in the past fifteen years, during the rise of—oh, God, that phrase again—Golden Age television. This is particularly true of the much lauded stream of cable “dark dramas,” whose protagonists shimmer between the repulsive and the magnetic. As anyone who has ever read the comments on a recap can tell you, there has always been a less ambivalent way of regarding an antihero: as a hero
a subset of viewers cheered for Walter White on “Breaking Bad,” growling threats at anyone who nagged him to stop selling meth. In a blog post about that brilliant series, I labelled these viewers “bad fans,” and the responses I got made me feel as if I’d poured a bucket of oil onto a flame war from the parapets of my snobby critical castle. Truthfully, my haters had a point: who wants to hear that they’re watching something wrong?
·newyorker.com·
Fandom's Great Divide
Competition is overrated - cdixon
Competition is overrated - cdixon
That other people tried your idea without success could imply it’s a bad idea or simply that the timing or execution was wrong. Distinguishing between these cases is hard and where you should apply serious thought. If you think your competitors executed poorly, you should develop a theory of what they did wrong and how you’ll do better.
If you think your competitor’s timing was off, you should have a thesis about what’s changed to make now the right time. These changes could come in a variety of forms: for example, it could be that users have become more sophisticated, the prices of key inputs have dropped, or that prerequisite technologies have become widely adopted.
Startups are primarily competing against indifference, lack of awareness, and lack of understanding — not other startups.
There were probably 50 companies that tried to do viral video sharing before YouTube. Before 2005, when YouTube was founded, relatively few users had broadband and video cameras. YouTube also took advantage of the latest version of Flash that could play videos seamlessly.
Google and Facebook launched long after their competitors, but executed incredibly well and focused on the right things. When Google launched, other search engines like Yahoo, Excite, and Lycos were focused on becoming multipurpose “portals” and had de-prioritized search (Yahoo even outsourced their search technology).
·cdixon.org·
Competition is overrated - cdixon
Muse retrospective by Adam Wiggins
Muse retrospective by Adam Wiggins
  • Wiggins focused on storytelling and brand-building for Muse, achieving early success with an email newsletter, which helped engage potential users and refine the product's value proposition.
  • Muse aspired to a "small giants" business model, emphasizing quality, autonomy, and a healthy work environment over rapid growth. They sought to avoid additional funding rounds by charging a prosumer price early on.
  • Short demo videos on Twitter showcasing the app in action proved to be the most effective method for attracting new users.
Muse as a brand and a product represented something aspirational. People want to be deeper thinkers, to be more strategic, and to use cool, status-quo challenging software made by small passionate teams. These kinds of aspirations are easier to indulge in times of plenty. But once you're getting laid off from your high-paying tech job, or struggling to raise your next financing round, or scrambling to protect your kids' college fund from runaway inflation and uncertain markets... I guess you don't have time to be excited about cool demos on Twitter and thoughtful podcasts on product design.
I’d speculate that another factor is the half-life of cool new productivity software. Evernote, Slack, Notion, Roam, Craft, and many others seem to get pretty far on community excitement for their first few years. After that, I think you have to be left with software that serves a deep and hard-to-replace purpose in people’s lives. Muse got there for a few thousand people, but the economics of prosumer software means that just isn’t enough. You need tens of thousands, hundreds of thousands, to make the cost of development sustainable.
We envisioned Muse as the perfect combination of the freeform elements of a whiteboard, the structured text-heavy style of Notion or Google Docs, and the sense of place you get from a “virtual office” ala group chat. As a way to asynchronously trade ideas and inspiration, sketch out project ideas, and explore possibilities, the multiplayer Muse experience is, in my honest opinion, unparalleled for small creative teams working remotely.
But friction began almost immediately. The team lead or organizer was usually the one bringing Muse to the team, and they were already a fan of its approach. But the other team members are generally a little annoyed to have to learn any new tool, and Muse’s steeper learning curve only made that worse. Those team members would push the problem back to the team lead, treating them as customer support (rather than contacting us directly for help). The team lead often felt like too much of the burden of pushing Muse adoption was on their shoulders. This was in addition to the obvious product gaps, like: no support for the web or Windows; minimal or no integration with other key tools like Notion and Google Docs; and no permissions or support for multiple workspaces. Had we raised $10M back during the cash party of 2020–2021, we could have hired the 15+ person team that would have been necessary to build all of that. But with only seven people (we had added two more people to the team in 2021–2022), it just wasn’t feasible.
We neither focused on a particular vertical (academics, designers, authors...) nor a narrow use case (PDF reading/annotation, collaborative whiteboarding, design sketching...). That meant we were always spread pretty thin in terms of feature development, and marketing was difficult even over and above the problem of explaining canvas software and digital thinking tools.
being general-purpose was in its blood from birth. Part of it was maker's hubris: don't we always dream of general-purpose tools that will be everything to everyone? And part of it was that it's truly the case that Muse excels at the ability to combine together so many different related knowledge tasks and media types into a single, minimal, powerful canvas. Not sure what I would do differently here, even with the benefit of hindsight.
Muse built a lot of its reputation on being principled, but we were maybe too cautious to do the mercenary things that help you succeed. A good example here is asking users for ratings; I felt like this was not to user benefit and distracting when the user is trying to use your app. Our App Store rating was on the low side (~3.9 stars) for most of our existence. When we finally added the standard prompt-for-rating dialog, it instantly shot up to ~4.7 stars. This was a small example of being too principled about doing good for the user, and not thinking about what would benefit our business.
Growing the team slowly was a delight. At several previous ventures, I've onboarded people in the hiring-is-job-one environment of a growth startup. At Muse, we started with three founders and then hired roughly one person per year. This was absolutely fantastic for being able to really take our time to find the perfect person for the role, and then for that person to have tons of time to onboard and find their footing on the team before anyone new showed up. The resulting team was the best I've ever worked on, with minimal deadweight or emotional baggage.
ultimately your product does have to have some web presence. My biggest regret is not building a simple share-to-web function early on, which could have created some virality and a great deal of utility for users as well.
In terms of development speed, quality of the resulting product, hardware integration, and a million other things: native app development wins.
After decades working in product development, being on the marketing/brand/growth/storytelling side was a huge personal challenge for me. But I feel like I managed to grow into the role and find my own approach (podcasting, demo videos, etc) to create a beacon to attract potential customers to our product.
when it comes time for an individual or a team to sit down and sketch out the beginnings of a new business, a new book, a new piece of art—this almost never happens at a computer. Or if it does, it’s a cobbled-together collection of tools like Google Docs and Zoom which aren’t really made for this critical part of the creative lifecycle.
any given business will find a small number of highly-effective channels, and the rest don't matter. For Heroku, that was attending developer conferences and getting blog posts on Hacker News. For another business it might be YouTube influencer sponsorships and print ads in a niche magazine. So I set about systematically testing many channels.
·adamwiggins.com·
Muse retrospective by Adam Wiggins
Strong and weak technologies - cdixon
Strong and weak technologies - cdixon
Strong technologies capture the imaginations of technology enthusiasts. That is why many important technologies start out as weekend hobbies. Enthusiasts vote with their time, and, unlike most of the business world, have long-term horizons. They build from first principles, making full use of the available resources to design technologies as they ought to exist.
·cdixon.org·
Strong and weak technologies - cdixon
Tools for Thought as Cultural Practices, not Computational Objects
Tools for Thought as Cultural Practices, not Computational Objects
Summary: Throughout human history, innovations like written language, drawing, maps, the scientific method, and data visualization have profoundly expanded the kinds of thoughts humans can think. Most of these "tools for thought" significantly predate digital computers. The modern usage of the phrase is heavily influenced by the work of computer scientists and technologists in the 20th century who envisioned how computers could become tools to extend human reasoning and help solve complex problems. While computers are powerful "meta-mediums", the current focus on building note-taking apps is quite narrow. To truly expand human cognition, we should explore a wider range of tools and practices, both digital and non-digital.
Taken at face value, the phrase tool for thought doesn't have the word 'computer' or 'digital' anywhere in it. It suggests nothing about software systems or interfaces. It's simply meant to refer to tools that help humans think thoughts; potentially new, different, and better kinds of thoughts than we currently think.
Most of the examples I listed above are cultural practices and techniques. They are primarily ways of doing; specific ways of thinking and acting that result in greater cognitive abilities. Ones that people pass down from generation to generation through culture. Every one of these also pre-dates digital computers by at least a few hundred years, if not thousands or tens of thousands. Given that framing, it's time to return to the question of how computation, software objects, and note-taking apps fit into this narrative.
If you look around at the commonly cited “major thinkers” in this space, you get a list of computer programmers: Kenneth Iverson, J.C.R. Licklider, Vannevar Bush, Alan Kay, Bob Taylor, Douglas Engelbart, Seymour Papert, Bret Victor, and Howard Rheingold, among others.
This is relevant because it means these men share a lot of the same beliefs, values, and context. They know the same sorts of people, learned the same historical stories in school and were taught to see the world in particular kinds of ways. Most of them worked together, or are at most one personal connection away from the next. Tools for thought is a community scene as much as it's a concept. This gives tools for thought a distinctly computer-oriented, male, American, middle-class flavour. The term has always been used in relation to a dream that is deeply intertwined with digital machines, white-collar knowledge work, and bold American optimism.
Engelbart was specifically concerned with our ability to deal with complex problems, rather than simply “amplifying intelligence.” Being able to win a chess match is perceived as intelligent, but it isn't helping us tackle systemic racism or inequality. Engelbart argued we should instead focus on “augmenting human intellect” in ways that help us find solutions to wicked problems. While he painted visions of how computers could facilitate this, he also pointed to organisational structures, system dynamics, and effective training as part of this puzzle.
There is a rich literature of research and insight into how we might expand human thought that sometimes feels entirely detached from the history we just covered. Cognitive scientists and philosophers have been tackling questions about the relationship between cognition, our tools, and our physical environments for centuries. Well before microprocessors and hypertext showed up. Oddly, they're rarely cited by the computer scientists. This alternate intellectual lineage is still asking the question “how can we develop better tools for thinking?” But they don't presume the answer revolves around computers.
Proponents of embodied cognition argue that our perceptions, concepts, and cognitive processes are shaped by the physical structures of our body and the sensory experiences it provides, and that cognition cannot be fully understood without considering the bodily basis of our experiences.
Philosopher Andy Clark has spent his career exploring how external tools transform and expand human cognition. His 2003 book Natural-born Cyborgs argues humans have “always been cyborgs.” Not in the sense of embedding wires into our flesh, but in the sense we enter “into deep and complex relationships with nonbiological constructs, props, and aids”. Our ability to think with external objects is precisely what makes us intelligent. Clark argues “the mind” isn't simply a set of functions within the brain, but a process that happens between our bodies and the physical environment. Intelligence emerges at the intersection of humans and tools. He expanded on this idea in a follow-on book called Supersizing the Mind. It became known as the extended mind hypothesis. It's the strong version of theories like embodied cognition, situated cognition, and enacted cognition that are all the rage in cognitive science departments.
There's a scramble to make sense of all these new releases and the differences between them. YouTube and Medium explode with DIY guides, walkthrough tours, and comparison videos. The productivity and knowledge management influencer is born.[ giant wall of productivity youtube nonsense ]The strange thing is, many of these guides are only superficially about the application they're presented in. Most are teaching specific cultural techniques
Zettelkasten, spaced repetition, critical thinking.These techniques are only focused on a narrow band of human activity. Specifically, activity that white-collar knowledge workers engage in.I previously suggested we should rename TFT to CMFT (computational mediums for thought), but that doesn't go far enough. If we're being honest about our current interpretation of TFT's, we should actually rename it to CMFWCKW – computational mediums for white-collar knowledge work.
By now it should be clear that this question of developing better tools for thought can and should cover a much wider scope than developing novel note-taking software.
I do think there's a meaningful distinction between tools and mediums: Mediums are a means of communicating a thought or expressing an idea. Tools are a means of working in a medium. Tools enable specific tasks and workflows within a medium. Cameras are tools that let people express ideas through photography. Blogs are tools that let people express ideas through written language. JavaScript is a tool that lets people express ideas through programming. Tools and mediums require each other. This makes lines between them fuzzy.
·maggieappleton.com·
Tools for Thought as Cultural Practices, not Computational Objects
The Mac Turns Forty – Pixel Envy
The Mac Turns Forty – Pixel Envy
As for a Hall of Shame thing? That would be the slow but steady encroachment of single-window applications in MacOS, especially via Catalyst and Electron. The reason I gravitated toward MacOS in the first place is the same reason I continue to use it: it fits my mental model of how an operating system ought to work.
·pxlnv.com·
The Mac Turns Forty – Pixel Envy
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
If I had to summarize it, I would say that the signal to noise ratio is what wore me down. We start companies to build products that serve people, not to sit in meetings with lawyers.  You need to be able to answer the "what have I done for our users today" question with "not much but I got promoted" and be happy with that answer to be successful in Corp-Tech.
being part of a Corporation means that the signal to noise ratio changes dramatically.  The amount of time and effort spent on Legal, Policy, Privacy - on features that have not shipped to users yet, meant a significant waste of resources and focus. After the acquisition, we have an extremely long project that consumed many of our best engineers to align our data retention policies and tools to Google. I am not saying this is not important BUT this had zero value to our users. An ever increasing percent of our time went to non user value creation tasks and that changes the DNA of the company quickly, from customer focused to corporate guidelines focused.
the salaries are so high and the options so valuable that it creates many misalignments. The impact of an individual product on the Corp-Tech stock is minimal so equity is basically free money. Regardless of your performance (individually) or your product performance, your equity grows significantly, so nothing you do has real economic impact on your family. The only control you have to increase your economic returns is whether you get promoted, since that drives your equity and salary payments. This breaks the traditional tech model of risk reward.
·linkedin.com·
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
What I learned getting acquired by Google
What I learned getting acquired by Google
While there were undoubtedly people who came in for the food, worked 3 hours a day, and enjoyed their early retirements, all the people I met were earnest, hard-working, and wanted to do great work. What beat them down were the gauntlet of reviews, the frequent re-orgs, the institutional scar tissue from past failures, and the complexity of doing even simple things on the world stage. Startups can afford to ignore many concerns, Googlers rarely can. What also got in the way were the people themselves - all the smart people who could argue against anything but not for something, all the leaders who lacked the courage to speak the uncomfortable truth, and all the people that were hired without a clear project to work on, but must still be retained through promotion-worthy made-up work.
Another blocker to progress that I saw up close was the imbalance of a top heavy team. A team with multiple successful co-founders and 10-20 year Google veterans might sound like a recipe for great things, but it’s also a recipe for gridlock. This structure might work if there are multiple areas to explore, clear goals, and strong autonomy to pursue those paths.
Good teams regularly pay down debt by cleaning things up on quieter days. Just as real is process debt. A review added because of a launch gone wrong. A new legal check to guard against possible litigation. A section added to a document template. Layers accumulate over the years until you end up unable to release a new feature for months after it's ready because it's stuck between reviews, with an unclear path out.
·shreyans.org·
What I learned getting acquired by Google
Omegle's Rise and Fall - A Vision for Internet Connection
Omegle's Rise and Fall - A Vision for Internet Connection
As much as I wish circumstances were different, the stress and expense of this fight – coupled with the existing stress and expense of operating Omegle, and fighting its misuse – are simply too much. Operating Omegle is no longer sustainable, financially nor psychologically. Frankly, I don’t want to have a heart attack in my 30s. The battle for Omegle has been lost, but the war against the Internet rages on. Virtually every online communication service has been subject to the same kinds of attack as Omegle; and while some of them are much larger companies with much greater resources, they all have their breaking point somewhere. I worry that, unless the tide turns soon, the Internet I fell in love with may cease to exist, and in its place, we will have something closer to a souped-up version of TV – focused largely on passive consumption, with much less opportunity for active participation and genuine human connection.
I’ve done my best to weather the attacks, with the interests of Omegle’s users – and the broader principle – in mind. If something as simple as meeting random new people is forbidden, what’s next? That is far removed from anything that could be considered a reasonable compromise of the principle I outlined. Analogies are a limited tool, but a physical-world analogy might be shutting down Central Park because crime occurs there – or perhaps more provocatively, destroying the universe because it contains evil. A healthy, free society cannot endure when we are collectively afraid of each other to this extent.
In recent years, it seems like the whole world has become more ornery. Maybe that has something to do with the pandemic, or with political disagreements. Whatever the reason, people have become faster to attack, and slower to recognize each other’s shared humanity. One aspect of this has been a constant barrage of attacks on communication services, Omegle included, based on the behavior of a malicious subset of users. To an extent, it is reasonable to question the policies and practices of any place where crime has occurred. I have always welcomed constructive feedback; and indeed, Omegle implemented a number of improvements based on such feedback over the years. However, the recent attacks have felt anything but constructive. The only way to please these people is to stop offering the service. Sometimes they say so, explicitly and avowedly; other times, it can be inferred from their act of setting standards that are not humanly achievable. Either way, the net result is the same.
I didn’t really know what to expect when I launched Omegle. Would anyone even care about some Web site that an 18 year old kid made in his bedroom in his parents’ house in Vermont, with no marketing budget? But it became popular almost instantly after launch, and grew organically from there, reaching millions of daily users. I believe this had something to do with meeting new people being a basic human need, and with Omegle being among the best ways to fulfill that need. As the saying goes: “If you build a better mousetrap, the world will beat a path to your door.” Over the years, people have used Omegle to explore foreign cultures; to get advice about their lives from impartial third parties; and to help alleviate feelings of loneliness and isolation. I’ve even heard stories of soulmates meeting on Omegle, and getting married. Those are only some of the highlights. Unfortunately, there are also lowlights. Virtually every tool can be used for good or for evil, and that is especially true of communication tools, due to their innate flexibility. The telephone can be used to wish your grandmother “happy birthday”, but it can also be used to call in a bomb threat. There can be no honest accounting of Omegle without acknowledging that some people misused it, including to commit unspeakably heinous crimes.
As a young teenager, I couldn’t just waltz onto a college campus and tell a student: “Let’s debate moral philosophy!” I couldn’t walk up to a professor and say: “Tell me something interesting about microeconomics!” But online, I was able to meet those people, and have those conversations. I was also an avid Wikipedia editor; I contributed to open source software projects; and I often helped answer computer programming questions posed by people many years older than me. In short, the Internet opened the door to a much larger, more diverse, and more vibrant world than I would have otherwise been able to experience; and enabled me to be an active participant in, and contributor to, that world. All of this helped me to learn, and to grow into a more well-rounded person. Moreover, as a survivor of childhood rape, I was acutely aware that any time I interacted with someone in the physical world, I was risking my physical body. The Internet gave me a refuge from that fear. I was under no illusion that only good people used the Internet; but I knew that, if I said “no” to someone online, they couldn’t physically reach through the screen and hold a weapon to my head, or worse. I saw the miles of copper wires and fiber-optic cables between me and other people as a kind of shield – one that empowered me to be less isolated than my trauma and fear would have otherwise allowed.
·omegle.com·
Omegle's Rise and Fall - A Vision for Internet Connection
Why corporate America broke up with design
Why corporate America broke up with design
Design thinking alone doesn't determine market success, nor does it always transform business as expected.
There are a multitude of viable culprits behind this revenue drop. Robson himself pointed to the pandemic and tightened global budgets while arguing that “the widespread adoption of design thinking . . . has reduced demand for our services.” (Ideo was, in part, its own competition here since for years, it sold courses on design thinking.) It’s perhaps worth noting that, while design thinking was a buzzword from the ’90s to the early 2010s, it’s commonly met with all sorts of criticism today.
“People were like, ‘We did the process, why doesn’t our business transform?'” says Cliff Kuang, a UX designer and coauthor of User Friendly (and a former Fast Company editor). He points to PepsiCo, which in 2012 hired its first chief design officer and opened an in-house design studio. The investment has not yielded a string of blockbusters (and certainly no iPhone for soda). One widely promoted product, Drinkfinity, attempted to respond to diminishing soft-drink sales with K-Cup-style pods and a reusable water bottle. The design process was meticulous, with extensive prototyping and testing. But Drinkfinity had a short shelf life, discontinued within two years of its 2018 release.
“Design is rarely the thing that determines whether something succeeds in the market,” Kuang says. Take Amazon’s Kindle e-reader. “Jeff Bezos henpecked the original Kindle design to death. Because he didn’t believe in capacitive touch, he put a keyboard on it, and all this other stuff,” Kuang says. “Then the designer of the original Kindle walked and gave [the model] to Barnes & Noble.” Barnes & Noble released a product with a superior physical design, the Nook. But design was no match for distribution. According to the most recent data, Amazon owns approximately 80% of the e-book market share.
The rise of mobile computing has forced companies to create effortless user experiences—or risk getting left behind. When you hail an Uber or order toilet paper in a single click, you are reaping the benefits of carefully considered design. A 2018 McKinsey study found that companies with the strongest commitment to design and the best execution of design principles had revenue that was 32 percentage points higher—and shareholder returns that were 56 percentage points higher—than other companies.
·fastcompany.com·
Why corporate America broke up with design
the internet is one big video game
the internet is one big video game
New real-time syncing libraries like Partykit (and my inspired creation playhtml) are making it incredibly easy to make websites multiplayer, which many games incorporate as the default. This prediction is wise in a lot of ways in terms of interaction, narrative, tutorial, and multiplayer design, and more and more people desire a liveness and tactility in websites that we take for granted in video games.
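As a rough illustration of what “multiplayer by default” means mechanically, here is a minimal sketch of a relay server that rebroadcasts each visitor's cursor position to everyone else on the page. It uses the Node `ws` package rather than Partykit's or playhtml's actual APIs, and the message shape is invented for the example.

```typescript
// Minimal sketch of a "multiplayer website" backend: a relay that rebroadcasts each
// visitor's cursor position to every other connected visitor. Uses the Node `ws`
// package (`npm install ws`); this is not Partykit's or playhtml's actual API, and
// the JSON payload shape ({ id, x, y }) is a made-up example.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 });

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    // e.g. {"id":"visitor-7","x":0.42,"y":0.13} — normalized cursor coordinates
    for (const client of wss.clients) {
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```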
Websites are the future of video games. They are the “end game” of video games. They are spaces where the end players (the website visitors) have the agency to freely interact with others, and not towards any predetermined object, but purely for themselves, discovering who they are in each new environment and finding new ways of relating to one another.
Tokimeki Memorial gives the impression of a world where your agency comes into conflict with several others’, each with their own desires and personalities. At the end of this season, Tim concludes that more video games should ditch combat mechanics and instead focus on how your choices question and ultimately shape who you are and what you care about.
As I watch Tim talk about all this, I think about how websites feel like multiplayer video games, all of which are part of the broader “internet” universe. One in which the “creatures” are the cursors of other, real people. And where we can’t fight each other at all, only talk to one another.
Somewhere in the push to make the internet the infrastructure of a global capitalist economy, we lost this perspective on what the internet is. If I asked people to define what websites are to them, they might talk about the capabilities they provide: “the world’s information at your fingertips,” “AI that does whatever you ask of it,” “a platform for selling products.” Or as design artifacts: they provide the basis of interactive, creative pieces of art, media, and writing. But if we distill a website down to its base components, it is a space that allows people to talk to each other. In the era when the internet was new and before we had predetermined what it was “for,” everyday internet pioneers found ways to talk to one another by making websites for each other. The conversations spanned webs of personal websites, revealing intimate detail in exchange for intimate detail. They bartered histories for kinship, stories for solidarity, identities for community.
The websites of our modern-day internet experience reflect quite a different perspective on what websites should be “for.” Websites are often the expression of a corporate unit, optimized for flow, retention, or the latest trendy design aesthetic. We focus on animation design and gradient layering rather than the interactions that govern how we relate to one another.
How do we make websites feel more like embodied objects? What does a website that can become well-worn or passed down feel like? How does a website become a living gathering space, one that evolves with the activity of its participants? How can a website enable showing care to each other? How can it facilitate solidarity between people?
As video games have shifted towards hyper-optimization, the internet has gone a similar direction. Friction has been systematically eliminated and sophisticated automated experimentation infrastructure enables optimization of key metrics at a microscopic level of detail. In return, we’ve come to view websites and the broader internet more and more as a purely utilitarian medium. Even social media, which at some point was positioned as something for self-expression and community-making has become almost entirely a space for influence climbing.
We need more websites that gently guide us to trust our own choices and intuitions, that chide us when we try to do it all and work ourselves to the bone, that nudge us to find beauty in unexpected places, to find the poetry in the lazy.
·spencers.cafe·
the internet is one big video game
A Brief History & Ethos of the Digital Garden
A Brief History & Ethos of the Digital Garden
Rather than presenting a set of polished articles, displayed in reverse chronological order, these sites act more like free form, work-in-progress wikis. A garden is a collection of evolving ideas that aren't strictly organised by their publication date. They're inherently exploratory – notes are linked through contextual associations. They aren't refined or complete - notes are published as half-finished thoughts that will grow and evolve over time. They're less rigid, less performative, and less perfect than the personal websites we're used to seeing.
It harkens back to the early days of the web when people had fewer notions of how websites "should be.” It's an ethos that is both classically old and newly imagined.
digital gardening is not about specific tools – it's not a Wordpress plugin, Gastby theme, or Jekyll template. It's a different way of thinking about our online behaviour around information - one that accumulates personal knowledge over time in an explorable space.
Gardens present information in a richly linked landscape that grows slowly over time. Everything is arranged and connected in ways that allow you to explore. Think about the way Wikipedia works when you're hopping from Bolshevism to Celestial Mechanics to Dunbar's Number. It's hyperlinking at its best. You get to actively choose which curiosity trail to follow, rather than defaulting to the algorithmically-filtered ephemeral stream. The garden helps us move away from time-bound streams and into contextual knowledge spaces.
Joel focused on the process of digital gardening, emphasising the slow growth of ideas through writing, rewriting, editing, and revising thoughts in public. Instead of slapping Fully Formed Opinions up on the web and never changing them.
However, many of these no-code tools still feel like cookie-cutter solutions. Rather than allowing people to design the information architecture and spatial layouts of their gardens, they inevitably force people into pre-made arrangements. This doesn't mean they don't “count” as “real” gardens, but simply that they limit their gardeners to some extent. You can't design different types of links, novel features, experimental layouts, or custom architecture. They're pre-fab houses instead of raw building materials.
Gardens are organised around contextual relationships and associative links; the concepts and themes within each note determine how it's connected to others. This runs counter to the time-based structure of traditional blogs: posts presented in reverse chronological order based on publication date. Gardens don't consider publication dates the most important detail of a piece of writing. Dates might be included on posts, but they aren't the structural basis of how you navigate around the garden. Posts are connected to other posts through related themes, topics, and shared context.
Gardens are never finished, they're constantly growing, evolving, and changing. Just like a real soil, carrot, and cabbage garden. This isn't how we usually think about writing on the web. Over the last decade, we've moved away from casual live journal entries and formalised our writing into articles and essays. These are carefully crafted, edited, revised, and published with a timestamp. When it's done, it's done. We act like tiny magazines, sending our writing off to the printer. This is odd considering editability is one of the main selling points of the web. Gardens lean into this – there is no “final version” on a garden. What you publish is always open to revision and expansion.
You're freed from the pressure to get everything right immediately. You can test ideas, get feedback, and revise your opinions like a good internet citizen. It's low friction. Gardening your thoughts becomes a daily ritual that only takes a small amount of effort. Over time, big things grow. It gives readers an insight into your writing and thinking process. They come to realise you are not a magical idea machine banging out perfectly formed thoughts, but instead an equally mediocre human doing The Work of trying to understand the world and make sense of it alongside you.
Gardens are imperfect by design. They don't hide their rough edges or claim to be a permanent source of truth. Putting anything imperfect and half-written on an “official website” may feel strange. We seem to reserve all our imperfect declarations and poorly-worded announcements for platforms that other people own and control. We have all been trained to behave like tiny, performative corporations when it comes to presenting ourselves in digital space. Blogging evolved in the Premium Mediocre culture of Millennialism as a way to Promote Your Personal Brand™ and market your SEO-optimized Content. Weird, quirky personal blogs of the early 2000s turned into cleanly crafted brands with publishing strategies and media campaigns. Everyone now has a modern minimalist logo and an LLC. Digital gardening is the Domestic Cozy response to the professional personal blog; it's both intimate and public, weird and welcoming. It's less performative than a blog, but more intentional and thoughtful than a Twitter feed. It wants to build personal knowledge over time, rather than engage in banter and quippy conversations.
If you give it a bit of forethought, you can build your garden in a way that makes it easy to transfer and adapt. Platforms and technologies will inevitably change. Using old-school, reliable, and widely used web native formats like HTML/CSS is a safe bet. Backing up your notes as flat markdown files won't hurt either.
·maggieappleton.com·
A Brief History & Ethos of the Digital Garden