Found 121 bookmarks
Fandom's Great Divide
The 1970s sitcom "All in the Family" sparked debates with its bigoted-yet-lovable Archie Bunker character, leaving audiences divided over whether the show was satirizing prejudice or inadvertently promoting it, and reflecting TV's power to shape societal attitudes.
This sort of audience divide, not between those who love a show and those who hate it but between those who love it in very different ways, has become a familiar schism in the past fifteen years, during the rise of—oh, God, that phrase again—Golden Age television. This is particularly true of the much lauded stream of cable “dark dramas,” whose protagonists shimmer between the repulsive and the magnetic. As anyone who has ever read the comments on a recap can tell you, there has always been a less ambivalent way of regarding an antihero: as a hero.
A subset of viewers cheered for Walter White on “Breaking Bad,” growling threats at anyone who nagged him to stop selling meth. In a blog post about that brilliant series, I labelled these viewers “bad fans,” and the responses I got made me feel as if I’d poured a bucket of oil onto a flame war from the parapets of my snobby critical castle. Truthfully, my haters had a point: who wants to hear that they’re watching something wrong?
·newyorker.com·
Competition is overrated - cdixon
That other people tried your idea without success could imply it’s a bad idea or simply that the timing or execution was wrong. Distinguishing between these cases is hard and where you should apply serious thought. If you think your competitors executed poorly, you should develop a theory of what they did wrong and how you’ll do better.
If you think your competitor’s timing was off, you should have a thesis about what’s changed to make now the right time. These changes could come in a variety of forms: for example, it could be that users have become more sophisticated, the prices of key inputs have dropped, or that prerequisite technologies have become widely adopted.
Startups are primarily competing against indifference, lack of awareness, and lack of understanding — not other startups.
There were probably 50 companies that tried to do viral video sharing before YouTube. Before 2005, when YouTube was founded, relatively few users had broadband and video cameras. YouTube also took advantage of the latest version of Flash that could play videos seamlessly.
Google and Facebook launched long after their competitors, but executed incredibly well and focused on the right things. When Google launched, other search engines like Yahoo, Excite, and Lycos were focused on becoming multipurpose “portals” and had de-prioritized search (Yahoo even outsourced their search technology).
·cdixon.org·
Muse retrospective by Adam Wiggins
  • Wiggins focused on storytelling and brand-building for Muse, achieving early success with an email newsletter, which helped engage potential users and refine the product's value proposition.
  • Muse aspired to a "small giants" business model, emphasizing quality, autonomy, and a healthy work environment over rapid growth. They sought to avoid additional funding rounds by charging a prosumer price early on.
  • Short demo videos on Twitter showcasing the app in action proved to be the most effective method for attracting new users.
Muse as a brand and a product represented something aspirational. People want to be deeper thinkers, to be more strategic, and to use cool, status-quo challenging software made by small passionate teams. These kinds of aspirations are easier to indulge in times of plenty. But once you're getting laid off from your high-paying tech job, or struggling to raise your next financing round, or scrambling to protect your kids' college fund from runaway inflation and uncertain markets... I guess you don't have time to be excited about cool demos on Twitter and thoughtful podcasts on product design.
I’d speculate that another factor is the half-life of cool new productivity software. Evernote, Slack, Notion, Roam, Craft, and many others seem to get pretty far on community excitement for their first few years. After that, I think you have to be left with software that serves a deep and hard-to-replace purpose in people’s lives. Muse got there for a few thousand people, but the economics of prosumer software means that just isn’t enough. You need tens of thousands, hundreds of thousands, to make the cost of development sustainable.
We envisioned Muse as the perfect combination of the freeform elements of a whiteboard, the structured text-heavy style of Notion or Google Docs, and the sense of place you get from a “virtual office” à la group chat. As a way to asynchronously trade ideas and inspiration, sketch out project ideas, and explore possibilities, the multiplayer Muse experience is, in my honest opinion, unparalleled for small creative teams working remotely.
But friction began almost immediately. The team lead or organizer was usually the one bringing Muse to the team, and they were already a fan of its approach. But the other team members were generally a little annoyed to have to learn any new tool, and Muse’s steeper learning curve only made that worse. Those team members would push the problem back to the team lead, treating them as customer support (rather than contacting us directly for help). The team lead often felt like too much of the burden of pushing Muse adoption was on their shoulders. This was in addition to the obvious product gaps, like: no support for the web or Windows; minimal or no integration with other key tools like Notion and Google Docs; and no permissions or support for multiple workspaces. Had we raised $10M back during the cash party of 2020–2021, we could have hired the 15+ person team that would have been necessary to build all of that. But with only seven people (we had added two more people to the team in 2021–2022), it just wasn’t feasible.
We focused neither on a particular vertical (academics, designers, authors...) nor on a narrow use case (PDF reading/annotation, collaborative whiteboarding, design sketching...). That meant we were always spread pretty thin in terms of feature development, and marketing was difficult even over and above the problem of explaining canvas software and digital thinking tools.
being general-purpose was in its blood from birth. Part of it was maker's hubris: don't we always dream of general-purpose tools that will be everything to everyone? And part of it was that it's truly the case that Muse excels at the ability to combine together so many different related knowledge tasks and media types into a single, minimal, powerful canvas. Not sure what I would do differently here, even with the benefit of hindsight.
Muse built a lot of its reputation on being principled, but we were maybe too cautious to do the mercenary things that help you succeed. A good example here is asking users for ratings; I felt like this was of no benefit to the user, and distracting when they’re trying to use your app. Our App Store rating was on the low side (~3.9 stars) for most of our existence. When we finally added the standard prompt-for-rating dialog, it instantly shot up to ~4.7 stars. This was a small example of being too principled about doing good for the user, and not thinking about what would benefit our business.
Growing the team slowly was a delight. At several previous ventures, I onboarded people in the hiring-is-job-one environment of a growth startup. At Muse, we started with three founders and then hired roughly one person per year. This was absolutely fantastic for being able to really take our time to find the perfect person for the role, and then for that person to have tons of time to onboard and find their footing on the team before anyone new showed up. The resulting team was the best I've ever worked on, with minimal deadweight or emotional baggage.
ultimately your product does have to have some web presence. My biggest regret is not building a simple share-to-web function early on, which could have created some virality and a great deal of utility for users as well.
In terms of development speed, quality of the resulting product, hardware integration, and a million other things: native app development wins.
After decades working in product development, being on the marketing/brand/growth/storytelling side was a huge personal challenge for me. But I feel like I managed to grow into the role and find my own approach (podcasting, demo videos, etc) to create a beacon to attract potential customers to our product.
when it comes time for an individual or a team to sit down and sketch out the beginnings of a new business, a new book, a new piece of art—this almost never happens at a computer. Or if it does, it’s a cobbled-together collection of tools like Google Docs and Zoom which aren’t really made for this critical part of the creative lifecycle.
any given business will find a small number of highly-effective channels, and the rest don't matter. For Heroku, that was attending developer conferences and getting blog posts on Hacker News. For another business it might be YouTube influencer sponsorships and print ads in a niche magazine. So I set about systematically testing many channels.
·adamwiggins.com·
Strong and weak technologies - cdixon
Strong technologies capture the imaginations of technology enthusiasts. That is why many important technologies start out as weekend hobbies. Enthusiasts vote with their time, and, unlike most of the business world, have long-term horizons. They build from first principles, making full use of the available resources to design technologies as they ought to exist.
·cdixon.org·
Tools for Thought as Cultural Practices, not Computational Objects
Summary: Throughout human history, innovations like written language, drawing, maps, the scientific method, and data visualization have profoundly expanded the kinds of thoughts humans can think. Most of these "tools for thought" significantly predate digital computers. The modern usage of the phrase is heavily influenced by the work of computer scientists and technologists in the 20th century who envisioned how computers could become tools to extend human reasoning and help solve complex problems. While computers are powerful "meta-mediums", the current focus on building note-taking apps is quite narrow. To truly expand human cognition, we should explore a wider range of tools and practices, both digital and non-digital.
Taken at face value, the phrase tool for thought doesn't have the word 'computer' or 'digital' anywhere in it. It suggests nothing about software systems or interfaces. It's simply meant to refer to tools that help humans think thoughts; potentially new, different, and better kinds of thoughts than we currently think.
Most of the examples I listed above are cultural practices and techniques. They are primary ways of doing; specific ways of thinking and acting that result in greater cognitive abilities. Ones that people pass down from generation to generation through culture. Every one of these also pre-dates digital computers by at least a few hundred years, if not thousands or tens of thousands. Given that framing, it's time to return to the question of how computation, software objects, and note-taking apps fit into this narrative.
If you look around at the commonly cited “major thinkers” in this space, you get a list of computer programmers: Kenneth Iverson, J.C.R. Licklider, Vannevar Bush, Alan Kay, Bob Taylor, Douglas Engelbart, Seymour Papert, Bret Victor, and Howard Rheingold, among others.
This is relevant because it means these men share a lot of the same beliefs, values, and context. They know the same sorts of people, learned the same historical stories in school and were taught to see the world in particular kinds of ways. Most of them worked together, or are at most one personal connection away from the next. Tools for thought is a community scene as much as it's a concept. This gives tools for thought a distinctly computer-oriented, male, American, middle-class flavour. The term has always been used in relation to a dream that is deeply intertwined with digital machines, white-collar knowledge work, and bold American optimism.
Engelbart was specifically concerned with our ability to deal with complex problems, rather than simply “amplifying intelligence.” Being able to win a chess match is perceived as intelligent, but it isn't helping us tackle systemic racism or inequality. Engelbart argued we should instead focus on “augmenting human intellect” in ways that help us find solutions to wicked problems. While he painted visions of how computers could facilitate this, he also pointed to organisational structures, system dynamics, and effective training as part of this puzzle.
There is a rich literature of research and insight into how we might expand human thought that sometimes feels entirely detached from the history we just covered. Cognitive scientists and philosophers have been tackling questions about the relationship between cognition, our tools, and our physical environments for centuries. Well before microprocessors and hypertext showed up. Oddly, they're rarely cited by the computer scientists. This alternate intellectual lineage is still asking the question “how can we develop better tools for thinking?” But they don't presume the answer revolves around computers.
Proponents of embodied cognition argue that our perceptions, concepts, and cognitive processes are shaped by the physical structures of our body and the sensory experiences it provides, and that cognition cannot be fully understood without considering the bodily basis of our experiences.
Philosopher Andy Clark has spent his career exploring how external tools transform and expand human cognition. His 2003 book Natural-born Cyborgs argues humans have “always been cyborgs.” Not in the sense of embedding wires into our flesh, but in the sense we enter “into deep and complex relationships with nonbiological constructs, props, and aids”. Our ability to think with external objects is precisely what makes us intelligent. Clark argues “the mind” isn't simply a set of functions within the brain, but a process that happens between our bodies and the physical environment. Intelligence emerges at the intersection of humans and tools. He expanded on this idea in a follow-on book called Supersizing the Mind. It became known as the extended mind hypothesis. It's the strong version of theories like embodied cognition, situated cognition, and enacted cognition that are all the rage in cognitive science departments.
There's a scramble to make sense of all these new releases and the differences between them. YouTube and Medium explode with DIY guides, walkthrough tours, and comparison videos. The productivity and knowledge management influencer is born. [image: a giant wall of productivity YouTube nonsense] The strange thing is, many of these guides are only superficially about the application they're presented in. Most are teaching specific cultural techniques: Zettelkasten, spaced repetition, critical thinking. These techniques are only focused on a narrow band of human activity. Specifically, activity that white-collar knowledge workers engage in. I previously suggested we should rename TFT to CMFT (computational mediums for thought), but that doesn't go far enough. If we're being honest about our current interpretation of TFTs, we should actually rename it to CMFWCKW – computational mediums for white-collar knowledge work.
By now it should be clear that this question of developing better tools for thought can and should cover a much wider scope than developing novel note-taking software.
I do think there's a meaningful distinction between tools and mediums: Mediums are a means of communicating a thought or expressing an idea. Tools are a means of working in a medium. Tools enable specific tasks and workflows within a medium. Cameras are a tool that lets people express ideas through photography. Blogs are a tool that lets people express ideas through written language. JavaScript is a tool that lets people express ideas through programming. Tools and mediums require each other. This makes the lines between them fuzzy.
·maggieappleton.com·
The Mac Turns Forty – Pixel Envy
As for a Hall of Shame thing? That would be the slow but steady encroachment of single-window applications in MacOS, especially via Catalyst and Electron. The reason I gravitated toward MacOS in the first place is the same reason I continue to use it: it fits my mental model of how an operating system ought to work.
·pxlnv.com·
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
If I had to summarize it, I would say that the signal to noise ratio is what wore me down. We start companies to build products that serve people, not to sit in meetings with lawyers.  You need to be able to answer the "what have I done for our users today" question with "not much but I got promoted" and be happy with that answer to be successful in Corp-Tech.
being part of a Corporation means that the signal to noise ratio changes dramatically. The amount of time and effort spent on Legal, Policy, Privacy – on features that have not shipped to users yet – meant a significant waste of resources and focus. After the acquisition, we had an extremely long project that consumed many of our best engineers to align our data retention policies and tools to Google's. I am not saying this is not important, BUT it had zero value to our users. An ever-increasing percentage of our time went to tasks that created no user value, and that changes the DNA of the company quickly, from customer-focused to corporate-guidelines-focused.
the salaries are so high and the options so valuable that it creates many misalignments. The impact of an individual product on the Corp-Tech stock is minimal, so equity is basically free money. Regardless of your individual performance or your product's performance, your equity grows significantly, so nothing you do has real economic impact on your family. The only control you have to increase your economic returns is whether you get promoted, since that drives your equity and salary payments. This breaks the traditional tech model of risk and reward.
·linkedin.com·
Scarlet Witch - Wikipedia
Marvel licensed the filming rights of the X-Men and related concepts, such as mutants, to 20th Century Fox, who created a film series based on the franchise. Years later, Marvel started their own film franchise, known as the Marvel Cinematic Universe (MCU), which focused on characters that they had not licensed to other studios (see below). At the time, the rights to Quicksilver and Scarlet Witch were disputed by both studios. As they both held the rights to the characters, with Fox citing the characters' mutant status and being children of Magneto and Marvel citing the twins' editorial history being more closely tied to the Avengers rather than the X-Men, the studios made an agreement wherein both of them could use the characters on the condition that the plots did not make reference to the other studio's properties (i.e. the Fox films could not mention the twins as members of the Avengers while the MCU could not mention them as mutants or children of Magneto).[215] The arrangement became moot following the acquisition of 21st Century Fox by Disney – the parent company of Marvel Studios, and the confirmation that future X-Men films will take place within the MCU.
·en.wikipedia.org·
What I learned getting acquired by Google
While there were undoubtedly people who came in for the food, worked 3 hours a day, and enjoyed their early retirements, all the people I met were earnest, hard-working, and wanted to do great work. What beat them down were the gauntlet of reviews, the frequent re-orgs, the institutional scar tissue from past failures, and the complexity of doing even simple things on the world stage. Startups can afford to ignore many concerns, Googlers rarely can. What also got in the way were the people themselves - all the smart people who could argue against anything but not for something, all the leaders who lacked the courage to speak the uncomfortable truth, and all the people that were hired without a clear project to work on, but must still be retained through promotion-worthy made-up work.
Another blocker to progress that I saw up close was the imbalance of a top heavy team. A team with multiple successful co-founders and 10-20 year Google veterans might sound like a recipe for great things, but it’s also a recipe for gridlock. This structure might work if there are multiple areas to explore, clear goals, and strong autonomy to pursue those paths.
Good teams regularly pay down debt by cleaning things up on quieter days. Just as real is process debt. A review added because of a launch gone wrong. A new legal check to guard against possible litigation. A section added to a document template. Layers accumulate over the years until you end up unable to release a new feature for months after it's ready because it's stuck between reviews, with an unclear path out.
·shreyans.org·
Omegle's Rise and Fall - A Vision for Internet Connection
As much as I wish circumstances were different, the stress and expense of this fight – coupled with the existing stress and expense of operating Omegle, and fighting its misuse – are simply too much. Operating Omegle is no longer sustainable, financially nor psychologically. Frankly, I don’t want to have a heart attack in my 30s. The battle for Omegle has been lost, but the war against the Internet rages on. Virtually every online communication service has been subject to the same kinds of attack as Omegle; and while some of them are much larger companies with much greater resources, they all have their breaking point somewhere. I worry that, unless the tide turns soon, the Internet I fell in love with may cease to exist, and in its place, we will have something closer to a souped-up version of TV – focused largely on passive consumption, with much less opportunity for active participation and genuine human connection.
I’ve done my best to weather the attacks, with the interests of Omegle’s users – and the broader principle – in mind. If something as simple as meeting random new people is forbidden, what’s next? That is far removed from anything that could be considered a reasonable compromise of the principle I outlined. Analogies are a limited tool, but a physical-world analogy might be shutting down Central Park because crime occurs there – or perhaps more provocatively, destroying the universe because it contains evil. A healthy, free society cannot endure when we are collectively afraid of each other to this extent.
In recent years, it seems like the whole world has become more ornery. Maybe that has something to do with the pandemic, or with political disagreements. Whatever the reason, people have become faster to attack, and slower to recognize each other’s shared humanity. One aspect of this has been a constant barrage of attacks on communication services, Omegle included, based on the behavior of a malicious subset of users. To an extent, it is reasonable to question the policies and practices of any place where crime has occurred. I have always welcomed constructive feedback; and indeed, Omegle implemented a number of improvements based on such feedback over the years. However, the recent attacks have felt anything but constructive. The only way to please these people is to stop offering the service. Sometimes they say so, explicitly and avowedly; other times, it can be inferred from their act of setting standards that are not humanly achievable. Either way, the net result is the same.
I didn’t really know what to expect when I launched Omegle. Would anyone even care about some Web site that an 18 year old kid made in his bedroom in his parents’ house in Vermont, with no marketing budget? But it became popular almost instantly after launch, and grew organically from there, reaching millions of daily users. I believe this had something to do with meeting new people being a basic human need, and with Omegle being among the best ways to fulfill that need. As the saying goes: “If you build a better mousetrap, the world will beat a path to your door.” Over the years, people have used Omegle to explore foreign cultures; to get advice about their lives from impartial third parties; and to help alleviate feelings of loneliness and isolation. I’ve even heard stories of soulmates meeting on Omegle, and getting married. Those are only some of the highlights. Unfortunately, there are also lowlights. Virtually every tool can be used for good or for evil, and that is especially true of communication tools, due to their innate flexibility. The telephone can be used to wish your grandmother “happy birthday”, but it can also be used to call in a bomb threat. There can be no honest accounting of Omegle without acknowledging that some people misused it, including to commit unspeakably heinous crimes.
As a young teenager, I couldn’t just waltz onto a college campus and tell a student: “Let’s debate moral philosophy!” I couldn’t walk up to a professor and say: “Tell me something interesting about microeconomics!” But online, I was able to meet those people, and have those conversations. I was also an avid Wikipedia editor; I contributed to open source software projects; and I often helped answer computer programming questions posed by people many years older than me. In short, the Internet opened the door to a much larger, more diverse, and more vibrant world than I would have otherwise been able to experience; and enabled me to be an active participant in, and contributor to, that world. All of this helped me to learn, and to grow into a more well-rounded person. Moreover, as a survivor of childhood rape, I was acutely aware that any time I interacted with someone in the physical world, I was risking my physical body. The Internet gave me a refuge from that fear. I was under no illusion that only good people used the Internet; but I knew that, if I said “no” to someone online, they couldn’t physically reach through the screen and hold a weapon to my head, or worse. I saw the miles of copper wires and fiber-optic cables between me and other people as a kind of shield – one that empowered me to be less isolated than my trauma and fear would have otherwise allowed.
·omegle.com·
Why corporate America broke up with design
Design thinking alone doesn't determine market success, nor does it always transform business as expected.
There are a multitude of viable culprits behind this revenue drop. Robson himself pointed to the pandemic and tightened global budgets while arguing that “the widespread adoption of design thinking . . . has reduced demand for our services.” (Ideo was, in part, its own competition here since for years, it sold courses on design thinking.) It’s perhaps worth noting that, while design thinking was a buzzword from the ’90s to the early 2010s, it’s commonly met with all sorts of criticism today.
“People were like, ‘We did the process, why doesn’t our business transform?'” says Cliff Kuang, a UX designer and coauthor of User Friendly (and a former Fast Company editor). He points to PepsiCo, which in 2012 hired its first chief design officer and opened an in-house design studio. The investment has not yielded a string of blockbusters (and certainly no iPhone for soda). One widely promoted product, Drinkfinity, attempted to respond to diminishing soft-drink sales with K-Cup-style pods and a reusable water bottle. The design process was meticulous, with extensive prototyping and testing. But Drinkfinity had a short shelf life, discontinued within two years of its 2018 release.
“Design is rarely the thing that determines whether something succeeds in the market,” Kuang says. Take Amazon’s Kindle e-reader. “Jeff Bezos henpecked the original Kindle design to death. Because he didn’t believe in capacitive touch, he put a keyboard on it, and all this other stuff,” Kuang says. “Then the designer of the original Kindle walked and gave [the model] to Barnes & Noble.” Barnes & Noble released a product with a superior physical design, the Nook. But design was no match for distribution. According to the most recent data, Amazon owns approximately 80% of the e-book market share.
The rise of mobile computing has forced companies to create effortless user experiences—or risk getting left behind. When you hail an Uber or order toilet paper in a single click, you are reaping the benefits of carefully considered design. A 2018 McKinsey study found that companies with the strongest commitment to design and the best execution of design principles had revenue that was 32 percentage points higher—and shareholder returns that were 56 percentage points higher—than other companies.
·fastcompany.com·
the internet is one big video game
New real-time syncing libraries like Partykit (and my inspired creation playhtml) are making it incredibly easy to make websites multiplayer, which many games incorporate as the default. This prediction is wise in a lot of ways in terms of interaction, narrative, tutorial, and multiplayer design, and more and more people desire a liveness and tactility in websites that we take for granted in video games.
Websites are the future of video games. They are the “end game” of video games. They are spaces where the end players (the website visitors) have the agency to freely interact with others, not towards any predetermined objective, but purely for themselves, discovering who they are in each new environment and finding new ways of relating to one another.
Tokimeki Memorial gives the impression that your agency comes into conflict with several others’, each with their own desires and personalities. At the end of this season, he concludes that more video games should ditch combat mechanics and instead focus on how your choice of actions questions and ultimately shapes who you are and what you care about.
As I watch Tim talk about all this, I think about how websites feel like multiplayer video games, all of which are part of the broader “internet” universe. One in which the “creatures” are the cursors of other, real people. And where we can’t fight each other at all, only talk to one another.
Somewhere in the push to make the internet the infrastructure of a global capitalist economy, we lost this perspective on what the internet is. If I asked people to define what websites are to them, they might talk about the capabilities they provide: “the world’s information at your fingertips,” “AI that does whatever you ask of it,” “a platform for selling products.” Or as design artifacts: they provide the basis of interactive, creative pieces of art, media, and writing. But if we distill a website down to its base components, it is a space that allows people to talk to each other. In the era when the internet was new and before we had predetermined what it was “for,” everyday internet pioneers found ways to talk to one another by making websites for each other. The conversations spanned webs of personal websites, revealing intimate detail in exchange for intimate detail. They bartered histories for kinship, stories for solidarity, identities for community.
The websites of our modern-day internet experience reflect quite a different perspective on what websites should be “for.” Websites are often the expression of a corporate unit, optimized for flow, retention, or the latest trendy design aesthetic. We focus on animation design and gradient layering rather than the interactions that govern how we relate to one another.
How do we make websites feel more like embodied objects? What does a website that can become well-worn or passed down feel like? How does a website become a living gathering space, one that evolves with the activity of its participants? How can a website enable showing care to each other? How can it facilitate solidarity between people?
As video games have shifted towards hyper-optimization, the internet has gone in a similar direction. Friction has been systematically eliminated, and sophisticated automated experimentation infrastructure enables optimization of key metrics at a microscopic level of detail. In return, we’ve come to view websites and the broader internet more and more as a purely utilitarian medium. Even social media, which at some point was positioned as something for self-expression and community-making, has become almost entirely a space for influence climbing.
We need more websites that gently guide us to trust our own choices and intuitions, that chide us when we try to do it all and work ourselves to the bone, that nudge us to find beauty in unexpected places, to find the poetry in the lazy.
·spencers.cafe·
the internet is one big video game
A Brief History & Ethos of the Digital Garden
A Brief History & Ethos of the Digital Garden
Rather than presenting a set of polished articles, displayed in reverse chronological order, these sites act more like free form, work-in-progress wikis. A garden is a collection of evolving ideas that aren't strictly organised by their publication date. They're inherently exploratory – notes are linked through contextual associations. They aren't refined or complete – notes are published as half-finished thoughts that will grow and evolve over time. They're less rigid, less performative, and less perfect than the personal websites we're used to seeing.
It harkens back to the early days of the web when people had fewer notions of how websites “should be.” It's an ethos that is both classically old and newly imagined.
digital gardening is not about specific tools – it's not a Wordpress plugin, Gastby theme, or Jekyll template. It's a different way of thinking about our online behaviour around information - one that accumulates personal knowledge over time in an explorable space.
Gardens present information in a richly linked landscape that grows slowly over time. Everything is arranged and connected in ways that allow you to explore. Think about the way Wikipedia works when you're hopping from Bolshevism to Celestial Mechanics to Dunbar's Number. It's hyperlinking at its best. You get to actively choose which curiosity trail to follow, rather than defaulting to the algorithmically-filtered ephemeral stream. The garden helps us move away from time-bound streams and into contextual knowledge spaces.
Joel focused on the process of digital gardening, emphasising the slow growth of ideas through writing, rewriting, editing, and revising thoughts in public, instead of slapping Fully Formed Opinions up on the web and never changing them.
However, many of these no-code tools still feel like cookie-cutter solutions. Rather than allowing people to design the information architecture and spatial layouts of their gardens, they inevitably force people into pre-made arrangements. This doesn't mean they don't “count” as “real” gardens, but simply that they limit their gardeners to some extent. You can't design different types of links, novel features, experimental layouts, or custom architecture. They're pre-fab houses instead of raw building materials.
Gardens are organised around contextual relationships and associative links; the concepts and themes within each note determine how it's connected to others. This runs counter to the time-based structure of traditional blogs: posts presented in reverse chronological order based on publication date. Gardens don't consider publication dates the most important detail of a piece of writing. Dates might be included on posts, but they aren't the structural basis of how you navigate around the garden. Posts are connected to other posts through related themes, topics, and shared context.
Gardens are never finished; they're constantly growing, evolving, and changing. Just like a real soil, carrot, and cabbage garden. This isn't how we usually think about writing on the web. Over the last decade, we've moved away from casual live journal entries and formalised our writing into articles and essays. These are carefully crafted, edited, revised, and published with a timestamp. When it's done, it's done. We act like tiny magazines, sending our writing off to the printer. This is odd considering editability is one of the main selling points of the web. Gardens lean into this – there is no “final version” on a garden. What you publish is always open to revision and expansion.
You're freed from the pressure to get everything right immediately. You can test ideas, get feedback, and revise your opinions like a good internet citizen. It's low friction. Gardening your thoughts becomes a daily ritual that only takes a small amount of effort. Over time, big things grow. It gives readers an insight into your writing and thinking process. They come to realise you are not a magical idea machine banging out perfectly formed thoughts, but instead an equally mediocre human doing The Work of trying to understand the world and make sense of it alongside you.
Gardens are imperfect by design. They don't hide their rough edges or claim to be a permanent source of truth. Putting anything imperfect and half-written on an “official website” may feel strange. We seem to reserve all our imperfect declarations and poorly-worded announcements for platforms that other people own and control. We have all been trained to behave like tiny, performative corporations when it comes to presenting ourselves in digital space. Blogging evolved in the Premium Mediocre culture of Millennialism as a way to Promote Your Personal Brand™ and market your SEO-optimized Content. Weird, quirky personal blogs of the early 2000s turned into cleanly crafted brands with publishing strategies and media campaigns. Everyone now has a modern minimalist logo and an LLC. Digital gardening is the Domestic Cozy response to the professional personal blog; it's both intimate and public, weird and welcoming. It's less performative than a blog, but more intentional and thoughtful than a Twitter feed. It wants to build personal knowledge over time, rather than engage in banter and quippy conversations.
If you give it a bit of forethought, you can build your garden in a way that makes it easy to transfer and adapt. Platforms and technologies will inevitably change. Using old-school, reliable, and widely used web native formats like HTML/CSS is a safe bet. Backing up your notes as flat markdown files won't hurt either.
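The "contextual associations" that structure a garden can be made concrete with a small sketch: a hypothetical helper (not the API of any particular garden tool) that scans flat markdown notes for `[[wiki-link]]` syntax and inverts them into a backlink index — the navigational structure that replaces reverse-chronological ordering.

```typescript
// Sketch: build a backlink index from flat markdown notes using
// [[wiki-link]] syntax. Hypothetical helper, not a real tool's API.

// Map of note name -> its raw markdown body.
type Notes = Record<string, string>;

// For each linked-to note, list which other notes link *to* it.
function backlinks(notes: Notes): Record<string, string[]> {
  const index: Record<string, string[]> = {};
  const linkPattern = /\[\[([^\]]+)\]\]/g;
  for (const [source, body] of Object.entries(notes)) {
    for (const match of body.matchAll(linkPattern)) {
      const target = match[1].trim();
      if (!index[target]) index[target] = [];
      index[target].push(source);
    }
  }
  return index;
}
```

Rendering each note's backlinks alongside its body is what turns a pile of dated posts into an explorable, associatively linked space — and because the input is plain markdown files, the garden stays easy to transfer between platforms.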
·maggieappleton.com·
A Brief History & Ethos of the Digital Garden
In an ugly world, vaccines are a beautiful gift worth honouring
In an ugly world, vaccines are a beautiful gift worth honouring
nice words on vaccines
Vaccines are not only immensely useful; they also embody something beautifully human in their combination of care and communication. Vaccines do not trick the immune system, as is sometimes said; they educate and train it. As a resource of good public health, they allow doctors to whisper words of warning into the cells of their patients. In an age short of trust, this intimacy between government policy and an individual’s immune system is easily misconstrued as a threat. But vaccines are not conspiracies or tools of control: they are molecular loving-kindness.
·economist.com·
In an ugly world, vaccines are a beautiful gift worth honouring