Notes on “Taste” | Are.na Editorial
Taste has historically been reserved for conversation about things like fashion and art. Now, we look for it in our social media feeds, the technology we use, the company we keep, and the people we hire.
When I ask people what they mean by “taste,” they’ll stumble around for a bit and eventually land on something like “you know it when you see it,” or “it’s in the eye of the beholder.” I understand. Words like taste are hard to pin down, perhaps because they describe a sensibility more than any particular quality or thing. We’re inclined to leave them unencumbered by a definition, to preserve their ability to shift shapes.
I’ve found a taste-filled life to be a richer one. To pursue it is to appreciate ourselves, each other, and the stuff we’re surrounded by a whole lot more.
I can’t think of a piece of writing that does this more effectively than Susan Sontag’s “Notes on ‘Camp.’” In her words, “a sensibility is one of the hardest things to talk about... To snare a sensibility in words, especially one that is alive and powerful, one must be tentative and nimble.”
Things don’t feel tasteful; they demonstrate taste. Someone’s home can be decorated tastefully. Someone can dress tastefully. But the vibe cannot be tasteful. The experience cannot be tasteful.
Someone could have impeccable taste in art, without producing any themselves. Those who create tasteful things are almost always deep appreciators, though.
we typically talk about it in binaries. One can have taste or not. Great taste means almost the same thing as taste.
They’re the people you always go to for restaurant or movie or gear recommendations. Maybe it’s the person you ask to be an extra set of eyes on an email or a project brief before you send it out.
It requires intention, focus, and care. Taste is a commitment to a state of attention.
As John Salvatier says in an essay about building a set of stairs, “surprising detail is a near universal property of getting up close and personal with reality.”
To quote Susan Sontag again, “There is taste in people, visual taste, taste in emotion — and there is taste in acts, taste in morality. Intelligence, as well, is really a kind of taste: taste in ideas. One of the facts to be reckoned with is that taste tends to develop very unevenly. It’s rare that the same person has good visual taste and good taste in people and taste in ideas.” The sought-after interior designer may not mind gas station coffee. The prolific composer may not give a damn about how they dress.
Taste in too many things would be tortuous. The things we have taste in often start as a pea under the mattress.
it is often formed through the integration of diverse and wide-ranging inputs. Steve Jobs said, “I think part of what made the Macintosh great was that the people working on it were musicians and poets and artists and zoologists and historians who also happened to be the best computer scientists in the world.”
taste gets you to the thing that’s more than just correct. Taste hits different. It intrigues. It compels. It moves. It enchants. It fascinates. It seduces.
Taste honors someone’s standards of quality, but also the distinctive way the world bounces off a person. It reflects what they know about how the world works, and also what they’re working with in their inner worlds. When we recognize true taste, we are recognizing that alchemic combination of skill and soul. This is why it is so alluring.
many snobs (coffee snobs, gear snobs, wine snobs, etc.) often have great taste. But I would say that taste is the sensibility, and snobbery is one way to express the sensibility. It’s not the only way.
If rich people often have good taste it’s because they grew up around nice things, and many of them acquired an intolerance for not-nice things as a result. That’s a good recipe for taste, but it’s not sufficient and it’s definitely not a guarantee. I know people who are exceedingly picky about the food they eat and never pay more than $20 for a meal.
creating forces taste upon its maker. Creators must master self-expression and craft if they’re going to make something truly compelling.
artists are more sensitive. They’re more observant, feel things more deeply, are more obsessive about details, and are more focused on how they measure up to greatness.
Picasso remarking that “when art critics get together they talk about Form and Structure and Meaning. When artists get together they talk about where you can buy cheap turpentine.” Taste rests on turpentine.
the process of metabolizing the world is a slow one. Wield your P/N meter well, take your time learning what you find compelling, and why. There are no shortcuts to taste. Taste cannot sublimate. It can only bloom. To quote Susan Sontag one last time, “taste has no system and no proofs. But there is something like a logic of taste: the consistent sensibility which underlies and gives rise to a certain taste.”
·are.na·
AI Models in Software UI - LukeW
In the first approach, the primary interface affordance is an input that directly (for the most part) instructs one or more AI models. In this paradigm, people are authoring prompts that result in text, image, video, etc. generation. These prompts can be sequential, iterative, or unrelated. Marquee examples are OpenAI's ChatGPT interface or Midjourney's use of Discord as an input mechanism. Since there are few, if any, UI affordances to guide people, these systems need to respond to a very wide range of instructions. Otherwise people get frustrated with their primarily hidden (to the user) limitations.
The second approach doesn't include any UI elements for directly controlling the output of AI models. In other words, there are no input fields for prompt construction. Instead, instructions for AI models are created behind the scenes as people go about using application-specific UI elements. People using these systems could be completely unaware that an AI model is responsible for the output they see.
The third approach is application-specific UI with AI assistance. Here people can construct prompts through a combination of application-specific UI and direct model instructions. These could be additional controls that generate portions of those instructions in the background, or the ability to directly guide prompt construction through the inclusion or exclusion of content within the application. Examples of this pattern are Microsoft's Copilot suite of products for GitHub, Office, and Windows.
they could be overlays, modals, inline menus, and more. What they have in common, however, is that they supplement application-specific UIs instead of completely replacing them.
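The three approaches differ mainly in who authors the prompt and where it is assembled. Here is a minimal sketch of that difference in TypeScript; the `generate` function and every name below are hypothetical stand-ins, not any real product's API.

```typescript
// Hypothetical stand-in for however an application invokes a model.
type Generate = (prompt: string) => Promise<string>;

// Approach 1: the user authors the prompt directly (ChatGPT-style).
async function directPrompt(generate: Generate, userInput: string): Promise<string> {
  return generate(userInput);
}

// Approach 2: no prompt UI at all. An ordinary application action is
// translated into a prompt the user never sees.
async function summarizeDocument(generate: Generate, documentText: string): Promise<string> {
  const hiddenPrompt = `Summarize the following document in three sentences:\n\n${documentText}`;
  return generate(hiddenPrompt);
}

// Approach 3: application-specific UI with AI assistance. UI controls and
// user-included content contribute prompt fragments alongside the user's
// own instruction (Copilot-style).
async function assistedPrompt(
  generate: Generate,
  userInstruction: string,
  tone: "formal" | "casual", // set by an application control
  includedContext: string[]  // content the user opted in to sharing
): Promise<string> {
  const prompt = [
    `Tone: ${tone}`,
    ...includedContext.map((c) => `Context: ${c}`),
    userInstruction,
  ].join("\n");
  return generate(prompt);
}
```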
·lukew.com·
Ideo breaks its silence on design thinking’s critics
criticisms of design thinking, discussed at the Fast Company Innovation Festival by Ideo partner Michael Hendrix, who leads its Cambridge, Massachusetts, office
Over the last year, Ideo’s philosophy of “design thinking”–a codified, six-step process to solve problems creatively–has come under fire. It’s been called bullshit, the opposite of inclusive design, and a failed experiment. It’s even been compared to syphilis.

Ideo as an institution has rarely responded to critiques of design thinking or acknowledged its flaws. But at the Fast Company Innovation Festival, Ideo partner and leader of its Cambridge, Massachusetts, office Michael Hendrix had a frank conversation with Co.Design senior writer Mark Wilson about why design thinking has gotten so much flack.

“I think it’s fair to critique design thinking, just as it’s fair to critique any other design strategy,” Hendrix says. “There’s of course many poor examples of design thinking, and there’s great examples. Just like there’s poor examples of industrial design and graphic design and different processes within organizations.”

Part of the problem is that many people use the design thinking methodology in superficial ways. Hendrix calls it the “theater of innovation.” Companies know they need to be more creative and innovative, and because they’re looking for fast ways to achieve those goals, they cut corners.

“We get a lot of the materials that look like innovation, or look like they make us more creative,” Hendrix says. “That could be anything from getting a bunch of Sharpie markers and Post-its and putting them in rooms for brainstorms, to having new dress codes, to programming play into the week. They all could be good tools to serve up creativity or innovation, they all could be methods of design thinking, but without some kind of history or strategy to tie them together, and track their progress, track their impact, they end up being a theatrical thing that people can point to and say, ‘oh we did that.’”
“If you make something rigid and formulaic, it could absolutely fail,” he says. “You want to rely on milestones in the creative process, but you don’t want it to be a reactive process that loses its soul.”
“There is a real need to build respect for one another and trust in the safety of sharing ideas so you can move forward,” Hendrix says. “Knowing when to bring judgments is important. Cultures that are highly judgy, that have hierarchy, that are rewarding the person who is the smartest person in the room, don’t do well with this kind of methodology.”
·fastcompany.com·
Why corporate America broke up with design
Design thinking alone doesn't determine market success, nor does it always transform business as expected.
There are a multitude of viable culprits behind this revenue drop. Robson himself pointed to the pandemic and tightened global budgets while arguing that “the widespread adoption of design thinking . . . has reduced demand for our services.” (Ideo was, in part, its own competition here, since for years it sold courses on design thinking.) It’s perhaps worth noting that, while design thinking was a buzzword from the ’90s to the early 2010s, it’s commonly met with all sorts of criticism today.
“People were like, ‘We did the process, why doesn’t our business transform?'” says Cliff Kuang, a UX designer and coauthor of User Friendly (and a former Fast Company editor). He points to PepsiCo, which in 2012 hired its first chief design officer and opened an in-house design studio. The investment has not yielded a string of blockbusters (and certainly no iPhone for soda). One widely promoted product, Drinkfinity, attempted to respond to diminishing soft-drink sales with K-Cup-style pods and a reusable water bottle. The design process was meticulous, with extensive prototyping and testing. But Drinkfinity had a short shelf life, discontinued within two years of its 2018 release.
“Design is rarely the thing that determines whether something succeeds in the market,” Kuang says. Take Amazon’s Kindle e-reader. “Jeff Bezos henpecked the original Kindle design to death. Because he didn’t believe in capacitive touch, he put a keyboard on it, and all this other stuff,” Kuang says. “Then the designer of the original Kindle walked and gave [the model] to Barnes & Noble.” Barnes & Noble released a product with a superior physical design, the Nook. But design was no match for distribution. According to the most recent data, Amazon owns approximately 80% of the e-book market share.
The rise of mobile computing has forced companies to create effortless user experiences—or risk getting left behind. When you hail an Uber or order toilet paper in a single click, you are reaping the benefits of carefully considered design. A 2018 McKinsey study found that companies with the strongest commitment to design and the best execution of design principles had revenue growth that was 32 percentage points higher—and shareholder returns that were 56 percentage points higher—than other companies.
·fastcompany.com·
Synthography – An Invitation to Reconsider the Rapidly Changing Toolkit of Digital Image Creation as a New Genre Beyond Photography
With the comprehensive application of Artificial Intelligence to the creation and post-production of images, it seems questionable whether the resulting visualisations can still be considered ‘photographs’ in a classical sense – drawing with light. Automation has been part of the popular strain of photography since its inception, but even amateurs with only basic knowledge of the craft could understand themselves as the authors of their images. We identify a legitimation crisis in the current usage of the term. This paper is an invitation to consider Synthography as a term for a new genre of image production based on AI, observing its current occurrence and implementation in consumer cameras and post-production.
·link.springer.com·
Elegy for the Native Mac App
Tracing a trendline from the start of the Mac apps platforms to the future of visionOS
In recent years Sketch’s Mac-ness has become a liability. Requiring every person in a large design organization to use a Mac is not an easy sell. Plus, a new generation of “internet native” users expect different things from their software than old-school Mac connoisseurs: Multiplayer editing, inline commenting, and cloud sync are now table-stakes for any successful creative app.
At the time of Sketch’s launch most UX designers were using Photoshop or Illustrator. Both were expensive and overwrought, and neither were actually created for UX design. Sketch’s innovation wasn’t any particular feature — if anything it was the lack of features. It did a few things really well, and those were exactly the things UX designers wanted. In that way it really embodied the Mac ethos: simple, single-purpose, and fun to use.
Apple pushed hard to attract artists, filmmakers, musicians, and other creative professionals. It started a virtuous cycle. More creatives using Macs meant more potential customers for creative Mac software, which meant more developers started building that software, which in turn attracted even more customers to the platform.

And so the Mac ended up with an abundance of improbably-good creative tools. Usually these apps weren’t as feature-rich or powerful as their PC counterparts, but were faster and easier and cheaper and just overall more conducive to the creative process.
Apple is still very interested in selling Macs — precision-milled aluminum computers with custom-designed chips and “XDR” screens. But they no longer care much about The Mac: The operating system, the software platform, its design sensibilities, its unique features, its vibes.
The term-of-art for this style is “skeuomorphism”: modern designs inspired by their antecedents — calculator apps that look like calculators, password-entry fields that look like bank vaults, reminders that look like sticky notes, etc.

This skeuomorphic playfulness made downloading a new Mac app delightful. The discomfort of opening a new unfamiliar piece of software was totally offset by the joy of seeing a glossy pixel-perfect rendition of a bookshelf or a bodega or a poker table, complete with surprising little animations.
There are literally dozens of ways to develop cross-platform apps, including Apple’s own Catalyst — but so far, none of these tools can create anything quite as polished as native implementations.

So it comes down to user preference: Would you rather have the absolute best app experience, or do you want the ability to use an acceptably-functional app from any of your devices? It seems that users have shifted to prefer the latter.
Unfortunately the appeal of native Mac software was, at its core, driven by brand strategy. Mac users were sold on the idea that they were buying not just a device but an ecosystem, an experience. Apple extended this branding for third-party developers with its yearly Apple Design Awards.
for the first time since the introduction of the original Mac, they’re just computers. Yes, they were technically always “just computers”, but they used to feel like something bigger. Now Macs have become just another way, perhaps the best way, to use Slack or VSCode or Figma or Chrome or Excel.
visionOS’s story diverges from that of the Mac. Apple is no longer a scrappy upstart. Rather, they’re the largest company in the world by market cap. It’s not so much that Apple doesn’t care about indie developers anymore, it’s just that indie developers often end up as the ants crushed beneath Apple’s giant corporate feet.
I think we’ll see a lot of cool indie software for visionOS, but also I think most of it will be small utilities or toys. It takes a lot of effort to build and support apps that people rely on for their productivity or creativity. If even the wildly-popular Mac platform can’t support those kinds of projects anymore, what chance does a luxury headset have?
·medium.com·
A brand is more than a logo or word-mark
How they translate into 3D spaces and integrate with architecture, lighting, textures, and materials enables more avenues for brand expression, and often elevates the perception of a brand over time and exposure, even if the logo fades somewhat into the background.
·clipcontent.substack.com·
Magic Ink - Information Software and the Graphical Interface
A good industrial designer understands the capabilities and limitations of the human body in manipulating physical objects, and of the human mind in comprehending mechanical models. A camera designer, for example, shapes her product to fit the human hand. She places buttons such that they can be manipulated with index fingers while the camera rests on the thumbs, and weights the buttons so they can be easily pressed in this position, but won’t trigger by accident. Just as importantly, she designs an understandable mapping from physical features to functions—pressing a button snaps a picture, pulling a lever advances the film, opening a door reveals the film, opening another door reveals the battery.
When the software designer defines the interactive aspects of her program, when she places these pseudo-mechanical affordances and describes their behavior, she is doing a virtual form of industrial design. Whether she realizes it or not. The software designer can thus approach her art as a fusion of graphic design and industrial design. Now, let’s consider how a user approaches software, and more importantly, why.
·worrydream.com·
Exapt existing infrastructure
Here are the adoption curves for a handful of major technologies in the United States. There are big differences in the speeds at which these technologies were absorbed. Landline telephones took about 86 years to hit 80% adoption. Flush toilets took 96 years to hit 80% adoption. Refrigerators took about 25 years. Microwaves took 17 years. Smartphones took just 12 years.

Why these wide differences in adoption speed? Conformability with existing infrastructure. Flush toilets required the build-out of water and sewage utility systems. They also meant adding a new room to the house—the bathroom—and running new water and sewage lines underneath and throughout the house. That’s a lot of systems to line up. By contrast, refrigerators replaced iceboxes, and could fit into existing kitchens without much work. Microwaves could sit on a countertop. Smartphones could slip into your pocket.
·subconscious.substack.com·
Optimizing For Feelings
Humor us for a moment and picture your favorite neighborhood restaurant. Ours is a corner spot in Fort Greene, Brooklyn. It has overflowing natural light, handmade textile seat cushions, a caramel wood grain throughout, and colorful ornaments dangling from the ceilings. Can you picture yours? Do you feel the warmth and spirit of the place?

A Silicon Valley optimizer might say, “Well, they don’t brew their coffee at exactly 200 degrees. And the seats look a little ratty. And the ceiling ornaments don’t serve any function.”

But we think that’s exactly the point. That these little, hand-crafted touches give our environment its humanity and spirit. In their absence, we’re left with something universal but utterly sterile — a space that may “perfectly” serve our functional needs, but leave our emotional needs in the lurch.
Operating systems were bubbly and evanescent, like nature. Apps were customizable, in every shape and size. And interfaces drew on real-life metaphors to help you understand them, integrating them effortlessly into your life.

But as our everyday software tools and media became global for the first time, the hand of the artist gave way to the whims of the algorithm. And our software became one-size-fits-all in a world full of so many different people. All our opinions, beliefs, and ideas got averaged out — producing the least common denominator: endless sequels that everyone enjoys but no one truly loves.

When our software optimizes for numbers alone — no matter the number — it appears doomed to lack a certain spirit, and a certain humanity.
In the end, we decided that we didn’t want to optimize for numbers at all. We wanted to optimize for feelings.While this may seem idealistic at best or naive at worst, the truth is that we already know how to do this. The most profound craftsmanship in our world across art, design, and media has long revolved around feelings.
When Olmsted crafted Central Park, what do you think he was optimizing for? Which metric led to Barry Jenkins’ Moonlight? What data brought the iPhone into this world? The answer is not numerical. It’s all about the feelings, opinions, experiences, and ideas of the maker themself. The great Georgia O’Keeffe put it this way: "I have things in my head that are not like what anyone has taught me... so I decided to start anew."
Starting with feelings and then using data/metrics to bolster that feeling
James Turrell took inspiration from astronomy and perceptual psychology. Coco Chanel was most influenced by nuns and religious symbols. David Adjaye drew from Yoruban sculpture, and Steve Jobs from Zen Buddhism and calligraphy.
And yet, in so much modern software today, you’re placed in a drab gray cubicle — anonymized and aggregated until you’re just a daily active user. For minimalism. For simplicity. For scale! But if our hope is to create software with feeling, it means inviting people in to craft it for themselves — to mold it to the contours of their unique lives and taste.
You see — if software is to have soul, it must feel more like the world around it. Which is the biggest clue of all that feeling is what’s missing from today’s software. Because the value of the tools, objects, and artworks that we as humans have surrounded ourselves with for thousands of years goes so far beyond their functionality. In many ways, their primary value might often come from how they make us feel by triggering a memory, helping us carry on a tradition, stimulating our senses, or just creating a moment of peace.

This is not to say that metrics should not play a role in what we do. The age of metrics has undeniably led us to some pretty remarkable things! And numbers are a useful measuring stick to keep ourselves honest.

But if the religion of technology preaches anything, it celebrates progress and evolution. And so we ask, what comes next? What do we optimize for beyond numbers? How do we bring more of the world around us back into the software in front of us?
·browsercompany.substack.com·
Instagram, TikTok, and the Three Trends
In other words, when Kylie Jenner posts a petition demanding that Meta “Make Instagram Instagram again”, the honest answer is that changing Instagram is the most Instagram-like behavior possible.
The first trend is the shift towards ever more immersive mediums. Facebook, for example, started with text but exploded with the addition of photos. Instagram started with photos and expanded into video. Gaming was the first to make this progression, and is well into the 3D era. The next step is full immersion — virtual reality — and while the format has yet to penetrate the mainstream, this progression in mediums is perhaps the most obvious reason to be bullish about the possibility.
The second trend is the increase in artificial intelligence. I’m using the term colloquially to refer to the overall trend of computers getting smarter and more useful, even if those smarts are a function of simple algorithms, machine learning, or, perhaps someday, something approaching general intelligence.
The third trend is the change in interaction models from user-directed to computer-controlled. The first version of Facebook relied on users clicking on links to visit different profiles; the News Feed changed the interaction model to scrolling. Stories reduced that to tapping, and Reels/TikTok is about swiping. YouTube has gone further than anyone here: Autoplay simply plays the next video without any interaction required at all.
·stratechery.com·
Folk (Browser) Interfaces
For the layman to build their own Folk Interfaces, jigs to wield the media they care about, we must offer simple primitives. A designer in Blender thinks in terms of lighting, camera movements, and materials. An editor in Premiere, in sequences, transitions, titles, and colors. Critically, this is different from automating existing patterns, e.g. making it easy to create a website, simulate the visuals of film photography, or 3D-scan one's room. Instead, it's about building a playground in which those novel computational artifacts can be tinkered with and composed, via a grammar native to their own domain, to produce the fruits of the users' own vision. The goal of the computational tool-maker then is not to teach the layman about recursion, abstraction, or composition, but to provide meaningful primitives (i.e. a system) with which the user can do real work. End-user programming is a red herring: We need to focus on materiality, what some disparage as mere "side effects." The goal is to enable others to feel the agency and power that comes when the world ceases to be immutable.
This feels strongly related to another quote about software as ideology / a system of metaphors that influence the way we assign value to digital actions and content.
I hope this mode can paint the picture of software, not as a teleological instrument careening towards automation and ease, but as a medium for intimacy with the matter of our time (images, audio, video), yielding a sense of agency with what, to most, feels like an indelible substrate.
·cristobal.space·
Back to the Future of Twitter – Stratechery by Ben Thompson
This is all build-up to my proposal for what Musk — or any other bidder for Twitter, for that matter — ought to do with a newly private Twitter. First, Twitter’s current fully integrated model is a financial failure. Second, Twitter’s social graph is extremely valuable. Third, Twitter’s cultural impact is very large, and very controversial. Given this, Musk (who I will use as a stand-in for any future CEO of Twitter) should start by splitting Twitter into two companies. One company would be the core Twitter service, including the social graph. The other company would be all of the Twitter apps and the advertising business.
TwitterServiceCo would open up its API to any other company that might be interested in building their own client experience; each company would:

- Pay for the right to get access to the Twitter service and social graph.
- Monetize in whatever way they see fit (i.e. they could pursue a subscription model).
- Implement their own moderation policy.

This last point would cut a whole host of Gordian Knots:
A truly open TwitterServiceCo has the potential to be a new protocol for the Internet — the notifications and identity protocol; unlike every other protocol, though, this one would be owned by a private company. That would be insanely valuable, but it is a value that will never be realized as long as Twitter is a public company led by a weak CEO and ineffective board driving an integrated business predicated on a business model that doesn’t work.
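Thompson doesn't specify an API, so the following sketch is purely speculative; every type and method is invented to illustrate the proposed split, with the service owning the graph and delivery while client companies own experience, monetization, and moderation.

```typescript
// Entirely hypothetical sketch of a TwitterServiceCo client API. None of
// these types or methods exist; they only illustrate the idea that client
// companies pay for graph access and apply their own moderation policies.
interface Post {
  id: string;
  authorId: string;
  text: string;
  createdAt: string; // ISO 8601 timestamp
}

interface TwitterServiceCo {
  // The core service and social graph, licensed to any paying client.
  getFollowers(userId: string): Promise<string[]>;
  getTimeline(userId: string): Promise<Post[]>;
  publish(authorId: string, text: string): Promise<Post>;
}

// Moderation lives in the client, not the service: each client company
// filters the shared firehose with its own policy before rendering.
function renderTimeline(posts: Post[], allowedByPolicy: (post: Post) => boolean): Post[] {
  return posts.filter(allowedByPolicy);
}
```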
·stratechery.com·
Stepping out of the firehose — Benedict Evans
on information overload / infinite choice and how we struggle to manage it
The internet is a firehose. I don’t, myself, have 351 thousand unread emails, but when anyone can publish and connecting and sharing is free and frictionless, then there is always far more than we can possibly read. So how do we engage with that?
So your feed becomes a sample - an informed guess of the posts you might like most. This has always been a paradox of the Facebook product - half the engineers work on adding stuff to your feed and the other half on taking stuff out. Snap proposed a different model - that if everything disappears after 24 hours then there’s less pressure to be great but also less pressure to read everything. You can let go. TikTok takes this a step further - the feed is infinite, and there’s no pressure to get to the end, but also no signal to stop swiping. You replace pressure with addiction.
Another approach is to try to move the messages. Slack took emails from robots (support tickets, Salesforce updates) and moved them into channels, but now you have 50 channels full of unread messages instead of one inbox full of unread messages.
Screenshots are the PDFs of the smartphone. You pull something into physical space, sever all its links and metadata, and own it yourself.
Email newsletters look a little like this as well. I think a big part of the reason that people seem readier to pay for a blog post by email than a blog post on a web page is that somehow an email feels like a tangible, almost physical object - it might be part of that vast compost heap of unread emails, but at least it’s something that you have, and can come back to. This is also part of the resurgence of vinyl, and even audio cassettes.
The film-camera industry peaked at 80bn consumer photos a year, but today that number is well into the trillions, as I wrote here. That’s probably why people keep making camera apps with built-in constraints, but it also prompts a comparison with this summer’s NFT frenzy. Can digital objects have value, and can a signature add scarcity to a JPEG - can it make it individual?
there are now close to 5bn people with a smartphone, and all of us are online and saying and doing things, and you will never be able to read everything ever again. There’s an old line that Erasmus, in the 15th century, was the last person to have read everything - every book that there was - which might not have been literally possible but which was at least conceivable. Yahoo tried to read everything too - it tried to build a manually curated index of the entire internet that reached 3.2m sites before the absurdity of the project became overwhelming. This was Borges’s 1:1 scale map made real. So, we keep building tools, but also we let go. That’s part of the progression - Arts and Crafts was a reaction against what became the machine age, but Bauhaus and Futurism embraced it. If the ‘metaverse’ means anything, it reflects that we have all grown up with this now, and we’re looking at ways to absorb it, internalise it and reflect it in our lives and in popular culture - to take ownership of it. When software eats the world, it’s not software anymore.
·ben-evans.com·
On the Internet, We’re Always Famous - The New Yorker
I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways.
It seems distant now, but once upon a time the Internet was going to save us from the menace of TV. Since the late fifties, TV has had a special role, both as the country’s dominant medium, in audience and influence, and as a bête noire for a certain strain of American intellectuals, who view it as the root of all evil. In “Amusing Ourselves to Death,” from 1985, Neil Postman argues that, for its first hundred and fifty years, the U.S. was a culture of readers and writers, and that the print medium—in the form of pamphlets, broadsheets, newspapers, and written speeches and sermons—structured not only public discourse but also modes of thought and the institutions of democracy itself. According to Postman, TV destroyed all that, replacing our written culture with a culture of images that was, in a very literal sense, meaningless. “Americans no longer talk to each other, they entertain each other,” he writes. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”
·newyorker.com·