Design with materials, not features | thesephist.com
Material-based software can also have gentler learning curves, because the user only needs to learn a small set of rules about how different metaphors in the software interact with each other rather than learning deep hierarchies of menus and terminologies. In the best case, users can continue to discover new capabilities and workflows long after the initial release of the software.
Take nonlinear video editing software, like Final Cut Pro and Premiere Pro, for example. Though they have their fair share of menu complexity, it’s pretty intuitive for anyone to understand the basic building blocks of video and audio clips sitting on a layered timeline. Here, the video and audio clips are made of a software material that has its own laws of physics: they can grow and shrink to take up more or less time on the timeline. They can be moved around and layered in front of or behind other clips, but they can’t occupy the same space on the same “track” of the timeline. The timeline that runs left-to-right is a kind of “world” in which the materials of audio and video exist.
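The clip-and-timeline metaphor reduces to a small constraint model. Here is a minimal sketch (my illustration, not code from the article; the Clip and Track names are hypothetical) of the one “law of physics” the excerpt names: two clips may not occupy the same span of the same track.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    start: float     # seconds from the timeline origin
    duration: float  # clips "grow and shrink" by changing this

    @property
    def end(self) -> float:
        return self.start + self.duration

class Track:
    """One lane of the left-to-right timeline; its clips may never overlap."""
    def __init__(self):
        self.clips = []

    def can_place(self, clip):
        # The non-overlap rule: the new clip must end before an existing
        # clip starts, or start after it ends, for every clip on the track.
        return all(clip.end <= c.start or clip.start >= c.end for c in self.clips)

    def add(self, clip):
        if self.can_place(clip):
            self.clips.append(clip)
            return True
        return False

track = Track()
assert track.add(Clip(start=0.0, duration=5.0))      # placed
assert not track.add(Clip(start=3.0, duration=4.0))  # rejected: overlaps
assert track.add(Clip(start=5.0, duration=2.0))      # placed: touching is allowed
```

The other behaviors the excerpt describes (moving clips, layering them in front of or behind one another) would compose with rules like this one, which is the essay’s point: users learn a few material laws rather than deep menu hierarchies.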
·thesephist.com·
When to Design for Emergence
In complexity science, ‘emergence’ describes the way that interactions between individual components in a complex system can give rise to new behavior, patterns, or qualities. For example, the quality of ‘wetness’ cannot be found in a single water molecule, but instead arises from the interaction of many water molecules together. In living systems, emergence is at the core of adaptive evolution.
Design for emergence prioritizes open-ended combinatorial possibilities such that the design object can be composed and adapted to a wide variety of contextual and idiosyncratic niches by its end-user. LEGO offers an example — a simple set of blocks with a shared protocol for connecting to one another from which a nearly infinite array of forms can emerge. Yet as we will see, design for emergence can generate value well beyond children’s toys.
In contrast to high modern design, user-centered design takes a more modest position; the designer does not inherently know everything, and therefore she must meticulously study the needs and behaviors of users in order to produce a good design. User-centered design remains the dominant design paradigm today, employed by environmental designers, tech companies, and design agencies around the world.
In this paradigm, design is about gaining knowledge from the user, identifying desirable outcomes, and controlling as much of the process as possible to achieve those outcomes. ‘Design’ remains synonymous with maximizing control.
But consider even the ‘desire path’ example pictured above. The modal user may be well supported by paving the desire path indicated by their behavior, but what good is a paved path leading to stairs for a wheelchair user? In practice, user-centered design tends to privilege the modal user at the expense of the long-tail user whose needs may be just as great.
User-centered design has a better track record than high modern design, but it still exerts a homogenizing effect. The needs of the modal user are accommodated and scaled through software or industrial manufacturing, while power users and those with edge cases can do nothing but actively petition the designer for attention. In most cases, diverse users with a wide variety of niche use cases are forced to conform to the behavior of the modal user.
In design for emergence, the designer assumes that the end-user holds relevant knowledge and gives them extensive control over the design. Rather than designing the end result, we design the user’s experience of designing their own end result. In this way we can think of design for emergence as a form of ‘meta-design.’
In other words, to address the long-tail problem, the tool must be flexible enough that it can be adapted to unexpected and idiosyncratic problem spaces—especially those unanticipated by the tool’s designer.
In contrast to user-centered design, design for emergence invites the user into the design process not only as a subject of study, but as a collaborator with agency and control.
What all these tools have in common is support for open-ended adaptation to highly contextual problems without the need for technical knowledge. Rather than building a static, purpose-built solution to a single common problem with lots of users (and lots of competitors), they’ve won robust user bases by supporting a broad swath of long-tail user needs.
Design for emergence is composable. It provides a limited ‘alphabet’ and a generative grammar that’s easy to learn and employ, yet can be extended to create powerful, complex applications. As Seymour Papert once remarked, “English is a language for children,” but this fact, “does not preclude its being also a language for poets, scientists, and philosophers.”
·rhizomerd.substack.com·
My Tinder Decade
Subject after subject reported that they were on Tinder to find someone to love and to love them back and defined love in the most traditional of terms: something that took work, a container in which sex was sacred and where intimacy built over time. They acknowledged that their encounters on Tinder didn’t offer that, yet they went to Tinder to find it. The contradiction was confusing: They wanted sex to be meaningful but felt that Tinder removed the sacredness. They wanted bonds to be lasting but acknowledged they were easily broken.
·thecut.com·
🚨 Instagram walks back its changes
When we launched Instagram, there were no stories, there were no DMs. What's happened over the last decade is that how people share with friends has changed. It has shifted to stories, and it has shifted to DMs and to group chats. More photos and videos are shared in DMs in a day than are shared into stories. And more photos and videos are shared into stories in a day than are shared to feed. I don't think connecting people with their friends and family is any less important to us than it was two years ago, or five years ago, or eight years ago. But how that works, and how we try and meet that need, has changed as how people communicate with their friends has changed.

Which then begs the question, what's the future of feed? And in a world where more of the friend content has gone from feed into stories and DMs, I think that feed is going to become more public in nature. We want to steer it, to the degree we can, towards creators and individuals, and less towards publishers and institutions. (Though obviously they will always be on the platform, too.) But we also think that creators’ public content can connect you to friends. Feed could be, and to some degree is, a place to discover things to talk about with your friends.

With Reels, we're seeing this happen a lot. Reels are inspiring a lot of conversations — people just send funny videos to their friends that they've discovered in feed. And then they start talking about other things — and we think that is great, too.
·platformer.news·
The Metaverse Could Change The World, If We Could Stop Getting In Its Way
Nick Clegg, president of Meta, penned a lengthy Medium post portraying a more positive vision to justify the company’s all-in pivot to a metaverse future. To Clegg, the metaverse is “ultimately about finding ever more ways for the benefits of the online world to be felt in our daily lives.” This frame is backwards, and reinforces the technology-first lens of social construction that has not held up well over time. A better lens to explain the inevitability of the metaverse is that technology will, over time, provide ever more ways for the benefits of the offline world to be felt through online services. Today’s social media is about communication; the metaverse of tomorrow will be about experiences. Its value is not inherent, but rather lies in the ability of technology to recreate and transport things—in particular, experiences—that have inherent value.
Metaverse technology will, in all likelihood, follow that same evolution. Today’s VR headsets are the equivalent of phonographs and radio. But someday, metaverse users will be able to mix their favorite cocktails in their homes, strap on their multisensory gear, and “walk” into a virtual bar to socialize with friends located anywhere in the world. They’d save the $18 that cocktail costs in a physical bar, and spend it on the $18/month subscription fee for their VirtualBar membership. As the world continues to suffer from the COVID-19 pandemic, it’s easy to see the appeal of virtual substitutes for in-person experiences. In 2020, Zoom replaced in-person social events and professional collaborations; in 2030, the technology options will be far richer, and will capture and convey even more (though not all) of the subtle interpersonal dynamics inherent in human interaction.
·techdirt.com·
Data-Driven Design is Killing Our Instincts
It creates more generic-looking interfaces that may perform well in numbers but fall short of appealing to our senses.
It’s easy to make data-driven design decisions, but relying on data alone ignores that some goals are difficult to measure. Data is very useful for incremental, tactical changes, but only if it’s checked and balanced by our instincts and common sense.
It became clear to the team in that moment that we cared about more than just clicks. We had other goals for this design: It needed to set expectations about what happens next, it needed to communicate quality, and we wanted it to build familiarity and trust in our brand. We could have easily measured how many customers clicked one button versus another, and used that data to pick an optimal button. But that approach would have ignored the big picture and other important goals.
Not everything that can be counted counts. Not everything that counts can be counted.

Data is good at measuring things that are easy to measure. Some goals are less tangible, but that doesn’t make them less important. While you’re chasing a 2% increase in conversion rate you may be suffering a 10% decrease in brand trustworthiness. You’ve optimized for something that’s objectively measured, at the cost of goals that aren’t so easily codified.
Design instinct is a lot more than innate creative ability and cultural guesswork. It’s your wealth of experience. It’s familiarity with industry standards and best practices.
Overreliance on data to drive design decisions can be just as harmful as ignoring it. Data only tells one kind of story. But your project goals are often more complex than that. Goals can’t always be objectively measured.
·modus.medium.com·
Critical Attrition | The Editors
The main problem with the book review today is not that its practitioners live in New York, as some contend. It is not that the critics are in cahoots with the authors under review, embroiled in a shadow economy of social obligation and quid pro quo favor trading. The problem is not that book reviews are too mean or too nice, too long or too short, though they may be those things, too. The main problem is that the contemporary American book review is first and foremost an audition — for another job, another opportunity, another day in the content mine, hopefully with better lighting and tools, but at the very least with better pay. What kind of job or opportunity for the reviewer depends on her ambitions.
He wants honest reviews of novels; instead he gets hype and a dizzying, outrageous, stultifying profusion of adjectives. He wants serious literary criticism of novels; instead he gets withering assessments of the era in which said novels are written, which, by endlessly discussing the same five novels, only confirm his fears that literature has reached the unsustainably small gene-pool era of its long, slow slide to extinction. The contemporary reviewer is unhappy too: she works too hard, and still everything she does is wrong and insufficiently compensated. Her careful reviews end up reading like stenography, and when she swings for the fences her actual readers — unlike the trigger-happy tweeters — complain that she has swung too far.
·nplusonemag.com·
Kevin Kelly on Why Technology Has a Will
The game is that every time we create a new technology, we’re creating new possibilities, new choices that didn’t exist before. Those choices themselves—even the choice to do harm—are a good; they’re a plus.
We want an economy that’s growing in the second sense: unlimited betterment, unlimited increase in wisdom, and complexity, and choices. I don’t see any limit there. We don’t want an economy that’s just getting fatter and fatter, and bigger and bigger, in terms of its size. Can we imagine such a system? That’s hard, but I don’t think it’s impossible.
·palladiummag.com·
On the Social Media Ideology
Social networking is much more than just a dominant discourse. We need to go beyond text and images and include its software, interfaces, and networks that depend on a technical infrastructure consisting of offices and their consultants and cleaners, cables and data centers, working in close concert with the movements and habits of the connected billions. Academic internet studies circles have shifted their attention from utopian promises, impulses, and critiques to “mapping” the network’s impact. From digital humanities to data science we see a shift in network-oriented inquiry from Whether and Why, What and Who, to (merely) How. From a sociality of causes to a sociality of net effects. A new generation of humanistic researchers is lured into the “big data” trap, and kept busy capturing user behavior whilst producing seductive eye candy for an image-hungry audience (and vice versa).
We need to politicize the New Electricity, the privately owned utilities of our century, before they disappear into the background.
What remains particularly unexplained is the apparent paradox between the hyper-individualized subject and the herd mentality of the social.
Before we enter the social media sphere, everyone first fills out a profile and chooses a username and password in order to create an account. Minutes later, you’re part of the game and you start sharing, creating, playing, as if it has always been like that. The profile is the a priori part and the profiling and targeted advertising cannot operate without it. The platforms present themselves as self-evident. They just are—facilitating our feature-rich lives. Everyone that counts is there. It is through the gate of the profile that we become its subject.
We pull in updates, 24/7, in a real-time global economy of interdependencies, having been taught to read news feeds as interpersonal indicators of the planetary condition.
Treating social media as ideology means observing how it binds together media, culture, and identity into an ever-growing cultural performance (and related “cultural studies”) of gender, lifestyle, fashion, brands, celebrity, and news from radio, television, magazines, and the web—all of this imbricated with the entrepreneurial values of venture capital and start-up culture, with their underside of declining livelihoods and growing inequality.
Software, or perhaps more precisely operating systems, offer us an imaginary relationship to our hardware: they do not represent transistors but rather desktops and recycling bins. Software produces users. Without an operating system (OS) there would be no access to hardware; without an OS, no actions, no practices, and thus no user. Each OS, through its advertisements, interpellates a “user”: calls it and offers it a name or image with which to identify. We could say that social media performs the same function, and is even more powerful.
In the age of social media we seem to confess less of what we think. It’s considered too risky, too private. We share what we do, and see, in a staged manner. Yes, we share judgments and opinions, but no thoughts. Our Self is too busy for that, always on the move, flexible, open, sporty, sexy, and always ready to connect and express.
Platforms are not stages; they bring together and synthesize (multimedia) data, yes, but what is lacking here is the (curatorial) element of human labor. That’s why there is no media in social media. The platforms operate because of their software, automated procedures, algorithms, and filters, not because of their large staff of editors and designers. Their lack of employees is what makes current debates in terms of racism, anti-Semitism, and jihadism so timely, as social media platforms are currently forced by politicians to employ editors who will have to do the all-too-human monitoring work (filtering out ancient ideologies that refuse to disappear).
·e-flux.com·