Fandom's Great Divide
The 1970s sitcom "All in the Family" sparked debates with its bigoted-yet-lovable Archie Bunker character, leaving audiences divided over whether the show was satirizing prejudice or inadvertently promoting it, and reflecting TV's power to shape societal attitudes.
This sort of audience divide, not between those who love a show and those who hate it but between those who love it in very different ways, has become a familiar schism in the past fifteen years, during the rise of—oh, God, that phrase again—Golden Age television. This is particularly true of the much-lauded stream of cable “dark dramas,” whose protagonists shimmer between the repulsive and the magnetic. As anyone who has ever read the comments on a recap can tell you, there has always been a less ambivalent way of regarding an antihero: as a hero.
A subset of viewers cheered for Walter White on “Breaking Bad,” growling threats at anyone who nagged him to stop selling meth. In a blog post about that brilliant series, I labelled these viewers “bad fans,” and the responses I got made me feel as if I’d poured a bucket of oil onto a flame war from the parapets of my snobby critical castle. Truthfully, my haters had a point: who wants to hear that they’re watching something wrong?
·newyorker.com·
Why corporate America broke up with design
Design thinking alone doesn't determine market success, nor does it always transform business as expected.
There are a multitude of viable culprits behind this revenue drop. Robson himself pointed to the pandemic and tightened global budgets while arguing that “the widespread adoption of design thinking . . . has reduced demand for our services.” (Ideo was, in part, its own competition here, since for years it sold courses on design thinking.) It’s perhaps worth noting that, while design thinking was a buzzword from the ’90s to the early 2010s, it’s commonly met with all sorts of criticism today.
“People were like, ‘We did the process, why doesn’t our business transform?'” says Cliff Kuang, a UX designer and coauthor of User Friendly (and a former Fast Company editor). He points to PepsiCo, which in 2012 hired its first chief design officer and opened an in-house design studio. The investment has not yielded a string of blockbusters (and certainly no iPhone for soda). One widely promoted product, Drinkfinity, attempted to respond to diminishing soft-drink sales with K-Cup-style pods and a reusable water bottle. The design process was meticulous, with extensive prototyping and testing. But Drinkfinity had a short shelf life, discontinued within two years of its 2018 release.
“Design is rarely the thing that determines whether something succeeds in the market,” Kuang says. Take Amazon’s Kindle e-reader. “Jeff Bezos henpecked the original Kindle design to death. Because he didn’t believe in capacitive touch, he put a keyboard on it, and all this other stuff,” Kuang says. “Then the designer of the original Kindle walked and gave [the model] to Barnes & Noble.” Barnes & Noble released a product with a superior physical design, the Nook. But design was no match for distribution. According to the most recent data, Amazon owns approximately 80% of the e-book market share.
The rise of mobile computing has forced companies to create effortless user experiences—or risk getting left behind. When you hail an Uber or order toilet paper in a single click, you are reaping the benefits of carefully considered design. A 2018 McKinsey study found that companies with the strongest commitment to design and the best execution of design principles had revenue that was 32 percentage points higher—and shareholder returns that were 56 percentage points higher—than other companies.
·fastcompany.com·
Designing in Winter
As the construction industry matured, and best practices were commodified, the percentage of buildings requiring the direct involvement of architects plummeted. Builders can now choose from an array of standard layouts that cover most of their needs; materials and design questions, too, have been standardized, and reflect economies of scale more than local or unique contextual realities.
Cities have lots of rules and regulations about how things can be designed and built, reducing the need for and value of creativity.
The situation is similar in our field. In 2009, companies might ask a designer to “imagine the shoe-shopping experience on mobile,” and such a designer would need to marshal a considerable number of skills to do so: research into how such activity happens today and how it had been attempted online before and the psychology of people engaged in it; explorations of many kinds of interfaces, since no one really knew yet how to present these kinds of information on smartphones; market investigations to determine e.g. “what % of prospective shoppers have which kinds of devices, and what designs can accommodate them all”; testing for raw usability: can people even figure out what to do when they see these screens? And so on.

In 2023, the scene is very different. Best practices in most forms of software and services are commodified; we know, from a decade plus of market activity, what works for most people in a very broad range of contexts. Standardization is everywhere, and resources for the easy development of UIs abound.
It’s also the case that if a designer adds 15% to a design’s quality but increases cycle time substantially, is another cook in the kitchen, demands space for ideation or research, and so on, the trade-off will surely start to seem debatable to many leaders, and that’s ignoring FTE costs! We can be as offended by this as we want, but the truth is that the ten millionth B2B SaaS startup can probably validate or falsify product-market fit without hiring Jony Ive and an entire team of specialists.
We design apps downstream of how Apple designs iOS. There’s just not that much room for innovating in UI at the moment.
Today, for a larger-than-ever percentage of projects, some good libraries and guidelines like Apple’s HIG can get non-designers where they need to go. Many companies could probably do very well with:

- 1 designer to do native design + create and maintain a design system
- PMs and executives for ideation
- Front-end engineers working off of the design system / component library to implement ideas

So even where commodification doesn’t mean no designers, it still probably means fewer designers.
If, for example, they land AR / VR, we will once again face a world of businesses who need to figure out how their goods and services make sense in a new context: how should we display Substack posts in AR, for example? Which metaphors should persist into the new world? What’s the best way to shop for shoes in VR? What affordances empower the greatest number of people?
But there will at least be another period when engineers who “just ship” will produce such massively worse user interfaces that software designers will be important again.
The “design process” and “design cycles” are under pressure and may face much more of it soon. Speed helps, and so too does a general orientation towards working with production however it’s happening. This basically sums to: “Be less precious, and try to fit in in whatever ways help your company ship.”
Being capable of more of the work of making software can mean becoming better at strategy and ideation, such that you’re every executive’s favorite collaborative partner: you listen well, you mock fast (maybe with AI), and you help them communicate; or it can mean becoming better at execution, learning, for example, to code.
·suckstosuck.substack.com·
Studio Branding in the Streaming Wars
The race for the streamers to configure themselves as full-service production, distribution, and exhibition outlets has intensified the need for each to articulate a more specific brand identity.
What we are seeing with the streaming wars is not the emergence of a cluster of copy-cat services, with everyone trying to do everything, but the beginnings of a legible strategy to carve up the mediascape and compete for peoples’ waking hours.
Netflix’s penchant for character-centered stories with a three-act structure, as well as high production values (an average of $20–$50-plus million for award contenders), resonates with the “quality” features of the Classical era.
From early on, Netflix cultivated a liberal public image, which has propelled its investment in social documentary and also driven some of its inclusivity initiatives and collaborations with global auteurs and showrunners of color, such as Alfonso Cuarón, Ava DuVernay, Spike Lee, and Justin Simien.
Quibi is short for “Quick Bites.” In turn, the promos wouldn’t so much emphasize “the what” of the programming as the interest and convenience of being able to watch it while waiting, commuting, or just taking a break. However, this unit of prospective viewing time lies uncomfortably between the ultra-brief TikTok video and the half-hour sitcom.
Peacock’s central obstacle moving forward will be convincing would-be subscribers that the things they loved about linear broadcast and cable TV are worth the investment.
One of the most intriguing and revealing of metaphors, however, isn’t so much related to war as to the celestial coexistence of streamer-planets within the “universe.” Certainly, the term resonates with key franchises, such as the “Marvel Cinematic Universe,” and the bevy of intricate stories that such an expansive environment makes possible. This language stakes a claim for the totality of media — that there are no other kinds of moving images beyond what exists on, or what can be imagined for, these select platforms.
·lareviewofbooks.org·