Shop Class as Soulcraft

Summary: Skilled manual labor entails a systematic encounter with the material world that can enrich one's intellectual and spiritual life. The degradation of work in both blue-collar and white-collar professions is driven not just by technological progress, but by the separation of thinking from doing according to the dictates of capital. To realize the full potential of human flourishing, we must reckon with the appeal of skilled manual work and question the assumptions that shape our educational priorities and notions of a good life.

an engineering culture has developed in recent years in which the object is to “hide the works,” rendering the artifacts we use unintelligible to direct inspection. Lift the hood on some cars now (especially German ones), and the engine appears a bit like the shimmering, featureless obelisk that so enthralled the cavemen in the opening scene of the movie 2001: A Space Odyssey. Essentially, there is another hood under the hood.
What ordinary people once made, they buy; and what they once fixed for themselves, they replace entirely or hire an expert to repair, whose expert fix often involves installing a pre-made replacement part.
So perhaps the time is ripe for reconsideration of an ideal that has fallen out of favor: manual competence, and the stance it entails toward the built, material world. Neither as workers nor as consumers are we much called upon to exercise such competence, most of us anyway, and merely to recommend its cultivation is to risk the scorn of those who take themselves to be the most hard-headed: the hard-headed economist will point out the opportunity costs of making what can be bought, and the hard-headed educator will say that it is irresponsible to educate the young for the trades, which are somehow identified as the jobs of the past.
It was an experience of agency and competence. The effects of my work were visible for all to see, so my competence was real for others as well; it had a social currency. The well-founded pride of the tradesman is far from the gratuitous “self-esteem” that educators would impart to students, as though by magic.
Skilled manual labor entails a systematic encounter with the material world, precisely the kind of encounter that gives rise to natural science. From its earliest practice, craft knowledge has entailed knowledge of the “ways” of one’s materials — that is, knowledge of their nature, acquired through disciplined perception and a systematic approach to problems.
Because craftsmanship refers to objective standards that do not issue from the self and its desires, it poses a challenge to the ethic of consumerism, as the sociologist Richard Sennett has recently argued. The craftsman is proud of what he has made, and cherishes it, while the consumer discards things that are perfectly serviceable in his restless pursuit of the new.
The central culprit in Braverman’s account is “scientific management,” which “enters the workplace not as the representative of science, but as the representative of management masquerading in the trappings of science.” The tenets of scientific management were given their first and frankest articulation by Frederick Winslow Taylor
Scattered craft knowledge is concentrated in the hands of the employer, then doled out again to workers in the form of minute instructions needed to perform some part of what is now a work process. This process replaces what was previously an integral activity, rooted in craft tradition and experience, animated by the worker’s own mental image of, and intention toward, the finished product. Thus, according to Taylor, “All possible brain work should be removed from the shop and centered in the planning or lay-out department.” It is a mistake to suppose that the primary purpose of this partition is to render the work process more efficient. It may or may not result in extracting more value from a given unit of labor time. The concern is rather with labor cost. Once the cognitive aspects of the job are located in a separate management class, or better yet in a process that, once designed, requires no ongoing judgment or deliberation, skilled workers can be replaced with unskilled workers at a lower rate of pay.
the “jobs of the future” rhetoric surrounding the eagerness to end shop class and get every warm body into college, thence into a cubicle, implicitly assumes that we are heading to a “post-industrial” economy in which everyone will deal only in abstractions. Yet trafficking in abstractions is not the same as thinking. White collar professions, too, are subject to routinization and degradation, proceeding by the same process as befell manual fabrication a hundred years ago: the cognitive elements of the job are appropriated from professionals, instantiated in a system or process, and then handed back to a new class of workers — clerks — who replace the professionals. If genuine knowledge work is not growing but actually shrinking, because it is coming to be concentrated in an ever-smaller elite, this has implications for the vocational advice that students ought to receive.
The trades are then a natural home for anyone who would live by his own powers, free not only of deadening abstraction, but also of the insidious hopes and rising insecurities that seem to be endemic in our current economic life. This is the stoic ideal.
·thenewatlantis.com·
The most hated workplace software on the planet
LinkedIn, Reddit, and Blind abound with enraged job applicants and employees sharing tales of how difficult it is to book paid leave, how Kafkaesque it is to file an expense, how nerve-racking it is to close out a project. "I simply hate Workday. Fuck them and those who insist on using it for recruitment," one Reddit user wrote. "Everything is non-intuitive, so even the simplest tasks leave me scratching my head," wrote another. "Keeping notes on index cards would be more effective." Every HR professional and hiring manager I spoke with — whose lives are supposedly made easier by Workday — described Workday with a sense of cosmic exasperation.
If candidates hate Workday, if employees hate Workday, if HR people and managers processing and assessing those candidates and employees through Workday hate Workday — if Workday is the most annoying part of so many workers' workdays — how is Workday everywhere? How did a software provider so widely loathed become a mainstay of the modern workplace?
There is a saying in systems thinking: the purpose of a system is what it does (POSIWID), not what it fails to do. And the reality is that what Workday — and its many despised competitors — does for organizations is far more important than the anguish it causes everyone else.
In 1988, PeopleSoft, backed by IBM, built the first fully fledged Human Resources Information System. In 2004, Oracle acquired PeopleSoft for $10.3 billion. One of its founders, David Duffield, then started a new company that upgraded PeopleSoft's model to near limitless cloud-based storage — giving birth to Workday, the intractable nepo baby of HR software.
Workday is indifferent to our suffering in a job hunt because we aren't Workday's clients; companies are. And these companies — from AT&T to Bank of America to Teladoc — have little incentive to care about your application experience, because if you didn't get the job, you're not their responsibility. For a company hiring and onboarding on a global scale, it is simply easier to screen fewer candidates if the result is still a single hire.
A search on a job board can return hundreds of listings for in-house Workday consultants: IT and engineering professionals hired to fix the software promising to fix processes.
For recruiters, Workday also lacks basic user-interface flexibility. When you promise ease of use and simplicity, you must deliver on the most basic user interactions. And yet sometimes searching for a candidate, or locating a candidate's status, feels impossible. This happens outside of recruiting, too, where locating or attaching a boss's email to approve an expense sheet is complicated by the process, not streamlined. Bureaucratic hell is always about one person's ease coming at the cost of someone else's frustration, wasted time, and busy work. Workday makes no exceptions.
Workday touts its ability to track employee performance by collecting data and marking results, but it is employees who must spend time inputting this data. A creative director at a Fortune 500 company told me how in less than two years his company went "from annual reviews to twice-annual reviews to quarterly reviews to quarterly reviews plus separate twice-annual reviews." At each interval higher-ups pressed HR for more data, because they wanted what they'd paid for with Workday: more work product. With a press of a button, HR could provide that, but the entire company suffered thousands more hours of busy work. Automation made it too easy to do too much. (Workday's "customers choose the frequency at which they conduct reviews, not Workday," said the spokesperson.)
At the scale of a large company, this is simply too much work to expect a few people to do and far too user-specific to expect automation to handle well. It's why Workday can be the worst while still allowing that Paychex is the worst, Paycom is the worst, Paycor is the worst, and Dayforce is the worst. "HR software sucking" is a big tent.
Workday finds itself between enshittification steps two and three. The platform once made things faster and simpler for workers; today it abuses them by cutting corners on job-application and reimbursement procedures while still providing its paying customers the value of a one-stop HR shop. It seems only a matter of time before Workday and its competitors split the difference and cut those same corners with the accounts that pay their bills.
Workday reveals what's important to the people who run Fortune 500 companies: easily and conveniently distributing busy work across large workforces. This is done with the arbitrary and perfunctory performance of work tasks (like excessive reviews) and with the throttling of momentum by making finance and HR tasks difficult. If your expenses and reimbursements are difficult to file, that's OK, because the people above you don't actually care if you get reimbursed. If it takes applicants 128% longer to apply, the people who implemented Workday don't really care. Throttling applicants is perhaps not intentional, but it's good for the company.
·businessinsider.com·
How McKinsey Destroyed the Middle Class - The Atlantic

The rise of management consulting firms like McKinsey played a pivotal role in disempowering the American middle class by promoting corporate restructuring that concentrated power and wealth in the hands of elite managers while stripping middle managers and workers of their decision-making roles, job security, and opportunities for career advancement.

Key topics:

  • Management consulting's role in reshaping corporate America
  • The decline of the middle class and the rise of corporate elitism
  • McKinsey's influence on corporate restructuring and inequality
  • The shift from lifetime employment to precarious jobs
  • The erosion of corporate social responsibility
  • The role of management consulting in perpetuating economic inequality
what consequences has the rise of management consulting had for the organization of American business and the lives of American workers? The answers to these questions put management consultants at the epicenter of economic inequality and the destruction of the American middle class.
Managers do not produce goods or deliver services. Instead, they plan what goods and services a company will provide, and they coordinate the production workers who make the output. Because complex goods and services require much planning and coordination, management (even though it is only indirectly productive) adds a great deal of value. And managers as a class capture much of this value as pay. This makes the question of who gets to be a manager extremely consequential.
In the middle of the last century, management saturated American corporations. Every worker, from the CEO down to production personnel, served partly as a manager, participating in planning and coordination along an unbroken continuum in which each job closely resembled its nearest neighbor.
Even production workers became, on account of lifetime employment and workplace training, functionally the lowest-level managers. They were charged with planning and coordinating the development of their own skills to serve the long-run interests of their employers.
At McDonald’s, Ed Rensi worked his way up from flipping burgers in the 1960s to become CEO. More broadly, a 1952 report by Fortune magazine found that two-thirds of senior executives had more than 20 years’ service at their current companies.
Top executives enjoyed commensurately less control and captured lower incomes. This democratic approach to management compressed the distribution of income and status. In fact, a mid-century study of General Motors published in the Harvard Business Review—completed, in a portent of what was to come, by McKinsey’s Arch Patton—found that from 1939 to 1950, hourly workers’ wages rose roughly three times faster than elite executives’ pay. The management function’s wide diffusion throughout the workforce substantially built the mid-century middle class.
The earliest consultants were engineers who advised factory owners on measuring and improving efficiency at the complex factories required for industrial production. The then-leading firm, Booz Allen, did not achieve annual revenues of $2 million until after the Second World War. McKinsey, which didn’t hire its first Harvard M.B.A. until 1953, retained a diffident and traditional ethos
A new ideal of shareholder primacy, powerfully championed by Milton Friedman in a 1970 New York Times Magazine article entitled “The Social Responsibility of Business is to Increase its Profits,” gave the newly ambitious management consultants a guiding purpose. According to this ideal, in language eventually adopted by the Business Roundtable, “the paramount duty of management and of boards of directors is to the corporation’s stockholders.” During the 1970s, and accelerating into the ’80s and ’90s, the upgraded management consultants pursued this duty by expressly and relentlessly taking aim at the middle managers who had dominated mid-century firms, and whose wages weighed down the bottom line.
Management consultants thus implemented and rationalized a transformation in the American corporation. Companies that had long affirmed express “no layoff” policies now took aim at what the corporate raider Carl Icahn, writing in The New York Times in the late 1980s, called “corporate bureaucracies” run by “incompetent” and “inbred” middle managers. They downsized in response not to particular business problems but rather to a new managerial ethos and methods; they downsized when profitable as well as when struggling, and during booms as well as busts.
Downsizing was indeed wrenching. When IBM abandoned lifetime employment in the 1990s, local officials asked gun-shop owners around its headquarters to close their stores while employees absorbed the shock.
In some cases, downsized employees have been hired back as subcontractors, with no long-term claim on the companies and no role in running them. When IBM laid off masses of workers in the 1990s, for example, it hired back one in five as consultants. Other corporations were built from scratch on a subcontracting model. The clothing brand United Colors of Benetton has only 1,500 employees but uses 25,000 workers through subcontractors.
Shift from lifetime employment to reliance on outsourced labor; decline in unions
The shift from permanent to precarious jobs continues apace. Buttigieg’s work at McKinsey included an engagement for Blue Cross Blue Shield of Michigan, during a period when it considered cutting up to 1,000 jobs (or 10 percent of its workforce). And the gig economy is just a high-tech generalization of the sub-contractor model. Uber is a more extreme Benetton; it deprives drivers of any role in planning and coordination, and it has literally no corporate hierarchy through which drivers can rise up to join management.
In effect, management consulting is a tool that allows corporations to replace lifetime employees with short-term, part-time, and even subcontracted workers, hired under ever more tightly controlled arrangements, who sell particular skills and even specified outputs, and who manage nothing at all.
the managerial control stripped from middle managers and production workers has been concentrated in a narrow cadre of executives who monopolize planning and coordination. Mid-century, democratic management empowered ordinary workers and disempowered elite executives, so that a bad CEO could do little to harm a company and a good one little to help it.
Whereas at mid-century a typical large-company CEO made 20 times a production worker’s income, today’s CEOs make nearly 300 times as much. In a recent year, the five highest-paid employees of the S&P 1500 (7,500 elite executives overall), obtained income equal to about 10 percent of the total profits of the entire S&P 1500.
as Kiechel put it dryly, “we are not all in this together; some pigs are smarter than other pigs and deserve more money.” Consultants seek, in this way, to legitimate both the job cuts and the explosion of elite pay. Properly understood, the corporate reorganizations were, then, not merely technocratic but ideological.
corporate reorganizations have deprived companies of an internal supply of managerial workers. When restructurings eradicated workplace training and purged the middle rungs of the corporate ladder, they also forced companies to look beyond their walls for managerial talent—to elite colleges, business schools, and (of course) to management-consulting firms. That is to say: The administrative techniques that management consultants invented created a huge demand for precisely the services that the consultants supply.
Consulting, like law school, is an all-purpose status giver—“low in risk and high in reward,” according to the Harvard Crimson. McKinsey also hopes that its meritocratic excellence will legitimate its activities in the eyes of the broader world. Management consulting, Kiechel observed, acquired its power and authority not from “silver-haired industry experience but rather from the brilliance of its ideas and the obvious candlepower of the people explaining them, even if those people were twenty-eight years old.”
A deeper objection to Buttigieg’s association with McKinsey concerns not whom the firm represents but the central role the consulting revolution has played in fueling the enormous economic inequalities that now threaten to turn the United States into a caste society.
Meritocrats like Buttigieg changed not just corporate strategies but also corporate values.
GM may aspire to build good cars; IBM, to make typewriters, computers, and other business machines; and AT&T, to improve communications. Executives who rose up through these companies, on the mid-century model, were embedded in their firms and embraced these values, so that they might even have come to view profits as a salutary side effect of running their businesses well.
When management consulting untethered executives from particular industries or firms and tied them instead to management in general, it also led them to embrace the one thing common to all corporations: making money for shareholders. Executives raised on the new, untethered model of management aim exclusively and directly at profit: their education, their career arc, and their professional role conspire to isolate them from other workers and train them single-mindedly on the bottom line.
American democracy, the left believes, cannot be rejuvenated by persuading elites to deploy their excessive power somehow more benevolently. Instead, it requires breaking the stranglehold that elites have on our economics and politics, and reempowering everyone else.
·archive.is·
Fandom's Great Divide
The 1970s sitcom "All in the Family" sparked debates with its bigoted-yet-lovable Archie Bunker character, leaving audiences divided over whether the show was satirizing prejudice or inadvertently promoting it, and reflecting TV's power to shape societal attitudes.
This sort of audience divide, not between those who love a show and those who hate it but between those who love it in very different ways, has become a familiar schism in the past fifteen years, during the rise of—oh, God, that phrase again—Golden Age television. This is particularly true of the much lauded stream of cable “dark dramas,” whose protagonists shimmer between the repulsive and the magnetic. As anyone who has ever read the comments on a recap can tell you, there has always been a less ambivalent way of regarding an antihero: as a hero
a subset of viewers cheered for Walter White on “Breaking Bad,” growling threats at anyone who nagged him to stop selling meth. In a blog post about that brilliant series, I labelled these viewers “bad fans,” and the responses I got made me feel as if I’d poured a bucket of oil onto a flame war from the parapets of my snobby critical castle. Truthfully, my haters had a point: who wants to hear that they’re watching something wrong?
·newyorker.com·
Competition is overrated - cdixon
That other people tried your idea without success could imply it’s a bad idea or simply that the timing or execution was wrong. Distinguishing between these cases is hard and where you should apply serious thought. If you think your competitors executed poorly, you should develop a theory of what they did wrong and how you’ll do better.
If you think your competitor’s timing was off, you should have a thesis about what’s changed to make now the right time. These changes could come in a variety of forms: for example, it could be that users have become more sophisticated, the prices of key inputs have dropped, or that prerequisite technologies have become widely adopted.
Startups are primarily competing against indifference, lack of awareness, and lack of understanding — not other startups.
There were probably 50 companies that tried to do viral video sharing before YouTube. Before 2005, when YouTube was founded, relatively few users had broadband and video cameras. YouTube also took advantage of the latest version of Flash that could play videos seamlessly.
Google and Facebook launched long after their competitors, but executed incredibly well and focused on the right things. When Google launched, other search engines like Yahoo, Excite, and Lycos were focused on becoming multipurpose “portals” and had de-prioritized search (Yahoo even outsourced their search technology).
·cdixon.org·
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
If I had to summarize it, I would say that the signal-to-noise ratio is what wore me down. We start companies to build products that serve people, not to sit in meetings with lawyers. You need to be able to answer the "what have I done for our users today" question with "not much but I got promoted" and be happy with that answer to be successful in Corp-Tech.
being part of a Corporation means that the signal-to-noise ratio changes dramatically. The amount of time and effort spent on Legal, Policy, and Privacy (on features that have not shipped to users yet) meant a significant waste of resources and focus. After the acquisition, we had an extremely long project that consumed many of our best engineers to align our data retention policies and tools with Google's. I am not saying this is not important, BUT it had zero value to our users. An ever increasing percentage of our time went to tasks that created no user value, and that changes the DNA of the company quickly, from customer focused to corporate-guidelines focused.
the salaries are so high and the options so valuable that it creates many misalignments. The impact of an individual product on the Corp-Tech stock is minimal, so equity is basically free money. Regardless of your individual performance or your product's performance, your equity grows significantly, so nothing you do has real economic impact on your family. The only control you have to increase your economic returns is whether you get promoted, since that drives your equity and salary payments. This breaks the traditional tech model of risk and reward.
·linkedin.com·
Omegle's Rise and Fall - A Vision for Internet Connection
As much as I wish circumstances were different, the stress and expense of this fight – coupled with the existing stress and expense of operating Omegle, and fighting its misuse – are simply too much. Operating Omegle is no longer sustainable, financially nor psychologically. Frankly, I don’t want to have a heart attack in my 30s. The battle for Omegle has been lost, but the war against the Internet rages on. Virtually every online communication service has been subject to the same kinds of attack as Omegle; and while some of them are much larger companies with much greater resources, they all have their breaking point somewhere. I worry that, unless the tide turns soon, the Internet I fell in love with may cease to exist, and in its place, we will have something closer to a souped-up version of TV – focused largely on passive consumption, with much less opportunity for active participation and genuine human connection.
I’ve done my best to weather the attacks, with the interests of Omegle’s users – and the broader principle – in mind. If something as simple as meeting random new people is forbidden, what’s next? That is far and away removed from anything that could be considered a reasonable compromise of the principle I outlined. Analogies are a limited tool, but a physical-world analogy might be shutting down Central Park because crime occurs there – or perhaps more provocatively, destroying the universe because it contains evil. A healthy, free society cannot endure when we are collectively afraid of each other to this extent.
In recent years, it seems like the whole world has become more ornery. Maybe that has something to do with the pandemic, or with political disagreements. Whatever the reason, people have become faster to attack, and slower to recognize each other’s shared humanity. One aspect of this has been a constant barrage of attacks on communication services, Omegle included, based on the behavior of a malicious subset of users. To an extent, it is reasonable to question the policies and practices of any place where crime has occurred. I have always welcomed constructive feedback; and indeed, Omegle implemented a number of improvements based on such feedback over the years. However, the recent attacks have felt anything but constructive. The only way to please these people is to stop offering the service. Sometimes they say so, explicitly and avowedly; other times, it can be inferred from their act of setting standards that are not humanly achievable. Either way, the net result is the same.
I didn’t really know what to expect when I launched Omegle. Would anyone even care about some Web site that an 18 year old kid made in his bedroom in his parents’ house in Vermont, with no marketing budget? But it became popular almost instantly after launch, and grew organically from there, reaching millions of daily users. I believe this had something to do with meeting new people being a basic human need, and with Omegle being among the best ways to fulfill that need. As the saying goes: “If you build a better mousetrap, the world will beat a path to your door.” Over the years, people have used Omegle to explore foreign cultures; to get advice about their lives from impartial third parties; and to help alleviate feelings of loneliness and isolation. I’ve even heard stories of soulmates meeting on Omegle, and getting married. Those are only some of the highlights. Unfortunately, there are also lowlights. Virtually every tool can be used for good or for evil, and that is especially true of communication tools, due to their innate flexibility. The telephone can be used to wish your grandmother “happy birthday”, but it can also be used to call in a bomb threat. There can be no honest accounting of Omegle without acknowledging that some people misused it, including to commit unspeakably heinous crimes.
As a young teenager, I couldn’t just waltz onto a college campus and tell a student: “Let’s debate moral philosophy!” I couldn’t walk up to a professor and say: “Tell me something interesting about microeconomics!” But online, I was able to meet those people, and have those conversations. I was also an avid Wikipedia editor; I contributed to open source software projects; and I often helped answer computer programming questions posed by people many years older than me. In short, the Internet opened the door to a much larger, more diverse, and more vibrant world than I would have otherwise been able to experience; and enabled me to be an active participant in, and contributor to, that world. All of this helped me to learn, and to grow into a more well-rounded person. Moreover, as a survivor of childhood rape, I was acutely aware that any time I interacted with someone in the physical world, I was risking my physical body. The Internet gave me a refuge from that fear. I was under no illusion that only good people used the Internet; but I knew that, if I said “no” to someone online, they couldn’t physically reach through the screen and hold a weapon to my head, or worse. I saw the miles of copper wires and fiber-optic cables between me and other people as a kind of shield – one that empowered me to be less isolated than my trauma and fear would have otherwise allowed.
·omegle.com·
Why corporate America broke up with design
Design thinking alone doesn't determine market success, nor does it always transform business as expected.
There are a multitude of viable culprits behind this revenue drop. Robson himself pointed to the pandemic and tightened global budgets while arguing that “the widespread adoption of design thinking . . . has reduced demand for our services.” (Ideo was, in part, its own competition here, since for years it sold courses on design thinking.) It’s perhaps worth noting that, while design thinking was a buzzword from the ’90s to the early 2010s, it’s commonly met with all sorts of criticism today.
“People were like, ‘We did the process, why doesn’t our business transform?’” says Cliff Kuang, a UX designer and coauthor of User Friendly (and a former Fast Company editor). He points to PepsiCo, which in 2012 hired its first chief design officer and opened an in-house design studio. The investment has not yielded a string of blockbusters (and certainly no iPhone for soda). One widely promoted product, Drinkfinity, attempted to respond to diminishing soft-drink sales with K-Cup-style pods and a reusable water bottle. The design process was meticulous, with extensive prototyping and testing. But Drinkfinity had a short shelf life, discontinued within two years of its 2018 release.
“Design is rarely the thing that determines whether something succeeds in the market,” Kuang says. Take Amazon’s Kindle e-reader. “Jeff Bezos henpecked the original Kindle design to death. Because he didn’t believe in capacitive touch, he put a keyboard on it, and all this other stuff,” Kuang says. “Then the designer of the original Kindle walked and gave [the model] to Barnes & Noble.” Barnes & Noble released a product with a superior physical design, the Nook. But design was no match for distribution. According to the most recent data, Amazon owns approximately 80% of the e-book market share.
The rise of mobile computing has forced companies to create effortless user experiences—or risk getting left behind. When you hail an Uber or order toilet paper in a single click, you are reaping the benefits of carefully considered design. A 2018 McKinsey study found that companies with the strongest commitment to design and the best execution of design principles had revenue that was 32 percentage points higher—and shareholder returns that were 56 percentage points higher—than other companies.
·fastcompany.com·
Designing in Winter
As the construction industry matured, and best practices were commodified, the percentage of buildings requiring the direct involvement of architects plummeted. Builders can now choose from an array of standard layouts that cover most of their needs; materials and design questions, too, have been standardized, and reflect economies of scale more than local or unique contextual realities.
Cities have lots of rules and regulation about how things can be designed and built, reducing the need for and value of creativity
The situation is similar in our field. In 2009, companies might ask a designer to “imagine the shoe-shopping experience on mobile,” and such a designer would need to marshal a considerable number of skills to do so: research into how such activity happens today, how it had been attempted online before, and the psychology of people engaged in it; explorations of many kinds of interfaces, since no one really knew yet how to present these kinds of information on smartphones; market investigations to determine e.g. “what % of prospective shoppers have which kinds of devices, and what designs can accommodate them all”; testing for raw usability: can people even figure out what to do when they see these screens? And so on. In 2023, the scene is very different. Best practices in most forms of software and services are commodified; we know, from a decade-plus of market activity, what works for most people in a very broad range of contexts. Standardization is everywhere, and resources for the easy development of UIs abound.
It’s also the case that if a designer adds 15% to a design’s quality but increases cycle time substantially, is another cook in the kitchen, demands space for ideation or research, and so on, the trade-off will surely start to seem debatable to many leaders, and that’s ignoring FTE costs! We can be as offended by this as we want, but the truth is that the ten millionth B2B SaaS startup can probably validate or falsify product-market fit without hiring Jony Ive and an entire team of specialists.
We design apps downstream of how Apple designs iOS. There’s just not that much room for innovating in UI at the moment
Today, for a larger-than-ever percentage of projects, some good libraries and guidelines like Apple’s HIG can get non-designers where they need to go. Many companies could probably do very well with:

  • 1 designer to do native design + create and maintain a design system
  • PMs and executives for ideation
  • Front-end engineers working off of the design system / component library to implement ideas

So even where commodification doesn’t mean no designers, it still probably means fewer designers.
If, for example, they land AR / VR, we will once again face a world of businesses who need to figure out how their goods and services make sense in a new context: how should we display Substack posts in AR, for example? Which metaphors should persist into the new world? What’s the best way to shop for shoes in VR? What affordances empower the greatest number of people?
But there will at least be another period when engineers who “just ship” will produce such massively worse user interfaces that software designers will be important again.
“design process” and “design cycles” are under pressure and may face much more soon. Speed helps, and so too does a general orientation towards working with production however it’s happening. This basically sums to: “Be less precious, and try to fit in in whatever ways help your company ship.”
being capable of more of the work of making software can mean becoming better at strategy and ideation, such that you’re every executive’s favorite collaborative partner; you listen well, you mock fast (maybe with AI), and you help them communicate; or it can mean becoming better at execution, learning, for example, to code.
·suckstosuck.substack.com·
The genius behind Zelda is at the peak of his power — and feeling his age
Aonuma became co-director of “Ocarina,” which revolutionized how game characters move and fight each other in a 3D space. Unlike cinema, video games require audience control of the camera. “Ocarina” created a “camera-locking” system to focus the perspective while you use the controller for character movement. The system, still used by games today, is a large reason “Ocarina” is often compared to the work of Orson Welles, who redefined how cinema was shot.
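The mechanic is easy to picture in code. Below is a minimal sketch of a lock-on camera, with hypothetical names and simplified 2D math rather than anything from Nintendo's actual engine: while a target is locked, the camera solves its own position on the axis running from the target through the player, freeing the control stick for movement.

```python
import math
from dataclasses import dataclass

@dataclass
class Vec2:
    x: float
    z: float  # ground plane only; camera height is omitted for brevity

def lock_on_camera(player: Vec2, target: Vec2, distance: float = 4.0) -> Vec2:
    """Keep the camera behind the player on the player-target axis,
    so both combatants stay framed while the lock is held."""
    dx, dz = player.x - target.x, player.z - target.z
    length = math.hypot(dx, dz) or 1.0  # avoid division by zero if they overlap
    return Vec2(player.x + dx / length * distance,
                player.z + dz / length * distance)

# With the camera solved automatically each frame, stick input can be
# remapped to movement relative to the target: circling, strafing, retreating.
player, enemy = Vec2(0.0, 0.0), Vec2(0.0, 10.0)
print(lock_on_camera(player, enemy))  # Vec2(x=0.0, z=-4.0)
```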
The “ethos of Zelda” focuses on such new, unexpected concepts of play — even as many other modern games prioritize story, like TV and film do. With “Tears,” at “the beginning of development, there really isn’t a story,” Fujibayashi said. “Once we got to the point where we felt confident in the gameplay experience, that’s when the story starts to emerge.”
·washingtonpost.com·
Inside Rupert Murdoch’s Succession Drama | Vanity Fair
Murdoch lobbied Trump to punish Facebook and Google for siphoning his newspapers’ advertising revenue. In 2019, Trump’s Justice Department launched an antitrust investigation of Google. In 2021, Google settled and struck a lucrative content-sharing deal with Murdoch. The source also said Murdoch pushed Trump to open up land for fracking to boost the value of Murdoch’s fossil fuel investments. The Trump administration released nearly 13 million acres of federally controlled land to fracking companies. Murdoch, who sources say has become more pro-life in recent years, encouraged Trump to appoint judges who would overturn Roe v. Wade. “Rupert wanted Trump’s Supreme Court justices in so they could make abortion illegal,” a source who spoke to Murdoch said.
·archive.is·
Studio Branding in the Streaming Wars
The race for the streamers to configure themselves as full-service production, distribution, and exhibition outlets has intensified the need for each to articulate a more specific brand identity.
What we are seeing with the streaming wars is not the emergence of a cluster of copy-cat services, with everyone trying to do everything, but the beginnings of a legible strategy to carve up the mediascape and compete for peoples’ waking hours.
Netflix’s penchant for character-centered stories with a three-act structure, as well as high production values (an average of $20–$50-plus million for award contenders), resonates with the “quality” features of the Classical era.
From early on, Netflix cultivated a liberal public image, which has propelled its investment in social documentary and also driven some of its inclusivity initiatives and collaborations with global auteurs and showrunners of color, such as Alfonso Cuarón, Ava DuVernay, Spike Lee, and Justin Simien.
Quibi is short for “Quick Bites.” In turn, the promos wouldn’t so much emphasize “the what” of the programming as the interest and convenience of being able to watch it while waiting, commuting, or just taking a break. However, this unit of prospective viewing time lies uncomfortably between the ultra-brief TikTok video and the half-hour sitcom.
Peacock’s central obstacle moving forward will be convincing would-be subscribers that the things they loved about linear broadcast and cable TV are worth the investment.
One of the most intriguing and revealing of metaphors, however, isn’t so much related to war as to the celestial coexistence of streamer-planets within the “universe.” Certainly, the term resonates with key franchises, such as the “Marvel Cinematic Universe,” and the bevy of intricate stories that such an expansive environment makes possible. This language stakes a claim for the totality of media — that there are no other kinds of moving images beyond what exists on, or what can be imagined for, these select platforms.
·lareviewofbooks.org·
Netflix’s New Chapter
Blockbuster responded by pricing Blockbuster Online 50 cents cheaper, accelerating Netflix’s stock slide. Netflix, though, knew that Blockbuster was carrying $1 billion in debt from its spin-off from Viacom, and decided to wait it out; Blockbuster cut the price again, taking an increasing share of new subscribers, and still Netflix waited.
·stratechery.com·
‘Mad Max: Fury Road’: The Oral History of a Modern Action Classic - NY Times
“It was one of the wildest, most intense experiences of my life,” said the actress Riley Keough, while her co-star Rosie Huntington-Whiteley added, “You could have made another movie on the making of it.” As for Hardy? “It left me irrevocably changed,” he said.
COLIN GIBSON (production designer) I was in Namibia in 2003 when I got the call to stop spending money. I don’t know whether [the studio] decided to reroute their money back to the Iraq war, or if it was the email I got from Mel Gibson’s wife asking me how many Muslims there may or may not be in Namibia and, therefore, how interested she may or may not be in the whole family coming to visit.
MILLER I had the same feeling about Tom that I had when Mel Gibson first walked into the room: There was a kind of edgy charm, the charisma of animals. You don’t know what’s going on in their inner depths, and yet they’re enormously attractive.
KRAVITZ When they cast me, I was brought to a room that I wasn’t allowed to leave, and I sat there and read the script. It was one of the strangest scripts I’d ever seen, because it was like a really long comic book. JOHN SEALE (cinematographer) I couldn’t make head nor tail of it, so I gave up. I thought, “They’ve been in preproduction for 10 years, let’s just go make it.”
KRAVITZ We would do exercises like writing letters to our captor, really interesting stuff that created deep empathy. I’m glad we had that, because it was such a crazy experience — so long and chaotic — that it would be easy to forget what we were doing if we didn’t have this really great foundation that we could return to. KEOUGH I thought it was amazing that George cared so much. It could have just been like, “This is a big Hollywood movie, now put on your bathing suits and get outside.”
THERON The biggest thing that was driving that entire production was fear. I was incredibly scared, because I’d never done anything like it. I think the hardest thing between me and George is that he had the movie in his head and I was so desperate to understand it. SIXEL It was very difficult for the actors, because there’s no master shot, no blocked-out scenes. Their performances were made of these tiny little moments.
HUNTINGTON-WHITELEY There was a lot of tension, and a lot of different personalities and clashes at times. It was definitely interesting to sit in a truck for four months with Tom and Charlize, who have completely different approaches to their craft. HARDY Because of how much detail we were having to process and how little control one had in each new situation, and how fast the takes were — tiny snippets of story moments were needed to make the final cut work — we moved fast, and it was at times overwhelming. One had to trust that the bigger picture was being held together
THERON In retrospect, I didn’t have enough empathy to really, truly understand what he must have felt like to step into Mel Gibson’s shoes. That is frightening! And I think because of my own fear, we were putting up walls to protect ourselves instead of saying to each other, “This is scary for you, and it’s scary for me, too. Let’s be nice to each other.” In a weird way, we were functioning like our characters: Everything was about survival. HARDY I would agree. I think in hindsight, I was in over my head in many ways. The pressure on both of us was overwhelming at times. What she needed was a better, perhaps more experienced, partner in me. That’s something that can’t be faked. I’d like to think that now that I’m older and uglier, I could rise to that occasion.
KRAVITZ We were behind schedule, and we heard the studio was freaking out about how we were over budget. SEALE The president of Warner Bros. flew to Namibia and had a gold-plated fit. MILLER Jeff was in a bake-off with Kevin Tsujihara about who was going to head the studio, and he had to assert himself to show his superiors that he was in command and a strong executive. I knew what he was going through, but it wasn’t going to do anybody any good at all. [Robinov could not be reached for comment.] MITCHELL He said, “The camera will stop on Dec. 8, no matter what you’ve got, and that’s the end of it.” We hadn’t shot any of the scenes in the Citadel yet, where the opening and closing book ends of the film are set, and we had to go into postproduction without them. It was almost incomprehensible.
SIXEL When we actually finished the film and it was a success, that was the best year we ever had. We’d repeat the stories of making the film to each other over and over again: How did we get to the other side? We still kind of marvel at it.
MILLER Not for a moment did we think “Fury Road” would be anything like an Oscar movie. SIXEL Half the time, I thought I was going to get fired off the film, and then I win an Oscar! How about that? We were just disappointed that George didn’t win, but basically, they were all his Oscars in a way.
MILLER When the ideas that you start off with are then comprehended by an audience at large out there, that’s ultimately what redeems the process for you. The Swahili storytellers have this quote: “The story has been told. If it was bad, it was my fault, because I am the storyteller. But if it was good, it belongs to everybody.” And that feeling of the story belonging to everybody is really the reward
·nytimes.com·