important:1 #economy
Trump’s new economic war
Saudi Arabia and other producers must cut oil prices, global central banks must “immediately” slash interest rates, and foreign companies must ramp up investments in US factories or face tariffs. The EU — which came in for particular opprobrium — must stop hitting big American technology companies with competition fines.
Trump’s demands came amid a frenetic first week in office in which the president launched a blitzkrieg of executive orders and announcements intended not just to reshape the state but also assert America’s economic and commercial supremacy. Tariffs of up to 25 per cent could be slapped on Canada and Mexico as early as February 1, riding roughshod over the trade deal Trump himself negotiated in his first term. China could face levies of up to 100 per cent if Beijing failed to agree on a deal to sell at least 50 per cent of the TikTok app to a US company, while the EU was told to purchase more American oil if it wanted to avoid tariffs. Underscoring the new American unilateralism, Trump pulled the US out of the World Health Organization, as well as exiting the Paris climate accord for a second time.
This proposal throws a “hand grenade” at international tax policymaking, says Niels Johannesen, director of the Oxford university Centre for Business Taxation at Saïd Business School. The move suggests a determination to “shape other countries’ tax policy through coercion rather than through co-operation”, he adds.
“Those around Trump have had time to build up a systematic, methodological approach for protectionist trade policy and it shows,” says former UK trade department official Allie Renison, now at consultancy SEC Newgate. The approach will be to build up a case file of “evidence” against countries, she says, and then use it to extract concessions in areas of both economic and foreign policy.
The question remains how far Trump is willing to go. The danger of trampling on the rules-based order, says Jeromin Zettelmeyer, head of the Bruegel think-tank, is a complete breakdown in the diplomatic and legal channels for settling international disputes. If Trump were to pull out of a wider range of international frameworks, such as the WTO or the IMF, he warns, then the arrangements that help govern the global economy could get “substantively destroyed”.
Some caution against being awestruck by Trump’s threats or his espousal of capitalism without limits, because his agenda is so incoherent. “What we are seeing is huge doses of American hubris,” says Arancha González, dean of the Paris School of International Affairs at Sciences Po. “We are blinded by the intensity of all the issues put on the table and by Trump’s conviction. But we are not looking at the contradictions. It’s like we are all on an orange drug.”
·archive.is·
The Cost-of-Living Crisis Explains Everything
Headline economic figures have become less and less useful as a guide to how actual families are doing — something repeatedly noted by Democrats during the Obama recovery and the Trump years. Inequality may be declining, but it still skews GDP and income figures, with most gains going to the few, not the many. The obscene cost of health care saps family incomes and government coffers without making anyone feel healthier or wealthier.
To be clear, the headline economic numbers are strong. The gains are real. The reduction in inequality is tremendous, the pickup in wage growth astonishing, particularly if you anchor your expectations to the Barack Obama years, as many Biden staffers do.
During the Biden-Harris years, more granular data pointed to considerable strain. Real median household income fell relative to its pre-COVID peak. The poverty rate ticked up, as did the jobless rate. The number of Americans spending more than 30 percent of their income on rent climbed. The delinquency rate on credit cards surged, as did the share of families struggling to afford enough nutritious food, as did the rate of homelessness.
The White House never passed the permanent care-economy measures it had considered.
The biggest problem, one that voters talked about at any given opportunity, was the unaffordability of American life. The giant run-up in inflation during the Biden administration made everything feel expensive, and the sudden jump in the cost of small-ticket, common purchases (such as fast food and groceries) highlighted how bad the country’s long-standing large-ticket, sticky costs (health care, child care, and housing) had gotten. The cost-of-living crisis became the defining issue of the campaign, and one where the incumbent Democrats’ messaging felt false and weak.
Rather than acknowledging the pain and the trade-offs and the complexity—and rather than running a candidate who could have criticized Biden’s economic plans—Democrats dissembled. They noted that inflation was a global phenomenon, as if that mattered to moms in Ohio and machinists in the Central Valley. They pushed the headline numbers. They insisted that working-class voters were better off, and ran on the threat Trump posed to democracy and rights. But were working-class voters really better off? Why wasn’t anyone listening when they said they weren’t?
Voters do seem to be less likely to vote in their economic self-interest these days, and more likely to vote for a culturally compelling candidate. As my colleague Rogé Karma notes, lower-income white voters are flipping from the Democratic Party to the Republican Party on the basis of identitarian issues. The sharp movement of union voters to Trump seems to confirm the trend. At the same time, high-income voters are becoming bluer in order to vote their cosmopolitan values.
The Biden-Harris administration fell short in concrete, specific ways: It failed to address the cost-of-living catastrophe and had little to show for its infrastructure laws, even if it found a lot to talk about. And it dismissed voters who said they hated the pain they felt every time they had to open their wallet.
·theatlantic.com·
Inside the Collapse of Venture for America
In the beginning, VFA was an institution beloved by many of its fellows. “It was a wonderful way to leave college and enter the real world because you’re surrounded by a community and there’s support from the organization,” says Jamie Norwood, co-founder of feminine hygiene brand Winx Health. Norwood and her co-founder, Cynthia Plotch, are a VFA success story. They met as fellows in 2015 and VFA eventually helped them launch their company with a grant and advisement. “We always say, Winx Health would not be here without VFA,” Norwood says.
Norwood and Plotch went through the standard VFA admissions protocol, which was rigorous. It required two written applications, a video interview, and in-person interviews at an event called “Selection Day,” many of which were held in New York City and Detroit over the years. By the end of each university term in May, accepted fellows would get access to Connect, VFA’s job portal, and have until November to land a job. For each fellow hired in a full-time job, VFA received a $5,000 placement fee, paid by partner companies. This fee became a crucial revenue stream for the organization—effectively wedding the professional success of its fellows to its bottom line.
Selection Day interviews were conducted by judges who often pitted interviewees against each other. Candidates were told to organize themselves in order of least to most likely to be successful, or according to whose answers had the most value per word. The format felt ruthless. “People cried” during the interview process, Plotch remembers.
The problems with the business bled into the fellows’ experience in 2023 and 2024, leaving them disenchanted, financially struggling, or expelled en masse from the program for reasons they believe were beyond their control. Despite a multitude of financial red flags, VFA leadership still insisted on recruiting for the 2024 class. “The talent team was traveling nonstop, using prepaid Visa cards since the corporate cards didn’t work,” explains a former director who worked closely with fellows.
Onboarding fresh recruits became increasingly crucial if VFA was going to survive. The organization asked companies for placement fees upfront in 2023, according to internal VFA documents and conversations with former employees. The policy change gave companies pause. Fewer companies signed up as partners, meaning fellows weren’t getting jobs and VFA was losing money.
In the spring of 2023, “there were 15 jobs on opening day,” for a class that eventually grew to over 100 fellows, the former director explains. Gabriella Rudnik, a 2023 fellow, estimates that when training camp began in July 2023, less than half of her peers had jobs, “whereas in previous years it would be closer to like 80 percent.”
Fellows were made to pay the price for the shortage of companies partnering with VFA in 2023. “We weren’t getting more jobs on Connect, and that’s what led to so many fellows being off-boarded,” explains a former director who worked closely with fellows.
Traditionally, VFA gave fellows until November of their class year to find a job, which typically meant a few stragglers were given extra help to land a position if they were late. Only in rare cases during earlier years were fellows offboarded by the organization, a former director says.
In previous years, expulsion was a much more serious and infrequent occurrence. “Removal from the fellowship was not something done lightly. During my tenure, we instituted an internal investigation process, similar to an HR investigation,” says the former executive who worked at VFA from 2017 to 2020. In total, at least 40 fellows from the 2023 class were expelled for failing to get jobs that weren’t available, according to research by former VFA fellows who tracked the number of fellows purged from a Slack channel. Records of their participation were removed from the VFA website, the fellows say.
Many fellows had made sacrifices to be part of the highly selective and prestigious VFA, which cited acceptance rates of around 10 percent of applicants. “There were fellows who turned down six-figure jobs to be a part of this program, and were told that the program that Andrew Yang started would live up to its reputation,” says Paul Ford, a 2024 fellow.
Though internal documents show that VFA had been slowly imploding for months, in all external communications with fellows the nonprofit still maintained that the 2024 training camp would take place in Detroit.
“From an ethical perspective, it does reek of being problematic,” says Thad Calabrese, a professor of nonprofit management at New York University. “You entered into an arrangement with people who don’t have a lot of money, who believed that you were going to make them whole. Then you’re going to turn around and not make them whole.”
·archive.is·
How Are You Auramaxxing?
While readings like those that might come from an aura photographer had previously been done by assigning meanings to specific colors, a point system has emerged from teenagers on TikTok, where real-life interactions are gamified into gaining or losing “aura points.” There are celebrity compilations of “the worst aura moments of all time” and people posting concerns about losing aura because their boyfriend wants space. Taking care of your skin and taping your mouth while you sleep will gain you aura points. Same goes for developing an aesthetic, sitting in nature, making eye contact with women, using wired headphones, doing pull-ups, not speaking, and finding other friends with aura. While the aura math online is inconsistent and subjective (people usually debate the points in the comments), the general rule is you gain hundreds or thousands of points by doing something impressive, intriguing, charismatic, or authentic.
Learning how to auramaxx — preparing for conversations as an introvert, eating breakfast, working out in the morning — may help some young men navigate life, and the mostly amusing aura point system may seem harmless. But Derek Beres, author and co-host of Conspirituality, a podcast dismantling New Age cults, wellness grifters, and conspiracy-mad yogis, points out that auramaxxing content online is rife with misogyny. This comes as little surprise: according to Beres, wellness content online often overlaps with right-wing politics, and looksmaxxing has roots in incel message boards. Aside from the obvious potential for harm toward women, Beres says that the pursuit of auramaxxing can also lead inevitably to guilt and shame for young men.
·thecut.com·
AI Integration and Modularization
Summary: An examination of integration versus modularization in the context of AI, drawing on the work of economists Ronald Coase and Clayton Christensen. Google is pursuing a fully integrated approach similar to Apple’s, while AWS is betting on modularization; Microsoft and Meta sit somewhere in between. Integration may provide an advantage in the consumer market and for achieving AGI, but for enterprise AI, a more modular approach that leverages data gravity and treats models as commodities may prevail. Ultimately, the biggest beneficiary of this dynamic could be Nvidia.
The left side of figure 5-1 indicates that when there is a performance gap — when product functionality and reliability are not yet good enough to address the needs of customers in a given tier of the market — companies must compete by making the best possible products. In the race to do this, firms that build their products around proprietary, interdependent architectures enjoy an important competitive advantage against competitors whose product architectures are modular, because the standardization inherent in modularity takes too many degrees of design freedom away from engineers, who therefore cannot optimize performance.
The issue I have with this analysis of vertical integration — and this is exactly what I was taught at business school — is that the only considered costs are financial. But there are other, more difficult to quantify costs. Modularization incurs costs in the design and experience of using products that cannot be overcome, yet cannot be measured. Business buyers — and the analysts who study them — simply ignore them, but consumers don’t. Some consumers inherently know and value quality, look-and-feel, and attention to detail, and are willing to pay a premium that far exceeds the financial costs of being vertically integrated.
Google trains and runs its Gemini family of models on its own TPU processors, which are only available on Google’s cloud infrastructure. Developers can access Gemini through Vertex AI, Google’s fully-managed AI development platform; and, to the extent Vertex AI is similar to Google’s internal development environment, that is the platform on which Google is building its own consumer-facing AI apps. It’s all Google, from top-to-bottom, and there is evidence that this integration is paying off: Gemini 1.5’s industry leading 2 million token context window almost certainly required joint innovation between Google’s infrastructure team and its model-building team.
In AI, Google is pursuing an integrated strategy, building everything from chips to models to applications, similar to Apple's approach in smartphones.
On the other extreme is AWS, which doesn’t have any of its own models; instead its focus has been on its Bedrock managed development platform, which lets you use any model. Amazon’s other focus has been on developing its own chips, although the vast majority of its AI business runs on Nvidia GPUs.
Microsoft is in the middle, thanks to its close ties to OpenAI and its models. The company added Azure Models-as-a-Service last year, but its primary focus for both external customers and its own internal apps has been building on top of OpenAI’s GPT family of models; Microsoft has also launched its own chip for inference, but the vast majority of its workloads run on Nvidia.
Google is certainly building products for the consumer market, but those products are not devices; they are Internet services. And, as you might have noticed, the historical discussion didn’t really mention the Internet. Both Google and Meta, the two biggest winners of the Internet epoch, built their services on commodity hardware. Granted, those services scaled thanks to the deep infrastructure work undertaken by both companies, but even there Google’s more customized approach has been at least rivaled by Meta’s more open approach. What is notable is that both companies are integrating their models and their apps, as is OpenAI with ChatGPT.
Google's integrated AI strategy is unique but may not provide a sustainable advantage for Internet services in the way Apple's integration does for devices.
It may be the case that selling hardware, which has to be perfect every year to justify a significant outlay of money by consumers, provides a much better incentive structure for maintaining excellence and execution than does being an Aggregator that users access for free.
Google’s collection of moonshots — from Waymo to Google Fiber to Nest to Project Wing to Verily to Project Loon (and the list goes on) — have mostly been science projects that have, for the most part, served to divert profits from Google Search away from shareholders. Waymo is probably the most interesting, but even if it succeeds, it is ultimately a car service rather far afield from Google’s mission statement “to organize the world’s information and make it universally accessible and useful.”
The only thing that drives meaningful shifts in platform marketshare are paradigm shifts, and while I doubt the v1 version of Pixie [Google’s rumored Pixel-only AI assistant] would be good enough to drive switching from iPhone users, there is at least a path to where it does exactly that.
The fact that Google is being mocked mercilessly for messed-up AI answers gets at why consumer-facing AI may be disruptive for the company: the reason incumbents find it hard to respond to disruptive technologies is that they are, at least at the beginning, not good enough for the incumbent’s core offering. Time will tell if this gives more fuel to a shift in smartphone strategies, or makes the company more reticent.
While I was very impressed with Google’s enterprise pitch, which benefits from its integration with Google’s infrastructure without all of the overhead of potentially disrupting the company’s existing products, it’s going to be a heavy lift to overcome data gravity, i.e. the fact that many enterprise customers will simply find it easier to use AI services on the same clouds where they already store their data (Google does, of course, also support non-Gemini models and Nvidia GPUs for enterprise customers). To the extent Google wins in enterprise, it may be by capturing the next generation of startups that are AI first and, by definition, data light; a new company has the freedom to base its decision on infrastructure and integration.
Amazon is certainly hoping that argument is correct: the company is operating as if everything in the AI value chain is modular and ultimately a commodity, which insinuates that it believes that data gravity will matter most. What is difficult to separate is to what extent this is the correct interpretation of the strategic landscape versus a convenient interpretation of the facts that happens to perfectly align with Amazon’s strengths and weaknesses, including infrastructure that is heavily optimized for commodity workloads.
Unclear if Amazon's strategy is based on true insight or motivated reasoning based on their existing strengths
Meta’s open source approach to Llama: the company is focused on products, which do benefit from integration, but there are also benefits that come from widespread usage, particularly in terms of optimization and complementary software. Open source accrues those benefits without imposing any incentives that detract from Meta’s product efforts (and don’t forget that Meta is receiving some portion of revenue from hyperscalers serving Llama models).
The iPhone maker, like Amazon, appears to be betting that AI will be a feature or an app; like Amazon, it’s not clear to what extent this is strategic foresight versus motivated reasoning.
Achieving something approaching AGI, whatever that means, will require maximizing every efficiency and optimization, which rewards the integrated approach.
The most value will be derived from building platforms that treat models like processors, delivering performance improvements to developers who never need to know what is going on under the hood.
·stratechery.com·
Shop Class as Soulcraft

Summary: Skilled manual labor entails a systematic encounter with the material world that can enrich one's intellectual and spiritual life. The degradation of work in both blue-collar and white-collar professions is driven not just by technological progress, but by the separation of thinking from doing according to the dictates of capital. To realize the full potential of human flourishing, we must reckon with the appeal of skilled manual work and question the assumptions that shape our educational priorities and notions of a good life.

An engineering culture has developed in recent years in which the object is to “hide the works,” rendering the artifacts we use unintelligible to direct inspection. Lift the hood on some cars now (especially German ones), and the engine appears a bit like the shimmering, featureless obelisk that so enthralled the cavemen in the opening scene of the movie 2001: A Space Odyssey. Essentially, there is another hood under the hood.
What ordinary people once made, they buy; and what they once fixed for themselves, they replace entirely or hire an expert to repair, whose expert fix often involves installing a pre-made replacement part.
So perhaps the time is ripe for reconsideration of an ideal that has fallen out of favor: manual competence, and the stance it entails toward the built, material world. Neither as workers nor as consumers are we much called upon to exercise such competence, most of us anyway, and merely to recommend its cultivation is to risk the scorn of those who take themselves to be the most hard-headed: the hard-headed economist will point out the opportunity costs of making what can be bought, and the hard-headed educator will say that it is irresponsible to educate the young for the trades, which are somehow identified as the jobs of the past.
It was an experience of agency and competence. The effects of my work were visible for all to see, so my competence was real for others as well; it had a social currency. The well-founded pride of the tradesman is far from the gratuitous “self-esteem” that educators would impart to students, as though by magic.
Skilled manual labor entails a systematic encounter with the material world, precisely the kind of encounter that gives rise to natural science. From its earliest practice, craft knowledge has entailed knowledge of the “ways” of one’s materials — that is, knowledge of their nature, acquired through disciplined perception and a systematic approach to problems.
Because craftsmanship refers to objective standards that do not issue from the self and its desires, it poses a challenge to the ethic of consumerism, as the sociologist Richard Sennett has recently argued. The craftsman is proud of what he has made, and cherishes it, while the consumer discards things that are perfectly serviceable in his restless pursuit of the new.
The central culprit in Braverman’s account is “scientific management,” which “enters the workplace not as the representative of science, but as the representative of management masquerading in the trappings of science.” The tenets of scientific management were given their first and frankest articulation by Frederick Winslow Taylor
Scattered craft knowledge is concentrated in the hands of the employer, then doled out again to workers in the form of minute instructions needed to perform some part of what is now a work process. This process replaces what was previously an integral activity, rooted in craft tradition and experience, animated by the worker’s own mental image of, and intention toward, the finished product. Thus, according to Taylor, “All possible brain work should be removed from the shop and centered in the planning or lay-out department.” It is a mistake to suppose that the primary purpose of this partition is to render the work process more efficient. It may or may not result in extracting more value from a given unit of labor time. The concern is rather with labor cost. Once the cognitive aspects of the job are located in a separate management class, or better yet in a process that, once designed, requires no ongoing judgment or deliberation, skilled workers can be replaced with unskilled workers at a lower rate of pay.
The “jobs of the future” rhetoric surrounding the eagerness to end shop class and get every warm body into college, thence into a cubicle, implicitly assumes that we are heading to a “post-industrial” economy in which everyone will deal only in abstractions. Yet trafficking in abstractions is not the same as thinking. White-collar professions, too, are subject to routinization and degradation, proceeding by the same process as befell manual fabrication a hundred years ago: the cognitive elements of the job are appropriated from professionals, instantiated in a system or process, and then handed back to a new class of workers — clerks — who replace the professionals. If genuine knowledge work is not growing but actually shrinking, because it is coming to be concentrated in an ever-smaller elite, this has implications for the vocational advice that students ought to receive.
The trades are then a natural home for anyone who would live by his own powers, free not only of deadening abstraction, but also of the insidious hopes and rising insecurities that seem to be endemic in our current economic life. This is the stoic ideal.
·thenewatlantis.com·
My Last Five Years of Work
Copywriting, tax preparation, customer service, and many other tasks are or will soon be heavily automated. I can see the beginnings in areas like software development and contract law. Generally, tasks that involve reading, analyzing, and synthesizing information, and then generating content based on it, seem ripe for replacement by language models.
Anyone who makes a living through delicate and varied movements guided by situation-specific know-how can expect to work for much longer than five more years. Thus, electricians, gardeners, plumbers, jewelry makers, hair stylists, as well as those who repair ironwork or make stained glass, might find their handiwork contributing to our society for many more years to come.
Finally, I expect there to be jobs where humans are preferred to AIs even if the AIs can do the job equally well, or perhaps even if they can do it better. This will apply to jobs where something is gained from the very fact that a human is doing it—likely because it involves the consumer feeling like they have a relationship with the human worker as a human. Jobs that might fall into this category include counselors, doulas, caretakers for the elderly, babysitters, preschool teachers, priests and religious leaders, even sex workers—much has been made of AI girlfriends, but I still expect that a large percentage of buyers of in-person sexual services will have a strong preference for humans. Some have called these jobs “nostalgic jobs.”
It does seem that, overall, unemployment makes people sadder, sicker, and more anxious. But it isn’t clear if this is an inherent fact of unemployment, or a contingent one. It is difficult to isolate the pure psychological effects of being unemployed, because at present these are confounded with the financial effects—if you lose your job, you have less money—which produce stress that would not exist in the context of, say, universal basic income. It is also confounded with the “shame” aspect of being fired or laid off—of not working when you really feel you should be working—as opposed to the context where essentially all workers have been displaced.
One study that gets around the “shame” confounder of unemployment is “A Forced Vacation? The Stress of Being Temporarily Laid Off During a Pandemic” by Scott Schieman, Quan Mai, and Ryu Won Kang. This study looked at Canadian workers who were temporarily laid off several months into the COVID-19 pandemic. They first assumed that such a disruption would increase psychological distress, but instead found that the self-reported wellbeing was more in line with the “forced vacation hypothesis,” suggesting that temporarily laid-off workers might initially experience lower distress due to the unique circumstances of the pandemic.
By May 2020, the distress gap observed in April had vanished, indicating that being temporarily laid off was not associated with higher distress during these months. The interviews revealed that many workers viewed being left without work as a “forced vacation,” appreciating the break from work-related stress and valuing the time for self-care and family. The widespread nature of layoffs normalized the experience, reducing personal blame and fostering a sense of shared experience. Financial strain was mitigated by government support, personal savings, and reduced spending, which buffered against potential distress.
The study suggests that the context and available support systems can significantly alter the psychological outcomes of unemployment—which seems promising for AGI-induced unemployment.
From the studies on plant closures and pandemic layoffs, it seems that shame plays a role in making people unhappy after unemployment, which implies that they might be happier in full automation-induced unemployment, since it would be near-universal and not signify any personal failing.
A final piece that reveals a societal-psychological aspect to how much work is deemed necessary is that the amount has changed over time! The number of hours that people have worked has declined over the past 150 years. Work hours tend to decline as a country gets richer. It seems odd to assume that the current accepted amount of work of roughly 40 hours a week is the optimal amount. The 8-hour work day, weekends, time off—hard-fought and won by the labor movement!—seem to have been triumphs for human health and well-being. Why should we assume that stopping here is right? Why should we assume that less work was better in the past, but less work now would be worse?
Removing the shame that accompanies unemployment by removing the sense that one ought to be working seems one way to make people happier during unemployment. Another is what they do with their free time. Regardless of how one enters unemployment, one still confronts empty and often unstructured time.
One paper, titled “Having Too Little or Too Much Time Is Linked to Lower Subjective Well-Being,” by Marissa A. Sharif, Cassie Mogilner, and Hal E. Hershfield, tried to explore whether it was possible to have “too much” leisure time.
The paper concluded that it is possible to have too little discretionary time, but also possible to have too much, and that moderate amounts of discretionary time seemed best for subjective well-being. More time could be better, or at least not meaningfully worse, provided it was spent on “social” or “productive” leisure activities. This suggests that how people fare psychologically with their post-AGI unemployment will depend heavily on how they use their time, not how much of it there is.
Automation-induced unemployment could feel like retiring depending on how total it is. If essentially no one is working, and no one feels like they should be working, it might be more akin to retirement, in that it would lack the shameful element of feeling set apart from one’s peers.
Women provide another view on whether formal work is good for happiness. Women are, for the most part, relatively recent entrants to the formal labor market. In the U.S., 18% of women were in the formal labor force in 1890. In 2016, 57% were. Has labor force participation made them happier? By some accounts: no. A paper that looked at subjective well-being for U.S. women from the General Social Survey between the 1970s and 2000s—a time when labor force participation was climbing—found both relative and absolute declines in female happiness.
I think women’s work and AI is a relatively optimistic story. Women have been able to automate unpleasant tasks via technological advances, while the more meaningful aspects of their work seem less likely to be automated away.  When not participating in the formal labor market, women overwhelmingly fill their time with childcare and housework. The time needed to do housework has declined over time due to tools like washing machines, dryers, and dishwashers. These tools might serve as early analogous examples of the future effects of AI: reducing unwanted and burdensome work to free up time for other tasks deemed more necessary or enjoyable.
It seems less likely that AIs will so thoroughly automate childcare and child-rearing, because this “work” is so much more about the relationship between the parties involved. Like therapy, childcare and teaching seem likely to be among the forms of work where a preference for a human worker will persist the longest.
In the early modern era, landed gentry and similar were essentially unemployed. Perhaps they did some minor administration of their tenants, some dabbled in politics or were dragged into military projects, but compared to most formal workers they seem to have worked relatively few hours. They filled the remainder of their time with intricate social rituals like balls and parties, hobbies like hunting, studying literature, and philosophy, producing and consuming art, writing letters, and spending time with friends and family. We don’t have much real well-being survey data from this group, but, hedonically, they seem to have been fine. Perhaps they suffered from some ennui, but if we were informed that the great mass of humanity was going to enter their position, I don’t think people would be particularly worried.
I sometimes wonder if there is some implicit classism in people’s worries about unemployment: the rich will know how to use their time well, but the poor will need to be kept busy.
Although a trained therapist might be able to counsel my friends or family through their troubles better, I still do it, because there is value in me being the one to do so. We can think of this as the relational reason for doing something others can do better. I write because sometimes I enjoy it, and sometimes I think it betters me. I know others do so better, but I don’t care—at least not all the time. The reasons for this are part hedonic and part virtue or morality.  A renowned AI researcher once told me that he is practicing for post-AGI by taking up activities that he is not particularly good at: jiu-jitsu, surfing, and so on, and savoring the doing even without excellence. This is how we can prepare for our future where we will have to do things from joy rather than need, where we will no longer be the best at them, but will still have to choose how to fill our days.
·palladiummag.com·
My Last Five Years of Work
How McKinsey Destroyed the Middle Class - The Atlantic
How McKinsey Destroyed the Middle Class - The Atlantic

The rise of management consulting firms like McKinsey played a pivotal role in disempowering the American middle class. By promoting corporate restructuring, these firms concentrated power and wealth in the hands of elite managers while stripping middle managers and workers of their decision-making roles, their job security, and their opportunities for career advancement.

Key topics:

  • Management consulting's role in reshaping corporate America
  • The decline of the middle class and the rise of corporate elitism
  • McKinsey's influence on corporate restructuring and inequality
  • The shift from lifetime employment to precarious jobs
  • The erosion of corporate social responsibility
  • The role of management consulting in perpetuating economic inequality
what consequences has the rise of management consulting had for the organization of American business and the lives of American workers? The answers to these questions put management consultants at the epicenter of economic inequality and the destruction of the American middle class.
Managers do not produce goods or deliver services. Instead, they plan what goods and services a company will provide, and they coordinate the production workers who make the output. Because complex goods and services require much planning and coordination, management (even though it is only indirectly productive) adds a great deal of value. And managers as a class capture much of this value as pay. This makes the question of who gets to be a manager extremely consequential.
In the middle of the last century, management saturated American corporations. Every worker, from the CEO down to production personnel, served partly as a manager, participating in planning and coordination along an unbroken continuum in which each job closely resembled its nearest neighbor.
Even production workers became, on account of lifetime employment and workplace training, functionally the lowest-level managers. They were charged with planning and coordinating the development of their own skills to serve the long-run interests of their employers.
At McDonald’s, Ed Rensi worked his way up from flipping burgers in the 1960s to become CEO. More broadly, a 1952 report by Fortune magazine found that two-thirds of senior executives had more than 20 years’ service at their current companies.
Top executives enjoyed commensurately less control and captured lower incomes. This democratic approach to management compressed the distribution of income and status. In fact, a mid-century study of General Motors published in the Harvard Business Review—completed, in a portent of what was to come, by McKinsey’s Arch Patton—found that from 1939 to 1950, hourly workers’ wages rose roughly three times faster than elite executives’ pay. The management function’s wide diffusion throughout the workforce substantially built the mid-century middle class.
The earliest consultants were engineers who advised factory owners on measuring and improving efficiency at the complex factories required for industrial production. The then-leading firm, Booz Allen, did not achieve annual revenues of $2 million until after the Second World War. McKinsey, which didn’t hire its first Harvard M.B.A. until 1953, retained a diffident and traditional ethos
A new ideal of shareholder primacy, powerfully championed by Milton Friedman in a 1970 New York Times Magazine article entitled “The Social Responsibility of Business is to Increase its Profits,” gave the newly ambitious management consultants a guiding purpose. According to this ideal, in language eventually adopted by the Business Roundtable, “the paramount duty of management and of boards of directors is to the corporation’s stockholders.” During the 1970s, and accelerating into the ’80s and ’90s, the upgraded management consultants pursued this duty by expressly and relentlessly taking aim at the middle managers who had dominated mid-century firms, and whose wages weighed down the bottom line.
Management consultants thus implemented and rationalized a transformation in the American corporation. Companies that had long affirmed express “no layoff” policies now took aim at what the corporate raider Carl Icahn, writing in The New York Times in the late 1980s, called “corporate bureaucracies” run by “incompetent” and “inbred” middle managers. They downsized in response not to particular business problems but rather to a new managerial ethos and methods; they downsized when profitable as well as when struggling, and during booms as well as busts.
Downsizing was indeed wrenching. When IBM abandoned lifetime employment in the 1990s, local officials asked gun-shop owners around its headquarters to close their stores while employees absorbed the shock.
In some cases, downsized employees have been hired back as subcontractors, with no long-term claim on the companies and no role in running them. When IBM laid off masses of workers in the 1990s, for example, it hired back one in five as consultants. Other corporations were built from scratch on a subcontracting model. The clothing brand United Colors of Benetton has only 1,500 employees but uses 25,000 workers through subcontractors.
Shift from lifetime employment to reliance on outsourced labor; decline in unions
The shift from permanent to precarious jobs continues apace. Buttigieg’s work at McKinsey included an engagement for Blue Cross Blue Shield of Michigan, during a period when it considered cutting up to 1,000 jobs (or 10 percent of its workforce). And the gig economy is just a high-tech generalization of the sub-contractor model. Uber is a more extreme Benetton; it deprives drivers of any role in planning and coordination, and it has literally no corporate hierarchy through which drivers can rise up to join management.
In effect, management consulting is a tool that allows corporations to replace lifetime employees with short-term, part-time, and even subcontracted workers, hired under ever more tightly controlled arrangements, who sell particular skills and even specified outputs, and who manage nothing at all.
the managerial control stripped from middle managers and production workers has been concentrated in a narrow cadre of executives who monopolize planning and coordination. Mid-century, democratic management empowered ordinary workers and disempowered elite executives, so that a bad CEO could do little to harm a company and a good one little to help it.
Whereas at mid-century a typical large-company CEO made 20 times a production worker’s income, today’s CEOs make nearly 300 times as much. In a recent year, the five highest-paid employees at each S&P 1500 firm (7,500 elite executives overall) obtained income equal to about 10 percent of the total profits of the entire S&P 1500.
as Kiechel put it dryly, “we are not all in this together; some pigs are smarter than other pigs and deserve more money.” Consultants seek, in this way, to legitimate both the job cuts and the explosion of elite pay. Properly understood, the corporate reorganizations were, then, not merely technocratic but ideological.
corporate reorganizations have deprived companies of an internal supply of managerial workers. When restructurings eradicated workplace training and purged the middle rungs of the corporate ladder, they also forced companies to look beyond their walls for managerial talent—to elite colleges, business schools, and (of course) to management-consulting firms. That is to say: The administrative techniques that management consultants invented created a huge demand for precisely the services that the consultants supply.
Consulting, like law school, is an all-purpose status giver—“low in risk and high in reward,” according to the Harvard Crimson. McKinsey also hopes that its meritocratic excellence will legitimate its activities in the eyes of the broader world. Management consulting, Kiechel observed, acquired its power and authority not from “silver-haired industry experience but rather from the brilliance of its ideas and the obvious candlepower of the people explaining them, even if those people were twenty-eight years old.”
A deeper objection to Buttigieg’s association with McKinsey concerns not whom the firm represents but the central role the consulting revolution has played in fueling the enormous economic inequalities that now threaten to turn the United States into a caste society.
Meritocrats like Buttigieg changed not just corporate strategies but also corporate values.
GM may aspire to build good cars; IBM, to make typewriters, computers, and other business machines; and AT&T, to improve communications. Executives who rose up through these companies, on the mid-century model, were embedded in their firms and embraced these values, so that they might even have come to view profits as a salutary side effect of running their businesses well.
When management consulting untethered executives from particular industries or firms and tied them instead to management in general, it also led them to embrace the one thing common to all corporations: making money for shareholders. Executives raised on the new, untethered model of management aim exclusively and directly at profit: their education, their career arc, and their professional role conspire to isolate them from other workers and train them single-mindedly on the bottom line.
American democracy, the left believes, cannot be rejuvenated by persuading elites to deploy their excessive power somehow more benevolently. Instead, it requires breaking the stranglehold that elites have on our economics and politics, and reempowering everyone else.
·archive.is·
How McKinsey Destroyed the Middle Class - The Atlantic
Monopoly by the Numbers — Open Markets Institute
Monopoly by the Numbers — Open Markets Institute
Antitrust laws have not been effectively enforced or applied to specific market realities.
A generation ago, small, independent operations defined the entire industry. Today, the businesses of beef, pork, and poultry slaughter are all dominated by four giants at the national level. But that greatly understates the problem, as in many regions, a single corporation holds a complete monopoly. Two firms, Dean Foods and the Dairy Farmers of America, control as much as 80-90 percent of the milk supply chain in some states and wield substantial influence across the entire industry. As our Food & Power website details, the story is much the same in food-processing, egg production, grain production, and produce farming.
Monopolists have captured control over many lines of manufacturing as well. Corning, an American glass manufacturer, sells 60 percent of all the glass used in LCD screens, and Owens Illinois holds a near monopoly over the market for glass bottles in the US. Rexam, a British company, holds a dominant position over the international supply of bottle caps and pharmaceutical bottles.
Hospital corporations across America have also been buying up physician practices in recent years. Hospital ownership of physician practices more than doubled between 2004 and 2011, from 24 to 49 percent. In drug stores, meanwhile, the pending takeover of Rite Aid by Walgreen’s would reduce the market to two giants, along with CVS.
Pharmaceutical companies have been merging at a record pace in recent years, and drug makers often use their concentrated market power to raise the prices of generic drugs, such as Digoxin, Daraprim, Naloxone, and standard vaccines.
Whirlpool’s takeover of Maytag in 2006 gave it control of 50 to 80 percent of U.S. sales of washing machines, dryers, and dishwashers and a very strong position in refrigerators. Whirlpool also controls the Jenn-Air, Amana, Magic Chef, Admiral, and KitchenAid brands and holds a dominant position over the supply of Sears Kenmore products.
The FTC successfully blocked a proposed merger of Staples and Office Depot, but the market is still highly concentrated after Office Depot’s 2013 acquisition of Office Max. Collectively, the two firms control 69 percent of the entire office supplies market.
China’s vitamin cartel controls 100 percent of the market for U.S. Vitamin C, which is also known as ascorbic acid and which is used in almost all preserved foods.
·openmarketsinstitute.org·
Monopoly by the Numbers — Open Markets Institute
Landlord Forced To Raise Rent Due To Thinking Of Bigger Number
Landlord Forced To Raise Rent Due To Thinking Of Bigger Number
“You can re-sign your lease, but I have to raise it by $250 a month because I realized there was a bigger number your rent could be,” Turley informed his tenant of six years, attributing the rent increase to a need to keep up with the rising numbers he could envision. “Look, you’re a good tenant and all, but I thought of a bigger number and that puts me in a bind. Did you know that $2,850 is larger than $2,600? Most tenants don’t concern themselves with things like that, but as a landlord, I always have to be adjusting my leasing requirements to match the larger amounts that happen to come to mind.”
·theonion.com·
Landlord Forced To Raise Rent Due To Thinking Of Bigger Number
Entrepreneurs Aren’t a Special Breed – They’re Mostly Rich Kids | Hacker News
Entrepreneurs Aren’t a Special Breed – They’re Mostly Rich Kids | Hacker News
Entrepreneurship is like one of those carnival games where you throw darts or something. Middle class kids can afford one throw. Most miss. A few hit the target and get a small prize. A very few hit the center bullseye and get a bigger prize. Rags to riches! The American Dream lives on. Rich kids can afford many throws. If they want to, they can try over and over and over again until they hit something and feel good about themselves. Some keep going until they hit the center bullseye, then they give speeches or write blog posts about "meritocracy" and the salutary effects of hard work. Poor kids aren't visiting the carnival. They're the ones working it.
·news.ycombinator.com·
Entrepreneurs Aren’t a Special Breed – They’re Mostly Rich Kids | Hacker News
The Umami Theory of Value
The Umami Theory of Value
a global pandemic struck, markets crashed, and the possibility of a democratic socialist presidency in America started to fade. Much of our work with clients has been about how to address new audiences in a time of massive fragmentation and the collapse of consensus reality.
All the while, people have been eager to launch new products more focused on impressions than materiality, and “spending on experiences” has become the standard of premium consumption.
it’s time to reassess the consumer experience that came along with the neoliberal fantasy of “unlimited” movement of people, goods and ideas around the globe.
Umami, as both a quality and effect of an experience, popped up primarily in settings that were on the verge of disintegration, and hinged on physical pilgrimages to evanescent meccas. We also believe that the experience economy is dying, its key commodity (umami) has changed status, and nobody knows what’s coming next.
Umami was the quality of the media mix or the moodboard that granted it cohesion-despite-heterogeneity. Umami was also the proximity of people on Emily’s museum panel, all women who are mostly not old, mostly not straight, and mostly doing something interesting in the arts, but we didn’t know exactly what. It was the conversation-dance experience and the poet’s play and the alt-electronica-diva’s first foray into another discipline. It was the X-factor that made a certain MA-1 worth 100x as much as its identical twin.
“Advanced consumers” became obsessed with umami and then ran around trying to collect ever-more-intensifying experiences of it. Things were getting more and more delicious, more and more expensive, and all the while, more and more immaterial. Umami is what you got when you didn’t get anything.
What was actually happening was the enrichment of financial assets over the creation of any ‘real wealth’ along with corresponding illusions of progress. As very little of this newly minted money has been invested into building new productive capacity, infrastructure, or actually new things, money has just been sloshing around in a frothy cesspool – from WeWork to Juicero to ill-advised real estate Ponzi to DTC insanity, creating a global everything-bubble.
Value, in an economic sense, is theoretically created by new things based on new ideas. But when the material basis for these new things is missing or actively deteriorating and profits must be made, what is there to be done? Retreat to the immaterial and work with what already exists: meaning. Meaning is always readily available to be repeated, remixed, and/or cannibalized in service of creating the sensation of the new.
The essential mechanics are simple: it’s stating there’s a there-there when there isn’t one. And directing attention to a new “there” before anyone notices they were staring at a void. It’s the logic of gentrification, not only of the city, but also the self, culture and civilization itself. What’s made us so gullible, and this whole process possible, was an inexhaustible appetite for umami.
Beyond its synergistic effect, umami has a few other sensory effects that are relevant to our theory. For one, it creates the sense of thickness and body in food. (“Umami adds body…If you add it to a soup, it makes the soup seem like it’s thicker – it gives it sensory heft. It turns a soup from salt water into a food.”) For another, it’s released when foods break down into parts. (“When organic matter breaks down, the glutamate molecule breaks apart. This can happen on a stove when you cook meat, over time when you age a parmesan cheese, by fermentation as in soy sauce or under the sun as a tomato ripens. When glutamate becomes L-glutamate, that’s when things get “delicious.””) These three qualities: SYNERGY, IMPRESSION OF THICKNESS, and PARTS > WHOLE, are common to cultural umami, as well.
Umami hunting was a way for the West to consume an exotic, ethnic, global “taste” that was also invisible and up to their decoding / articulation.
when something is correctly salted, Chang argues, it tastes both over and undersalted at once. As a strange loop, this saltiness makes you stand back and regard your food; you start thinking about “the system it represents and your response to it”. He argues that this meta-regard keeps you in the moment and connected to the deliciousness of your food. We counter that it intensifies a moment in a flow, temporarily thickening your experience without keeping you anywhere for long.
strong flavors, namely umami, mark a surge of intensity in the flow of experience. It also becomes clear that paradox itself is at the heart of contemporary consumption. For example: “This shouldn’t be good but it is” “This doesn’t seem like what it’s supposed to be” “This is both too much and not enough” “I shouldn’t be here but i am” “This could be anywhere but it’s here”
Parts > Whole is just another way of saying a combination of things has emergent properties. In itself this doesn’t mean much, as almost any combination of things has emergent properties, especially in the domains of taste and culture. Coffee + vinegar is worse than its constitutive parts. A suit + sneakers is a greater kind of corny than either worn separately. Most emergence is trivial. The Umami Theory of Value centers on losing your sense of what’s trivial and what’s valuable.
If you tried to unpack your intuition, the absence of the there-there would quickly become evident. Yet in practice this didn’t matter, because few people were able to reach this kind of deep self-interrogation. The cycle was simply too fast. There was never time for these concoctions to congeal into actual new things (e.g. create the general category of K-Pop patrons for Central European arts institutions). We can’t be sure if they ever meant anything beyond seeming yummy at the time.
This was not meant to be a nihilistic, Gen-X faceplant (“nothing means anything any more”), since we think that perspective can paper over the nuances of consumer experience, business realities, and cultural crisis. Instead, we wanted to link macroeconomic and macrotrend observations to everyday experience, especially in the context of burgeoning collapse.
·nemesis.global·
The Umami Theory of Value
Pick a Practical Major, Like French
Pick a Practical Major, Like French
Mandarin and other Chinese languages and dialects have been considered serious, practical majors for some time because of the potential professional value of speaking them in China. But why would the ability to speak French in Francophone Africa be less valuable, unless you think Africa will never produce economic muscle to match its population?
We have a prevalent concept of the “practical college major” in our society, but that concept is vague, not buttressed with evidence, and shifts according to whim and prejudice. And the ultimate point of stressing the practicality of certain majors while denigrating the frivolity of others is to blame people for economic conditions they can’t control.
In the 2000s and 2010s, dozens of new schools of pharmacy were opened thanks to the perception that pharmacy was a safe field for young graduates. Thousands of newly minted pharmacists flooded the market. Somehow, administrators in higher education were surprised to find that these new graduates had a harder time finding a good job than previous generations. But this is an inevitable outcome of telling young people an academic field is a practical choice, since you’re making that field more attractive and thus increasing the competition they have to face in the labor market.
programming, like all skills, is subject to the simple constraints of supply and demand, and thus the practicality of studying the major is a moving target.
I have never — never — found a consistent and coherent definition of a “practical major,” anywhere. The meaning of the term floats around depending on the whims of the person using it, and those whims are usually dependent on mockery. The entire concept seems to exist simply to serve as an instrument to blame people for their own economic misfortune.
Some will say that a practical major is one that gives you the best opportunity for secure employment. Setting aside the fact that life spent in singular pursuit of money is soul-deadening, this strikes me as great advice for people in late adolescence who are in possession of a time machine. For the rest of us, perhaps we should build a society where the educational path chosen early in life is less consequential for lifetime economic security, and where we’re all more free to study what we actually care about.
Technology can change the economy faster than any person can reasonably be expected to keep up with. Nobody knows for sure which fields might be disrupted by AI, which skills rendered unmarketable. But if the effects are as big as some predict, a lot of people are suddenly going to find their once-practical path has become fraught and unsustainable. The question is, are we callous enough to blame them for it?
·nymag.com·
Pick a Practical Major, Like French
Max Pain (A Recent History)
Max Pain (A Recent History)
In The Umami Theory of Value, the authors discussed how entities create illusory value without improving material conditions. In 2020, they predicted a repulsive turn and a violent recoupling of value and material reality. However, the surreal crescendo of decoupling between value and reality that followed, which peaked in late 2021, saw incredible returns on random things and mainstreaming of risk. This period, which the authors call Clown Town, saw people taking risks they barely believed in and mistaking risk for opportunity. The authors then discuss the current era, Max Pain, in which everyone's opinion is right at some point, but never at the right time, and those who control the flows of information and capital are able to systematically profit while regular people struggle.
Money became increasingly fake-seeming as it diverged more and more from a hard day’s work and most conventional wisdom.
The growing number of people taking chances that they barely believed in (starting an Onlyfans, going all in on a memecoin, becoming a performative racist for clicks) reflected a rational response to seeing absurd and/or conventionally shitty ideas have outsized success (Bored Apes, Trump, the Babyccino).
bucking conventional wisdom in any direction became the order of the day. Contrarianism became incredibly popular. Taking the diametrically opposed position to consensus as a shortcut to standing out in a crowded and volatile field was a key Clown Town strategy.
As a subset of contrarianism, Hot Sauce Behavior became especially popular. Hot Sauce involves taking something basic or mid and applying a socially forbidden or mysterious spice to it (in place of, or to function as, the X factor or the je ne sais quoi). This element had to be shocking, bad, atavistic, or otherwise “not normal”—it could be Nazism, grooming, the Occult, Catholicism, outright aggression, the threat of violence, or the attitudes of obscure-to-you political groups—but in smallish amounts. It made peoples’ hearts race and adrenaline pump while they consumed something otherwise bland. (This was the Tension Economy as the new Attention Economy.)
If the 2020 degen was a gambler willing to go all in on a whim… …the 2023 degen is a sophisticated risk manager. We have found ourselves in a new cultural era in which multiple overlapping crises and rising interest rates have led to an emergent reckoning. It is now widely understood that it was very stupid to play crazy games with tons of excess money instead of actually improving material reality. But certain questions remain: What the fuck is anything worth today? What’s the best way to manage risk while it all comes falling down?
In chess, today’s average player is more skilled than the one from yesteryear because online exposure of advanced theory has led to regular players making the moves of masters. As Virgil once said, “One kid does a new skateboard trick, then hundreds more can do it the next day around the world.”
Everyone should be able to use their increased intelligence and awareness to better navigate the world. In reality, the irony is painful: When everyone gets smarter, things get harder. If everyone is reassessing the most-effective-tactics-available all the time, it gets harder and harder to win, even though you’re smarter and “should be in a better position.” The Yale admissions office realizes thousands of applicants have watched the same obscure how-to-get-into-Yale TikTok, and decides to change the meta: Leadership is no longer a valuable quality.
Max Pain means, even when you’re right, you’re wrong; it describes a climate in which everyone’s opinion is right at some point, but never at the right time.
·nemesis.global·
Max Pain (A Recent History)
Dirt: Coping with things
Dirt: Coping with things
Coping with things is the prevailing mood in my corner of the universe. As I write this, America has just completed an election in which many people voted primarily for the idea of voting. The prevailing candidate? Less an individual than an avatar of civility and liberalism.
We are a country founded on an idea and not an identity.
Americans have a way of obscuring reality through grand symbolism and none of the accompanying semiotic rigor. As if the facade of democracy can be upheld by not looking too closely at increasingly undemocratic outcomes — our high tolerance for multiculturalism tenuously predicated on everyone struggling equally. The difference between idea and identity is both our saving grace and our downfall. Democracy: watch the gap.
The idea of the American individual, part of the national optimism that fueled the Space Race, is far less prominent than the citizen-consumer. Attaining a degree of celebrity, still a coveted means to financial stability, thrusts one into the category of “celebrity,” where image overtakes personhood.
Lifestyle, like work, is something we can only see in aggregate. Technological gains don’t relieve the pressure for ownership; they merely reinforce it.
·dirt.substack.com·
Can a universal basic income help address homelessness? | Hacker News Discussion
The number one thing UBI doesn't handle well is rent inflation. You hand out $1,000 per person monthly; expect rents to go up by about $1,000 monthly as landlords realise there is all this extra disposable income in people's hands right now.

However, this is just an exaggerated version of something already happening: monopolies sucking all aggregate disposable income out of the economy. Monopolies by definition face no downward price pressure, so they always expand prices to capture anything extra people might have. Since landlording is the biggest aggregate monopoly in the world, landlords capture any disposable workers' income. No matter if they get a raise from their boss, the landlord takes it away.
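The rent-capture claim above can be put in arithmetic form. A minimal sketch, assuming a stylized landlord with full monopoly pricing power (all numbers and the `capture_rate` parameter are hypothetical):

```python
# Toy model of the rent-capture argument: if a landlord faces no
# competitive pressure and can observe disposable income, they
# reprice to absorb any cash transfer. Purely illustrative numbers.

def equilibrium_rent(income, essentials, capture_rate=1.0):
    """Rent a monopolist landlord charges: a share of whatever
    income remains after non-housing essentials."""
    disposable = income - essentials
    return capture_rate * disposable

income_before = 3000   # monthly income
essentials = 1200      # food, transport, etc.
ubi = 1000

rent_before = equilibrium_rent(income_before, essentials)
rent_after = equilibrium_rent(income_before + ubi, essentials)

print(rent_before)               # 1800
print(rent_after)                # 2800
print(rent_after - rent_before)  # 1000: the transfer is fully captured
```

With `capture_rate=1.0` the rent increase equals the transfer exactly; any `capture_rate` below 1 (some competitive pressure) lets tenants keep part of it, which is the commenter's point about monopoly being the operative condition.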
One of the biggest strengths of UBI is that it eliminates the bureaucracy and waste associated with determining who "deserves" assistance. The dominant model in the US is expecting homeless people with drug problems to solve both their addiction and homelessness at the same time by themselves before they are deemed worthy of being helped, which needless to say is barely assistance at all. Having a guaranteed income stream would make it easier to gain a foothold.
·news.ycombinator.com·
The Single Most Important Thing to Know About Financial Aid: It’s a Sham
The whole public-facing system of college admissions—in which admissions decisions are based on rigorous academic standards and financial aid is supposedly provided to those who are most academically and financially deserving—is an elaborate stage play meant to flatter privileged families and the reputations of colleges themselves. The real system, hidden behind the scenery, is much closer to the mechanics of pure capitalism, driven by an industry of for-profit consultants and relentlessly focused on the institutional bottom line.
A spokesman from Clark University, which tried to entice Ethan with a “$68,000 Robert Goddard Achievement Scholarship,” told me that the school “does not rely on an enrollment management consultant.” Instead, they said, it “occasionally” hires “outside analytical support” that does “not tell us how much aid to offer any student or group of students” but does “crunch large volumes of data in a timely manner that we then use to assess our progress toward our enrollment goals and estimate/project our total aid expenditure through that enrollment cycle.”
So, not an enrollment management consultant. Just, you know, a consultant that helps them manage enrollment.
As DiFeliciantonio wrote: “Wealthy families are more able and less willing to pay for college while the poorer families are more willing and less able.” In other words, parents of means who themselves have finished college are often sophisticated consumers of higher education and are able to drive a hard bargain, whereas lower-income, less-educated parents feel an enormous obligation to help their children move farther up the socioeconomic ladder and blindly trust that colleges have their best financial interests at heart. So colleges obey the algorithm and offer more financial aid to the Ethans than to the Ashleys, one of many problems identified in a recent Brookings Institution report.
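The "algorithm" the article describes is ordinary price discrimination: discounting tracks a family's modeled price sensitivity, not their need. A minimal sketch of that logic, with every function name, weight, and threshold invented for illustration:

```python
# Hypothetical sketch of enrollment-management discounting as
# described above: "merit aid" scales with how hard a family is
# predicted to bargain, not with financial need. All weights are
# made up for illustration.

def price_sensitivity(parents_college_educated, competing_offers):
    # College-educated parents are modeled as sophisticated consumers
    # who shop around; first-generation families are modeled as
    # trusting the sticker price.
    s = 0.30 if parents_college_educated else 0.05
    s += 0.10 * competing_offers  # each rival offer raises leverage
    return min(s, 0.90)

def aid_offer(sticker_price, parents_college_educated, competing_offers):
    # Discount only as far as needed to close the sale.
    sens = price_sensitivity(parents_college_educated, competing_offers)
    return round(sticker_price * sens)

sticker = 79_070
# An "Ethan": affluent, college-educated parents, three rival offers.
print(aid_offer(sticker, True, 3))   # 47442
# An "Ashley": modest income, first-generation, no rival offers.
print(aid_offer(sticker, False, 0))
```

Note that family income never enters the function: under this model the family least able to pay can receive the smallest discount, which is exactly the inversion the article documents.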
Ashley submitted financial aid forms with information about her family’s modest income because everyone and everything about the process told her college aid is based on how much money you need, or deserve. She had no idea that information could be used against her. In May, New York University offered her admission if she would agree to delay enrollment until spring 2023—when, maybe not coincidentally, her good-but-not-stellar academic record would not count in the rankings data NYU submits to U.S. News & World Report. Their price? $79,070. Their aid offer? $0, take it or leave it, with 96 hours to respond.
As the countless individual stories that compose the nation’s $1.7 trillion student loan crisis show, many families make different choices. They are drawn in by a combination of optimism, blind faith, and familial obligation, and end up with debts they cannot repay. Colleges know this will happen.
Nobody is really judging your worthiness for financial aid. College is just another service with a price.
·slate.com·