CNet’s AI-Powered Search-Optimized Money Machine — Pixel Envy
Google vs. ChatGPT vs. Bing, Maybe — Pixel Envy
People are not interested in visiting websites about a topic; they, by and large, just want answers to their questions. Google has been strip-mining the web for years, leveraging its unique position as the world’s most popular website and its de facto directory to replace what made it great with what allows it to retain its dominance.
Artificial intelligence — or some simulation of it — really does make things better for searchers, and I bet it could reduce some tired search optimization tactics. But it comes at the cost of making us all into uncompensated producers for the benefit of trillion-dollar companies like Google and Microsoft.
Search optimization experts have spent years in an adversarial relationship with Google in an attempt to get their clients’ pages to the coveted first page of results, often through means which make results worse for searchers. Artificial intelligence is, it seems, a way out of this mess — but the compromise is that search engines get to take from everyone while giving nothing back. Google has been taking steps in this direction for years: its results page has been increasingly filled with ways of discouraging people from leaving its confines.
The Instrumentalist | Zadie Smith | The New York Review of Books
Whereas if you grew up online, the negative attributes of individual humans are immediately disqualifying. The very phrase ad hominem has been rendered obsolete, almost incomprehensible. An argument that is directed against a person, rather than the position they are maintaining? Online a person is the position they’re maintaining and vice versa. Opinions are identities and identities are opinions. Unfollow!
I’m the one severely triggered by statements like “Chaucer is misogynistic” or “Virginia Woolf was a racist.” Not because I can’t see that both statements are partially true, but because I am of that generation whose only real shibboleth was: “Is it interesting?” Into which broad category both evils and flaws could easily be fit, not because you agreed with them personally but because they had the potential to be analyzed, just like anything else.
We are by now used to apocalyptic bad guys with the end of the world in mind, but it’s a long time since I went to the movies and saw an accurate representation of an ordinary sinner.
Spotting a hot young cellist, Olga, in the bathroom of her workplace, Tár later recognizes this same young woman’s shoes, peeking out from beneath those screens orchestra directors use to preserve the anonymity of “blind auditions.” Next thing we know Tár has given Olga a seat in her orchestra. Then decides to add Elgar’s Cello Concerto to the program, and to give that prestigious solo to the new girl instead of the first cello. And this move, in turn, allows her to organize a series of one-on-one rehearsals with Olga at that apartment she maintains in the city… There’s a word for this behavior: instrumentalism. Using people as tools. As means rather than ends in themselves. To satisfy your own desire, or your sense of your own power, or simply because you can.
Every generation makes new rules. Every generation comes up against the persistent ethical failures of the human animal. But though there may be no permanent transformations in our emotional lives, there can be genuine reframings and new language and laws created to name and/or penalize the ways we tend to hurt each other, and this is a service each generation can perform for the one before.
Rebuilding Society on Meaning (Improved version) - YouTube
You Will Never Be A Full Stack Developer | Seldo.com
Every software framework you've ever used is in the abstraction game: it takes a general-purpose tool, picks a specific set of common use-cases, and puts up scaffolding and guard rails that make it easier to build those specific use cases by giving you less to do and fewer choices to think about.
The lines between these three are blurry. Popular abstractions become standardizations.
What the internet’s favourite Substack writers think will happen this…
Dirt: The indomitable human spirit
what stretches ahead is a banal ending that refuses to end: the slow violence of capitalism and climate change, a future of could-haves and should-haves void of capital-M meaning.
A core reason behind this genre’s popular success lies in the fact that it lacks the cloying, naïve quality so often associated with positivity. While this partially stems from the aesthetic and language employed—which, thanks to its poetic tenor, internet avant-garde style, and general high-low approach, reads as more online-experimental for those in-the-know and less cheesy iFunny reposts for Boomers—these would matter little if it weren’t for the honest realism that underpins this trend’s optimism
Unlike the deluded optimism espoused by politicians, technologists, and millionaires—which views progress as linear or believes that technology will save us from ourselves or thinks that watching celebrities sing will solve crises—this trend, like the pessimists it responds to, recognizes that there seems to be no turning back from the precipice.
The genre positions the exercise and resilience of the “Indomitable Human Spirit” at the scale of a single life at the center of its philosophical optimism—not our ability to save the future, but rather our willingness to try and endure with grace.
Arc is the best web browser to come out in the last decade
Mary Gaitskill Has Come Online
The Garden and the Stream: A Technopastoral
Connection, Creativity and Drama: Teen Life on Social Media in 2022
When asked how often they decide not to post on social media out of fear of it being used against them, older teen girls stand out. For example, half of 15- to 17-year-old girls say they often or sometimes decide not to post something on social media because they worry others might use it to embarrass them, compared with smaller shares of younger girls or boys.
While 9% of teens think social media has had a mostly negative effect on them personally, that share rises to 32% when the same question is framed about people their age.
this survey reveals that only a minority of teens say they have been civically active on social media in the past year via one of the three means asked about at the time of the survey. One-in-ten teens say they have encouraged others to take action on political or social issues that are important to them or have posted a picture to show their support for a political or social issue in the past 12 months.
larger shares of Democrats than Republicans say they have posted pictures or used hashtags to show support for a political or social issue in the past year. In total, Democratic teens are twice as likely as Republican teens to have engaged in any of these activities during this time (20% vs. 10%).
Not only do small shares of teens participate in these types of activities on social media, relatively few say these platforms play a critical role in how they interact with political and social issues.
18% of Democratic teens say social media is extremely or very important to them when it comes to exposing them to new points of view, compared with 8% of Republican teens.
Despite feeling a lack of control over their data being collected by social media companies, teens are largely unconcerned. A fifth of teens (20%) say they feel very or extremely concerned about the amount of their personal information social media companies might have. Still, a notable segment of teens – 44% – say they have little or no concern about how much these companies might know about them.
Why Do All Websites Look the Same?
Leaving Twitter's Walled Garden
The Perils of Audience Capture
While it may ostensibly appear to be a simple case of influencers making a business decision to create more of the content they believe audiences want, and then being incentivized by engagement numbers to remain in this niche forever, it's actually deeper than that. It involves the gradual and unwitting replacement of a person's identity with one custom-made for the audience.
Put simply, in order to be someone, we need someone to be someone for.
When influencers are analyzing audience feedback, they often find that their more outlandish behavior receives the most attention and approval, which leads them to recalibrate their personalities according to far more extreme social cues than those they'd receive in real life. In doing this they exaggerate the more idiosyncratic facets of their personalities, becoming crude caricatures of themselves.
As the caricature becomes more familiar than the person, both to the audience and to the influencer, it comes to be regarded by both as the only honest expression of the influencer, so that any deviation from it soon looks and feels inauthentic. At that point the persona has eclipsed the person, and the audience has captured the influencer.
he implied his firing was part of the conspiracy to silence the truth, and urged his loyal followers to subscribe to his Substack, as this was now his family’s only source of income. His new audience proved to be generous with both money and attention, and his need to meet their expectations seems to have spurred him, consciously or unconsciously, to double down on his more extreme views. Now almost everything he writes about, from Covid to Ukraine, he somehow ties to the shadowy New World Order.
Money really misaligns incentives
I wanted an audience, but I also knew that having the wrong audience would be worse than having no audience, because they'd constrain me with their expectations, forcing me to focus on one tiny niche of my worldview at the expense of everything else, until I became a parody of myself.
I ensured that my brand image—the person that my audience expects me to be—was in alignment with my ideal image—the person I want to be. So even though audience capture likely does affect me in some way, it only makes me more like the person I want to be.
Ideally.
This is the ultimate trapdoor in the hall of fame; to become a prisoner of one's own persona. The desire for recognition in an increasingly atomized world lures us to be who strangers wish us to be. And with personal development so arduous and lonely, there is ease and comfort in crowdsourcing your identity.
When social media controls the nuclear codes
David Foster Wallace once said: “The language of images … maybe not threatens, but completely changes actual lived life. When you consider that my grandparents, by the time they got married and kissed, I think they had probably seen maybe a hundred kisses. They’d seen people kiss a hundred times. My parents, who grew up with mainstream Hollywood cinema, had seen thousands of kisses by the time they ever kissed. Before I kissed anyone I had seen tens of thousands of kisses. I know that the first time I kissed much of my thought was, ‘Am I doing it right? Am I doing it according to how I’ve seen it?’”
A lot of the 80s and 90s critiques of postmodernity did have a point—our experience really is colored by media. Having seen a hundred movies about nuclear apocalypse, the entire time we’ll be looking over our shoulder for the camera, thinking: “Am I doing it right?”
I Didn’t Want It to Be True, but the Medium Really Is the Message
it’s the common rules that govern all creation and consumption across a medium that change people and society. Oral culture teaches us to think one way, written culture another. Television turned everything into entertainment, and social media taught us to think with the crowd.
There is a grammar and logic to the medium, enforced by internal culture and by ratings reports broken down by the quarter-hour. You can do better cable news or worse cable news, but you are always doing cable news.
“Don’t just look at the way things are being expressed; look at how the way things are expressed determines what’s actually expressible.” In other words, the medium blocks certain messages.
Television teaches us to expect that anything and everything should be entertaining. But not everything should be entertainment, and the expectation that it will be is a vast social and even ideological change.
Television, he writes, “serves us most ill when it co-opts serious modes of discourse — news, politics, science, education, commerce, religion — and turns them into entertainment packages.
The border between entertainment and everything else was blurring, and entertainers would be the only ones able to fulfill our expectations for politicians. He spends considerable time thinking, for instance, about the people who were viable politicians in a textual era and who would be locked out of politics because they couldn’t command the screen.
As a medium, Twitter nudges its users toward ideas that can survive without context, that can travel legibly in under 280 characters. It encourages a constant awareness of what everyone else is discussing. It makes the measure of conversational success not just how others react and respond but how much response there is. It, too, is a mold, and it has acted with particular force on some of our most powerful industries — media and politics and technology.
I’ve also learned that patterns of attention — what we choose to notice and what we do not — are how we render reality for ourselves, and thus have a direct bearing on what we feel is possible at any given time. These aspects, taken together, suggest to me the revolutionary potential of taking back our attention.
The Case for Digital Public Infrastructure
‘Silicon Values’
York points to a 1946 U.S. Supreme Court decision, Marsh v. Alabama, which held that private entities can become sufficiently large and public to require them to be subject to the same Constitutional constraints as government entities. Though York says this ruling has “not as of this writing been applied to the quasi-public spaces of the internet.”
even if YouTube were treated as an extension of government due to its size and required to retain every non-criminal video uploaded to its service, it would make as much of a political statement elsewhere, if not more. In France and Germany, it — like any other company — must comply with laws that require the removal of hate speech, laws which in the U.S. would be unconstitutional
Several European countries have banned Google Analytics because it is impossible for their citizens to be protected against surveillance by American intelligence agencies.
TikTok has downplayed the seriousness of its platform by framing it as an entertainment venue. As with other platforms, disinformation on TikTok spreads and multiplies. These factors may have an effect on how people vote. But the sudden alarm over yet-unproved allegations of algorithmic meddling in TikTok to boost Chinese interests is laughable to those of us who have been at the mercy of American-created algorithms despite living elsewhere. American state actors have also taken advantage of the popularity of social networks in ways not dissimilar from political adversaries.
what York notes is how aligned platforms are with the biases of upper-class white Americans; not coincidentally, the boards and executive teams of these companies are dominated by people matching that description.
It should not be so easy to point to similarities in egregious behaviour; corruption of legal processes should not be so common. I worry that regulators in China and the U.S. will spend so much time negotiating which of them gets to treat the internet as their domain while the rest of us get steamrolled by policies that maximize their self-preferencing.
to ensure a clear set of values projected into the world. One way to achieve that is to prefer protocols over platforms.
This links up with Ben Thompson’s idea about splitting Twitter into a protocol company and a social media company.
Yes, the country’s light-touch approach to regulation and generous support of its tech industry have brought the world many of its most popular products and services. But it should not be assumed that we must rely on these companies built in the context of middle- and upper-class America.
How to Put Out Democracy’s Dumpster Fire
Tocqueville reckoned that the true success of democracy in America rested not on the grand ideals expressed on public monuments or even in the language of the Constitution, but in these habits and practices.
To Thrive, Our Democracy Needs Digital Public Infrastructure
Facebook, Twitter and YouTube each took first steps to rein in the worst behavior on their platforms in the heat of the election, but none have confronted how their spaces were structured to become ideal venues for outrage and incitement.
The first step in the process is realizing that the problems we’re experiencing in digital life — how to gather strangers together in public in ways that make it so people generally behave themselves — aren’t new. They’re problems that physical communities have wrestled with for centuries. In physical communities, businesses play a critical role — but so do public libraries, schools, parks and roads. These spaces are often the groundwork that private industry builds itself around: Schools teach and train the next generation of workers; new public parks and plazas often spur private real estate development; businesses transport goods on publicly funded roads; and so on. Public spaces and private industry work symbiotically, if sometimes imperfectly.
These kinds of public spaces mostly don’t exist online. Twitter, Facebook, YouTube and Twitch each offer some aspects of these experiences. But ultimately, they’re all organized around the need for growth and revenue — incentives which are in tension with the critical community functions these institutions also serve, and with the heavy staffing models they require.
Recent peer-reviewed research from three professors at the University of Virginia demonstrates how dramatically the design of platforms can affect how people behave on them. In their study, in months where conservative-leaning users visited Facebook more, they saw much more ideological content than normal, whereas in months where they visited Reddit more they “read news that was 50 percent more moderate than what they typically read.” (This effect was smaller but similar for political liberals). Same people, different platforms, and dramatically different news diets as a result.
Wikipedia is probably the best-known example of this kind of institution — a nonprofit, mission-driven piece of digital infrastructure. The nonprofit Internet Archive, which bills itself as a free “digital library,” a repository of books, movies and music and over 500 billion archived webpages to create a living history of the internet, is another. But what we need are not just information services with a mission-driven agenda, but spaces where people can talk, share and relate without those relationships being distorted and shaped by profit-seeking incentive structures.
Users can post only once a day, every post is read by a moderating team, and if you’re too salty or run afoul of other norms, you’re encouraged to rewrite. This is terrible for short-term engagement — flame wars drive attention and use, after all — and as a business model, all those moderators are costly. But there’s a long-term payoff: two-thirds of Vermont households are on the Forum, and many Vermonters find it a valuable place for thoughtful public discussions.
In fact, public digital infrastructures might be the right place to start exploring how to reinvent governance and civil society more broadly.
If mission, design and governance are important ingredients, the final component is what might be called digital essential workers — professionals like librarians whose job is to manage, steward, and care for the people in these spaces. This care work is one of the pillars of successful physical communities, which has been abstracted away by the existing tech platforms.
The truth is that Facebook, Google and Twitter have displaced and sucked the revenue out of an entire ecosystem of local journalistic enterprises and other institutions that served some of these public functions.
How Facebook got addicted to spreading misinformation
Instagram, TikTok, and the Three Trends
In other words, when Kylie Jenner posts a petition demanding that Meta “Make Instagram Instagram again”, the honest answer is that changing Instagram is the most Instagram-like behavior possible.
The first trend is the shift towards ever more immersive mediums. Facebook, for example, started with text but exploded with the addition of photos. Instagram started with photos and expanded into video. Gaming was the first to make this progression, and is well into the 3D era. The next step is full immersion — virtual reality — and while the format has yet to penetrate the mainstream, this progression in mediums is perhaps the most obvious reason to be bullish about the possibility.
The second trend is the increase in artificial intelligence. I’m using the term colloquially to refer to the overall trend of computers getting smarter and more useful, even if those smarts are a function of simple algorithms, machine learning, or, perhaps someday, something approaching general intelligence.
The third trend is the change in interaction models from user-directed to computer-controlled. The first version of Facebook relied on users clicking on links to visit different profiles; the News Feed changed the interaction model to scrolling. Stories reduced that to tapping, and Reels/TikTok is about swiping. YouTube has gone further than anyone here: Autoplay simply plays the next video without any interaction required at all.
The Aesthetics of Apology - Why So Many Brands Are Getting it Wrong
in Instagram apologies, even when someone ostensibly confronts their ugliness, it’s hard to read the gesture as anything but an effort to publicly reclaim their image. But at least the Notes App Apology permitted us a semblance of sincerity, and suggested there might be a human being who typed the message—even if that human was an intern or assistant. There’s nothing sincere about a trickle-down excuse crafted to look pretty for Instagram grids, and the processed nature of Photoshopped Apologies implies the absence of the one thing all genuine apologies must possess: accountability straight from the person who committed the transgression.
Digital gardens | Chase McCoy
On the Social Media Ideology
Social networking is much more than just a dominant discourse. We need to go beyond text and images and include its software, interfaces, and networks that depend on a technical infrastructure consisting of offices and their consultants and cleaners, cables and data centers, working in close concert with the movements and habits of the connected billions. Academic internet studies circles have shifted their attention from utopian promises, impulses, and critiques to “mapping” the network’s impact. From digital humanities to data science we see a shift in network-oriented inquiry from Whether and Why, What and Who, to (merely) How. From a sociality of causes to a sociality of net effects. A new generation of humanistic researchers is lured into the “big data” trap, and kept busy capturing user behavior whilst producing seductive eye candy for an image-hungry audience (and vice versa).
We need to politicize the New Electricity, the privately owned utilities of our century, before they disappear into the background.
What remains particularly unexplained is the apparent paradox between the hyper-individualized subject and the herd mentality of the social.
Before we enter the social media sphere, everyone first fills out a profile and chooses a username and password in order to create an account. Minutes later, you’re part of the game and you start sharing, creating, playing, as if it has always been like that. The profile is the a priori part and the profiling and targeted advertising cannot operate without it. The platforms present themselves as self-evident. They just are—facilitating our feature-rich lives. Everyone that counts is there. It is through the gate of the profile that we become its subject.
We pull in updates, 24/7, in a real-time global economy of interdependencies, having been taught to read news feeds as interpersonal indicators of the planetary condition
Treating social media as ideology means observing how it binds together media, culture, and identity into an ever-growing cultural performance (and related “cultural studies”) of gender, lifestyle, fashion, brands, celebrity, and news from radio, television, magazines, and the web—all of this imbricated with the entrepreneurial values of venture capital and start-up culture, with their underside of declining livelihoods and growing inequality.
Software, or perhaps more precisely operating systems, offer us an imaginary relationship to our hardware: they do not represent transistors but rather desktops and recycling bins. Software produces users. Without an operating system (OS) there would be no access to hardware; without an OS, no actions, no practices, and thus no user. Each OS, through its advertisements, interpellates a “user”: calls it and offers it a name or image with which to identify.
We could say that social media performs the same function, and is even more powerful.
In the age of social media we seem to confess less what we think. It’s considered too risky, too private. We share what we do, and see, in a staged manner. Yes, we share judgments and opinions, but no thoughts. Our Self is too busy for that, always on the move, flexible, open, sporty, sexy, and always ready to connect and express.
Platforms are not stages; they bring together and synthesize (multimedia) data, yes, but what is lacking here is the (curatorial) element of human labor. That’s why there is no media in social media. The platforms operate because of their software, automated procedures, algorithms, and filters, not because of their large staff of editors and designers. Their lack of employees is what makes current debates in terms of racism, anti-Semitism, and jihadism so timely, as social media platforms are currently forced by politicians to employ editors who will have to do the all-too-human monitoring work (filtering out ancient ideologies that refuse to disappear).
One startup's quest to take on Chrome and reinvent the web browser
Miller is the CEO of a new startup called The Browser Company, and he wants to change the way people think about browsers altogether. He sees browsers as operating systems, and likes to wonder aloud what "iOS for the web" might look like. What if your browser could build you a personalized news feed because it knows the sites you go to? What if every web app felt like a native app, and the browser itself was just the app launcher? What if you could drag a file from one tab to another, and it just worked? What if the web browser was a shareable, synced, multiplayer experience?
Miller became convinced that the next big platform was right in front of his face: the open web. The underlying infrastructure worked, the apps were great, there were no tech giants in the way imposing rules and extracting huge commissions. The only thing missing was a tool to bring it all together in a user-friendly way, and make the web more than the sum of its parts.
Browser's team instead spent its time thinking about how to solve things like tab overload, that all-too-familiar feeling of not being able to find anything in a sea of tiny icons at the top of the screen. That's something Nate Parrott, a designer on the team, had been thinking about for a long time. "Before I met Josh," he said, "I had this fascination with browsers, because it's the window through which you experience so much of the web, and yet it feels like no one is working on web browsers." Outside of his day job at Snap, he was also building a web browser with some new interaction ideas. "A big one for me was that I wanted to get rid of the distinction between open and closed tabs," he said. "I wanted to encourage tab-hoarding behavior, where you can open as many tabs as you want and organize them so you're not constantly overwhelmed seeing them all at the same time."
One of Arc's most immediately noticeable features is that it combines bookmarks and tabs. Clicking an icon in the sidebar opens the app, just like on iOS or Android. When users navigate somewhere else, they don't have to close the tab; it just waits in the background until it's needed again, and Arc manages its background performance so it doesn't use too much memory. Instead of opening Gmail in a tab, users just … open Gmail.
Everyone at The Browser Company swears there's no Master Plan, or much of a roadmap. What they have is a lot of ideas, a base on which they can develop really quickly, and a deep affinity for prototypes. "You can't just think really hard and design the best web browser," Parrott said. "You have to feel it and put it in front of people and get them to react to it."
The Browser Company could become an R&D shop, full of interesting ideas but unable to build a browser that anyone actually uses. The company does have plenty of runway: it recently raised more than $13 million in funding, at a valuation of $100 million, from investors including Jeff Weiner, Eric Yuan, Patrick Collison, Fidji Simo and a number of other people with long experience building for the internet. Still, Agrawal said, "We're paranoid that we could end up in this world of just having a Bell Labs kind of situation, where you have a lot of interesting stuff, but it's not monetizable, it's not sticky, any of that." That's why they're religious about talking to users all the time, getting feedback on everything, making sure that the stuff they're building is genuinely useful. And when it's not, they pivot fast.
A clean start for the web
Building your own website is cool again, and it's changing the whole internet
On the Internet, We’re Always Famous - The New Yorker
I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways.
It seems distant now, but once upon a time the Internet was going to save us from the menace of TV. Since the late fifties, TV has had a special role, both as the country’s dominant medium, in audience and influence, and as a bête noire for a certain strain of American intellectuals, who view it as the root of all evil. In “Amusing Ourselves to Death,” from 1985, Neil Postman argues that, for its first hundred and fifty years, the U.S. was a culture of readers and writers, and that the print medium—in the form of pamphlets, broadsheets, newspapers, and written speeches and sermons—structured not only public discourse but also modes of thought and the institutions of democracy itself. According to Postman, TV destroyed all that, replacing our written culture with a culture of images that was, in a very literal sense, meaningless. “Americans no longer talk to each other, they entertain each other,” he writes. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”
Wikipedia Is the Last Best Place on the Internet | WIRED