Blessed and emoji-pilled: why language online is so absurd
AI: This article explores the evolution of online language and communication, highlighting the increasing absurdity and surrealism in digital discourse. It discusses how traditional language is being replaced by memes, emojis, and seemingly nonsensical phrases, reflecting the influence of social media platforms and algorithms on our communication styles. The piece examines the implications of this shift, touching on themes of information overload, AI-like speech patterns, and the potential consequences of this new form of digital dialect.
Layers upon layers of references are stacked together in a single post, while the posts themselves fly by faster than ever in our feeds. To someone who isn’t “chronically online” a few dislocated images or words may trigger a flash of recognition – a member of the royal family, a beloved cartoon character – but their relationship with each other is impossible to unpick. Add the absurdist language of online culture and the impenetrable algorithms that decide what we see in our feeds, and it seems like all hope is lost when it comes to making sense of the internet.
Forget words! Don’t think! In today’s digitally-mediated landscape, there’s no need for knowledge or understanding, just information. Scroll the feed and you’ll find countless video clips and posts advocating this smooth-brained agenda: lobotomy chic, sludge content, silly girl summer.
“With memes, images are converging more on the linguistic, becoming flattened into something more like symbols/hieroglyphs/words,” says writer Olivia Kan-Sperling, who specialises in programming language critique. For the meme-fluent, the form isn’t important, but rather the message it carries. “A meme is lower-resolution in terms of its aesthetic affordances than a normal pic because you barely have to look at it to know what it’s ‘doing’,” she expands. “For the literate, its full meaning unfolds at a glance.” To understand this way of “speaking writing posting” means we must embrace the malleability of language, the ambiguities and interpretations – and free it from ‘real-world’ rules.
Hey guys, I just got an order in from Sephora – here’s everything that I got. Get ready with me for a boat day in Miami. Come and spend the day with me – starting off with coffee. TikTok influencers engage in a high-pitched and breathless way of speaking that over-emphasises keywords in a youthful, singsong cadence. For the Attention Economy, it’s the sort of algorithm-friendly repetition that’s quantified by clicks and likes, monetised by engagement for short attention spans. “Now, we have to speak machine with machines that were trained on humans,” says Basar, who refers to this algorithm-led style as promptcore.
As algorithms digest our online behaviour into data, we resemble a swarm, a hivemind. We are beginning to think and speak like machines, in UI-friendly keywords and emoji-pilled phrases.
·dazeddigital.com·
Blessed and emoji-pilled: why language online is so absurd
The secret digital behaviors of Gen Z

The article describes a shift from traditional notions of information literacy to "information sensibility" among Gen Zers, who prioritize social signals and peer influence over fact-checking. The research by Jigsaw, a Google subsidiary, reveals that Gen Zers spend their digital lives in "timepass" mode, engaging with light content and trusting influencers over traditional news sources.

Comment sections are used for social validation and information signaling.

·businessinsider.com·
The secret digital behaviors of Gen Z
WWDC 2024: Apple Intelligence
their models are almost entirely based on personal context, by way of an on-device semantic index. In broad strokes, this on-device semantic index can be thought of as a next-generation Spotlight. Apple is focusing on what it can do that no one else can on Apple devices, and not really even trying to compete against ChatGPT et al. for world-knowledge context. They’re focusing on unique differentiation, and eschewing commoditization.
Apple is doing what no one else can do: integrating generative AI into the frameworks in iOS and MacOS used by developers to create native apps. Apps built on the system APIs and frameworks will gain generative AI features for free, both in the sense that the features come automatically when the app is running on a device that meets the minimum specs to qualify for Apple Intelligence, and in the sense that Apple isn’t charging developers or users to utilize these features.
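A minimal sketch of what that "for free" integration could look like from a developer's seat, assuming a device that qualifies for Apple Intelligence: the view below contains no model code at all, and any system-level generative features would surface through the standard text control rather than through anything the app adopts explicitly.

```swift
import SwiftUI

// Hypothetical note-taking view: the app ships no AI code of its own.
// Per the excerpt above, generative features arrive through the system
// frameworks, so a stock text control is the integration point; on hardware
// that does not meet the Apple Intelligence minimum specs, the same view is
// simply a plain editor.
struct DraftEditor: View {
    @State private var draft: String = ""

    var body: some View {
        TextEditor(text: $draft)   // standard system control, no AI-specific APIs called
            .padding()
    }
}
```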
·daringfireball.net·
WWDC 2024: Apple Intelligence
Apple intelligence and AI maximalism — Benedict Evans
The chatbot might replace all software with a prompt - ‘software is dead’. I’m skeptical about this, as I’ve written here, but Apple is proposing the opposite: that generative AI is a technology, not a product.
Apple is, I think, signalling a view that generative AI, and ChatGPT itself, is a commodity technology that is most useful when it is: embedded in a system that gives it broader context about the user (which might be search, social, a device OS, or a vertical application), and unbundled into individual features (ditto), which are inherently easier to run as small power-efficient models on small power-efficient devices on the edge (paid for by users, not your capex budget) - which is just as well, because this stuff will never work for the mass-market if we have marginal cost every time the user presses ‘OK’ and we need a fleet of new nuclear power-stations to run it all.
Apple has built its own foundation models, which (on the benchmarks it published) are comparable to anything else on the market, but there’s nowhere that you can plug a raw prompt directly into the model and get a raw output back - there are always sets of buttons and options shaping what you ask, and that’s presented to the user in different ways for different features. In most of these features, there’s no visible bot at all. You don’t ask a question and get a response: instead, your emails are prioritised, or you press ‘summarise’ and a summary appears. You can type a request into Siri (and Siri itself is only one of the many features using Apple’s models), but even then you don’t get raw model output back: you get GUI. The LLM is abstracted away as an API call.
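As a rough sketch of that "abstracted away as an API call" shape, here is what a summarise feature might look like behind the GUI; the names (SummaryProvider, summaryForDisplay) are invented for illustration, not Apple's actual interfaces.

```swift
import Foundation

// Invented types, for illustration only: the user presses "Summarise" and a
// summary appears. There is no visible prompt and no raw model output; the
// model sits behind an ordinary async function call.
protocol SummaryProvider {
    func summarize(_ text: String) async throws -> String
}

struct MailThreadPresenter {
    let provider: SummaryProvider

    // Called when the user taps the Summarise button; the result is rendered
    // as GUI rather than shown as a chat transcript.
    func summaryForDisplay(of thread: String) async -> String {
        (try? await provider.summarize(thread)) ?? ""
    }
}
```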
Apple is treating this as a technology to enable new classes of features and capabilities, where there is design and product management shaping what the technology does and what the user sees, not as an oracle that you ask for things.
Apple is drawing a split between a ‘context model’ and a ‘world model’. Apple’s models have access to all the context that your phone has about you, powering those features, and this is all private, both on device and in Apple’s ‘Private Cloud’. But if you ask for ideas for what to make with a photo of your grocery shopping, then this is no longer about your context, and Apple will offer to send that to a third-party world model - today, ChatGPT.
that’s clearly separated into a different experience where you should have different expectations, and it’s also, of course, OpenAI’s brand risk, not Apple’s. Meanwhile, that world model gets none of your context, only your one-off prompt.
Neither OpenAI nor any of the other cloud models from new companies (Anthropic, Mistral etc) have your emails, messages, locations, photos, files and so on.
Apple is letting OpenAI take the brand risk of creating pizza glue recipes, and making error rates and abuse someone else’s problem, while Apple watches from a safe distance.
The next step, probably, is to take bids from Bing and Google for the default slot, but meanwhile, more and more use-cases will be quietly shifted from the third party to Apple’s own models. It’s Apple’s own software that decides where the queries go, after all, and which ones need the third party at all.
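A hedged sketch of that routing decision, with invented names rather than anything Apple has documented: requests that need world knowledge are forwarded only with the user's consent, and only as a bare prompt, while everything else stays with the private, context-aware model.

```swift
import Foundation

// Illustration only; the types below are made up.
struct PersonalContext {       // emails, messages, locations, photos, files, ...
    let facts: [String]
}

enum Handler {
    case privateContextModel(PersonalContext)        // on device or in the Private Cloud
    case thirdPartyWorldModel(promptOnly: String)    // e.g. ChatGPT today
}

func dispatch(prompt: String,
              context: PersonalContext,
              needsWorldKnowledge: Bool,
              userConsents: () -> Bool) -> Handler? {
    guard needsWorldKnowledge else {
        return .privateContextModel(context)         // full personal context, stays private
    }
    guard userConsents() else { return nil }         // clearly separated, opt-in experience
    return .thirdPartyWorldModel(promptOnly: prompt) // the world model gets no context
}
```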
A lot of the compute to run Apple Intelligence is in end-user devices paid for by the users, not Apple’s capex budget, and Apple Intelligence is free.
Commoditisation is often also integration. There was a time when ‘spell check’ was a separate product that you had to buy, for hundreds of dollars, and there were dozens of competing products on the market, but over time it was integrated first into the word processor and then the OS. The same thing happened with the last wave of machine learning - style transfer or image recognition were products for five minutes and then became features. Today ‘summarise this document’ is AI, and you need a cloud LLM that costs $20/month, but tomorrow the OS will do that for free. ‘AI is whatever doesn’t work yet.’
Apple is big enough to take its own path, just as it did moving the Mac to its own silicon: it controls the software and APIs on top of the silicon that are the basis of those developer network effects, and it has a world class chip team and privileged access to TSMC.
Apple is doing something slightly different - it’s proposing a single context model for everything you do on your phone, and powering features from that, rather than adding disconnected LLM-powered features at disconnected points across the company.
·ben-evans.com·
Apple intelligence and AI maximalism — Benedict Evans
written in the body
I spent so many years of my life trying to live mostly in my head. Intellectualizing everything made me feel like it was manageable. I was always trying to manage my own reactions and the reactions of everyone else around me. Learning how to manage people was the skill that I had been lavishly rewarded for in my childhood and teens. Growing up, you’re being reprimanded in a million different ways all the time, and I learned to modify my behavior so that over time I got more and more positive feedback. People like it when you do X and not Y, say X and not Y. I kept track of all of it in my head and not in my body. Intellectualizing kept me numbed out, and for a long time what I wanted was nothing more than to be numbed out, because when things hurt they hurt less. Whatever I felt like I couldn’t show people or tell people I hid away. I compartmentalized, and what I put in the compartment I never looked at became my shadow.
So much of what I care about can be boiled down to this: when you’re able to really inhabit and pay attention to your body, it becomes obvious what you want and don’t want, and the path towards your desires is clear. If you’re not in your body, you’re constantly rationalizing what you should do next, and that can leave you inert or trapped or simply choosing the wrong thing over and over. “I know I should, but I can’t do it” is often another way of saying “I’ve reached this conclusion intellectually, but I’m so frozen out of my body I can’t feel a deeper certainty.”
It was so incredibly hard when people gave me negative feedback—withdrew, or rejected me, or were just preoccupied with their own problems—because I relied on other people to figure out whether everything was alright.
When I started living in my body I started feeling for the first time that I could trust myself in a way that extended beyond trust of my intelligence, of my ability to pick up on cues in my external environment.
I can keep my attention outwards; I don’t direct it inwards in a self-conscious way. It’s the difference between noticing whether someone seems to be having a good time in the moment by watching their face vs agonizing about whether they enjoyed something after the fact. I can tell the difference between when I’m tired because I didn’t sleep well versus tired because I’m bored versus tired because I’m avoiding something. When I’m in my body, I’m aware of myself instead of obsessing over my state, and this allows me to have more room for other people.
·avabear.xyz·
written in the body
Richard Linklater Sees the Killer Inside Us All
What’s your relationship now to the work back then? Are you as passionate? I really had to think about that. My analysis of that is, you’re a different person with different needs. A lot of that is based on confidence. When you’re starting out in an art form or anything in life, you can’t have confidence because you don’t have experience, and you can only get confidence through experience. But you have to be pretty confident to make a film. So the only way you counterbalance that lack of experience and confidence is absolute passion, fanatical spirit. And I’ve had this conversation over the years with filmmaker friends: Am I as passionate as I was in my 20s? Would I risk my whole life? If it was my best friend or my negative drowning, which do I save? The 20-something self goes, I’m saving my film! Now it’s not that answer. I’m not ashamed to say that, because all that passion doesn’t go away. It disperses a little healthfully. I’m passionate about more things in the world. I care about more things, and that serves me. The most fascinating relationship we all have is to ourselves at different times in our lives. You look back, and it’s like, I’m not as passionate as I was at 25. Thank God. That person was very insecure, very unkind. You’re better than that now. Hopefully.
·nytimes.com·
Richard Linklater Sees the Killer Inside Us All
How to read a movie - Roger Ebert
When the Sun-Times appointed me film critic, I hadn't taken a single film course (the University of Illinois didn't offer them in those days). One of the reasons I started teaching was to teach myself. Look at a couple dozen New Wave films, you know more about the New Wave. Same with silent films, documentaries, specific directors.
visual compositions have "intrinsic weighting." By that I believe he means that certain areas of the available visual space have tendencies to stir emotional or aesthetic reactions. These are not "laws." To "violate" them can be as meaningful as to "follow" them. I have never heard of a director or cinematographer who ever consciously applied them.
I suspect that filmmakers compose shots from images that well up emotionally, instinctively or strategically, just as a good pianist never thinks about the notes.
I already knew about the painter's "Golden Mean," or the larger concept of the "golden ratio." For a complete explanation, see Wiki, and also look up the "Rule of Thirds." To reduce the concept to a crude rule of thumb in the composition of a shot in a movie: A person located somewhat to the right of center will seem ideally placed. A person to the right of that position will seem more positive; to the left, more negative. A centered person will seem objectified, like a mug shot. I call that position somewhat to the right of center the "strong axis."
They are not absolutes. But in general terms, in a two-shot, the person on the right will "seem" dominant over the person on the left
In simplistic terms: Right is more positive, left more negative. Movement to the right seems more favorable; to the left, less so. The future seems to live on the right, the past on the left. The top is dominant over the bottom. The foreground is stronger than the background. Symmetrical compositions seem at rest. Diagonals in a composition seem to "move" in the direction of the sharpest angle they form, even though of course they may not move at all. Therefore, a composition could lead us into a background that becomes dominant over a foreground.
Of course I should employ quotation marks every time I write such words as positive, negative, stronger, weaker, stable, past, future, dominant or submissive. All of these are tendencies, not absolutes, and as I said, can work as well by being violated as by being followed. Think of "intrinsic weighting" as a process that gives all areas of the screen complete freedom, but acts like an invisible rubber band to create tension or attention when stretched. Never make the mistake of thinking of these things as absolutes. They exist in the realm of emotional tendencies. I often use the cautionary phrase, "all things being equal" -- which of course they never are.
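Only to make the geometry of those guidelines concrete (they are tendencies, not formulas the article prescribes), here is a tiny sketch that computes the rule-of-thirds verticals and the golden-section line for a frame; Ebert's "strong axis", somewhat to the right of center, lands near the right-hand third.

```swift
import CoreGraphics

// Rule-of-thirds verticals sit at 1/3 and 2/3 of the frame width; the
// golden-section vertical sits at roughly 0.618 of it, a little right of
// center. These are compositional tendencies, not laws.
func compositionGuides(forFrameWidth width: CGFloat) -> (thirds: [CGFloat], goldenSection: CGFloat) {
    let thirds = [width / 3, width * 2 / 3]
    let goldenSection = width * 0.618
    return (thirds, goldenSection)
}

// Example: a 1920-pixel-wide frame puts the guides at x ≈ 640, 1280 and 1187.
let guides = compositionGuides(forFrameWidth: 1920)
```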
·rogerebert.com·
How to read a movie - Roger Ebert
The Difference Between a Framework and a Library
A library is like going to Ikea. You already have a home, but you need a bit of help with furniture. You don’t feel like making your own table from scratch. Ikea allows you to pick and choose different things to go in your home. You are in control. A framework, on the other hand, is like building a model home. You have a set of blueprints and a few limited choices when it comes to architecture and design. Ultimately, the contractor and blueprint are in control. And they will let you know when and where you can provide your input.
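A small code sketch of the same distinction, with invented names: with a library your code stays in charge and calls in when it wants help; with a framework you fill in the blanks the blueprint leaves you, and the framework decides when your code runs.

```swift
import Foundation

// Library style: you call it, where and when you choose (picking furniture at Ikea).
struct SortingLibrary {
    static func sorted(_ values: [Int]) -> [Int] { values.sorted() }
}
let arranged = SortingLibrary.sorted([3, 1, 2])   // your code drives

// Framework style: you supply the pieces the blueprint asks for, and the
// framework calls them on its own schedule (the model home's contractor).
protocol AppBlueprint {
    func onLaunch()
}

struct HouseFramework {
    func run(_ app: AppBlueprint) {
        // ...framework-controlled setup happens here...
        app.onLaunch()    // your input is requested where the framework allows it
        // ...framework-controlled teardown...
    }
}
```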
·freecodecamp.org·
The Difference Between a Framework and a Library
The human cost of an Apple update
Dating apps don’t work, and meeting people in person seems foreign, even impossible. But it was dating apps that drove IRL connections nearly extinct. In other words, dating apps did work, for almost a decade, by promising to cut out all the things about in-person dating that made us feel vulnerable or uncomfortable. Rejection now happens with a swipe, out of sight, with neither party the wiser. If you match and then change your mind, you can just unmatch without explanation.
This arc plays out across all kinds of apps, and all kinds of human relationships, as tech companies seek to find and solve every type of “friction” and discomfort. But those efforts are rooted in the mistaken idea that being a person shouldn’t come with difficult emotions—that we aren’t often, in fact, served by hard conversations or uncomfortable feelings.
·embedded.substack.com·
The human cost of an Apple update
Useful and Overlooked Skills
A diplomatic “no” is when you’re clear about your feelings but empathetic to how the person on the receiving end might interpret those feelings.
·collabfund.com·
Useful and Overlooked Skills
What Is Going On With Next-Generation Apple CarPlay?
I’d posit that a reason why people love CarPlay so much is because the media, communication, and navigation experiences have traditionally been pretty poor. CarPlay supplants those, and it does so with aplomb because people use those same media, communication, and navigation features that are personalized to them with their phones when they’re not in their cars.
No one is walking around with a speedometer and a tachometer on their iPhone that need to have a familiar look and feel, rendered exclusively in San Francisco. As long as automakers supply the existing level of CarPlay support, which isn’t a given, then customers like us would be content with the status quo, or even a slight improvement.
In my humble opinion, Next-Gen CarPlay is dead on arrival. Too late, too complicated, and it doesn’t solve the needs of automakers or customers. Instead of letting the vehicle’s interface peek through, Apple should consider letting CarPlay peek through for the non-critical systems people prefer to use with CarPlay.
Design a CarPlay that can output multiple display streams (which Apple already over-designed) and display that in the cluster. Integrate with the existing controls for managing the interfaces in the vehicle. When the phone isn’t there, the vehicle will still be the same vehicle. When the phone is there, it’s got Apple Maps right in the cluster how you like it without changing the gauges, or the climate controls, or where the seat massage button is.
The everyday irritations people have are mundane, practical, and are not related to how Apple-like their car displays can look.
·joe-steel.com·
What Is Going On With Next-Generation Apple CarPlay?