Substrate

#targeting
#67: A Bicycle for the Donkey Mind
Now we walk in straight lines because we have to, not because we know where we're going. Far from an expression of certainty, the urban street grid simplifies, removes choices, and reflects nobody's direct route exactly. … We eagerly provide data about ourselves to platforms so they can help us learn what we want; our unique personal desires are mere inputs for systems that channel them into a narrower range of outputs.
·kneelingbus.substack.com·
Self Checkout
In other words, the ‘success’ of technologies like self-checkout machines is in large part produced by the human effort necessary to maintain the technologies. … Their broader point is that automation doesn’t eliminate human labor; it often leads to its disguising and devaluation. … Customers have to bridge the gap between the norms of human customer service and the company’s imposition of inhuman staff shortfalls without completely losing their patience or simply taking what they couldn’t find a reasonable way to purchase. … That is, they rely on the kinds of consideration from users that the companies themselves don’t practice. … They have prioritized user growth, data collection, and advertising over providing a safe, reliable service to users. It’s no wonder that some users will adopt the same attitude and the same goals: try to spread unwanted messages as far as possible.
·tinyletter.com·
Big Mood Machine
Indeed, what Spotify calls “streaming intelligence” should be understood as surveillance of its users to fuel its own growth and ability to sell mood-and-moment data to brands. … When a platform like Spotify sells advertisers on its mood-boosting, background experience, and then bakes these aims into what it recommends to listeners, a twisted form of behavior manipulation is at play. It’s connected to what Shoshana Zuboff, in The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, calls the “behavioral futures market”—where “many companies are eager to lay their bets on our future behavior.”
·thebaffler.com·
The Sweetgreen-ification of Society
Fast forward to 2019. My lunch routine is a rotating cast of fast-casual concepts with lost-vowel names. … When I do go to a nearby deli, it’s impossible to ignore just how stark the socioeconomic contrast is to the Sweetgreen line. While the latter appears filled with people who stepped away from their WeWork desks, the former feels packed with the contractors underpaid to maintain that same WeWork. … We're no longer constrained to the Banana Republic-Gap-Old Navy trichotomy. Every facet of our daily consumer lives can now be hyper-segmented. … It's yet another area where technological know-how amplifies existing behaviors and practices. We've always signaled status with things like the little horse on your shirt or the expensive watch on your wrist (can you tell I worked in finance?) or the bag you carry or the shoes you wear. Those were social signaling table stakes. But now it's our lunch too. … Just next time you get lunch, take a good look around you. We are losing the spaces we share across socioeconomic strata. Slowly, but surely, we are building the means for an everyday urbanite to exist solely in their physical and digital class lanes. It used to be the rich, and then everyone else. Now in every realm of daily consumer life, we are able to efficiently separate ourselves into a publicly visible delineation of who belongs where. … But like in so many other areas of consumer life, we're slowly learning that mutually beneficial success at the micro level just might have adverse effects in the macro.
·themargins.substack.com·
Understanding Makes the Mind Lazy
Platforms have to act as though their algorithms work and don’t work at the same time, and this equivocation fosters a paranoia about how algorithms work. The point of advertising, after all, is not to nail down what people are, as if that were static; it’s to shift currents of demand, to alter behavior patterns. But the logic of data profiling uses the past to repeat it as the future. This mystification is not an unfortunate side effect; it’s the value Facebook adds.

Users are isolated from each other so they can feel as though they are the implied subject of all the discourse they experience on the site — so that they can be targeted in “one-to-one brand building” campaigns. Users get to feel important, singled out, worthy of decoding, and at the same time they get to interpret whatever they read through the lens of “Why did the algorithm choose this for me? What does this say about me and my tastes?” But that works only through an effort of disavowal: You have to feel that the algorithm is right enough to cater to you but not powerful enough to control you (even while it controls all those “indoctrinated peers”).

In this London Review of Books essay about Brexit, William Davies offers this description of accelerated finance:

“The mentality of the high-frequency trader or hedge fund manager is wholly focused on leaving on better terms than one arrived, with minimum delay or friction in between. To the speculator, falling prices present just as lucrative an opportunity as rising prices (given the practice of ‘shorting’ financial assets), meaning that instability in general is attractive. As long as nothing ever stays the same, you can exit on better terms than you entered. The only unprofitable scenario is stasis.”

In a sense, platform paranoia is akin to market volatility; it reflects and promotes a high-frequency trading of sorts in various propositions, accelerating cycles of belief and skepticism as we churn through a much higher volume of information.
Advertising is more likely to be effective amid these conditions, where it seems that everybody, and not just marketers, is being manipulative and deceptive. How we are targeted is always incomplete and inaccurate, but these inaccuracies in themselves can still drive and reshape behavior. Being targeted itself affects the targets, regardless of what is targeted at them, or if anything hits.

Platforms want to sell control over that connection, the moment at which your feelings become actions in the world. (Advertisers understand that link between feeling and acting entirely as a matter of “conversion rates” — when you actually buy something.)

When we remember our lives authentically, we ask a fundamental question: Why did I remember this thing, at this moment? The “Why now?” question gives memory its meaning. Facebook randomizes and decontextualizes memory and detaches it from our current self. And why would I want to know what I looked like 10 years ago?
·tinyletter.com·
On FB's shadow versus real version of yourself
The shadow version of you that Facebook creates is its property; it's what's targeted etc.; meanwhile that entity is used against you (it's used to determine what you're qualified to see), which intensifies pressure on us to adopt that as our "real self." — Rob Horning (@robhorning) January 25, 2019
·twitter.com·