Saved

Greenwashing Certified™
The problem is that the main incentive for pursuing corporate sustainability work is its ability to create more marketable products. Outside of sustainability efforts that are pursued based on compliance or regulation, much of this work is built around the idea of pandering to consumers' understanding of sustainability, something that’s notoriously variable.
·garden3d.substack.com·
The Aesthetics of Apology - Why So Many Brands Are Getting it Wrong
in Instagram apologies, even when someone ostensibly confronts their ugliness, it’s hard to read the gesture as anything but an effort to publicly reclaim their image. But at least the Notes App Apology permitted us a semblance of sincerity, and suggested there might be a human being who typed the message—even if that human was an intern or assistant. There’s nothing sincere about a trickle-down excuse crafted to look pretty for Instagram grids, and the processed nature of Photoshopped Apologies implies the absence of the one thing all genuine apologies must possess: accountability straight from the person who committed the transgression.
·artnews.com·
Have iPhone Cameras Become Too Smart? | The New Yorker
iPhones are no longer cameras in the traditional sense. Instead, they are devices at the vanguard of “computational photography,” a term that describes imagery formed from digital data and processing as much as from optical information. Each picture registered by the lens is altered to bring it closer to a pre-programmed ideal. Gregory Gentert, a friend who is a fine-art photographer in Brooklyn, told me, “I’ve tried to photograph on the iPhone when light gets bluish around the end of the day, but the iPhone will try to correct that sort of thing.”
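To make the complaint concrete, here is a rough sketch (mine, not Apple's pipeline, which is far more elaborate and undocumented) of the textbook "gray-world" auto white balance that this kind of correction resembles: scale each color channel so the scene averages out to neutral gray, which is exactly what flattens a deliberately bluish blue-hour photo.

```python
# A minimal gray-world auto white balance sketch. This is an illustration of
# the general technique, not Apple's computational photography pipeline.
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """image: HxWx3 float RGB array with values in [0, 1]."""
    channel_means = image.reshape(-1, 3).mean(axis=0)   # average R, G, B of the scene
    gains = channel_means.mean() / channel_means        # push each channel toward gray
    return np.clip(image * gains, 0.0, 1.0)

# A synthetic "blue hour" frame: the blue channel runs hot on purpose.
dusk = np.random.rand(64, 64, 3) * np.array([0.6, 0.7, 1.0])
corrected = gray_world_white_balance(dusk)
print("before:", dusk.reshape(-1, 3).mean(axis=0).round(2),
      "after:", corrected.reshape(-1, 3).mean(axis=0).round(2))
```

The algorithm has no way of knowing that the blue cast is the point of the picture.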
·newyorker.com·
Privacy, ads and confusion — Benedict Evans
Advertisers don’t really want to know who you are - they want to show diaper ads to people who have babies, not to show them to people who don’t, and to have some sense of which ads drove half a million sales and which ads drove a million sales.
In practice, ‘showing car ads to people who read about cars’ led the adtech industry to build vast piles of semi-random personal data, aggregated, disaggregated, traded, passed around and sometimes just lost, partly because it could and partly because that appeared to be the only way to do it. After half a decade of backlash, there are now a bunch of projects trying to get to the same underlying advertiser aims - to show ads that are relevant, and get some measure of ad effectiveness - while keeping the private data private.
Apple has pursued a very clear theory that analysis and tracking is private if it happens on your device and is not private if it leaves your device or happens in the cloud. Hence, it’s built a complex system of tracking and analysis on your iPhone, but is adamant that this is private because the data stays on the device. People have seemed to accept this (so far - or perhaps they just haven’t noticed it), but acting on the same theory Apple also created a CSAM scanning system that it thought was entirely private - ‘it only happens on your device!’ - that created a huge privacy backlash, because a bunch of other people think that if your phone is scanning your photos, that isn’t ‘private’ at all. So is ‘on device’ private or not? What’s the rule? What if Apple tried the same model for ‘private’ ads in Safari? How will the public take FLoC? I don’t think we know.
On / off device is one test, but another and much broader one is first party / third party: the idea that it’s OK for a website to track what you do on that website but not OK for adtech companies to track you across many different websites. This is the core of the cookie question.
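A toy sketch of why that first-party / third-party line matters (all domain names are made up, and no real adtech API is involved): every site that embeds the same ad server hands back the same ad-server cookie, so one third party can join your visits across otherwise unrelated sites.

```python
# Toy model of first-party vs third-party cookies; every domain name here is
# invented for illustration.
cookie_jar = {}  # the browser's cookie store: {domain: cookie value}

def visit(page_domain, embedded_domains):
    """Simulate loading a page plus the third-party resources it embeds."""
    for domain in [page_domain, *embedded_domains]:
        cookie_jar.setdefault(domain, f"uid-{domain}-12345")
        print(f"request to {domain:<20} carries cookie {cookie_jar[domain]}")

# Two unrelated sites, both embedding the same ad server.
visit("cars-blog.example", embedded_domains=["ads.example"])
visit("recipe-site.example", embedded_domains=["ads.example"])

# Each first party only ever sees its own cookie, but ads.example got the
# same identifier back from both visits: the cross-site profile that the
# "cookie question" is about.
```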
At this point one answer is to cut across all these questions and say that what really matters is whether you disclose whatever you’re doing and get consent. Steve Jobs liked this argument. But in practice, as we've discovered, ‘get consent’ means endless cookie pop-ups full of endless incomprehensible questions that no normal consumer should be expected to understand, and that just train people to click ‘stop bothering me’. Meanwhile, Apple’s on-device tracking doesn't ask for permission, and opts you in by default, because, of course, Apple thinks that if it's on the device it's private. Perhaps ‘consent’ is not a complete solution after all.
If you can only analyse behaviour within one site but not across many sites, or make it much harder to do that, companies that have a big site where people spend lots of time have better targeting information and make more money from advertising. If you can only track behaviour across lots of different sites if you do it ‘privately’ on the device or in the browser, then the companies that control the device or the browser have much more control over that advertising.
·ben-evans.com·
Ask HN: Bluetooth kinda sucks. Why don't we have something better? | Hacker News
when you try to implement one of these specs you quickly realize that you cannot do it with the spec alone. You need example code, base implementations, test suite software and test data to build conformant software. Unfortunately, the Bluetooth SIG hides these resources behind a membership wall. Guess what happens then? You get lots of implementations of these specs that are a little bit off and don't handle all edge cases.
·news.ycombinator.com·
Ad Tech Revenue Statements Indicate Unclear Effects of App Tracking Transparency
it is very difficult to figure out what specific effect ATT has because there are so many factors involved
If ATT were so significantly kneecapping revenue, I would think we would see a pronounced skew against North America compared to elsewhere. But that is not the case. Revenue in North America is only slightly off compared to the company total, and it is increasing how much it earns per North American user compared to the rest of the world.
iOS is far more popular in the U.S. and Canada than it is in Europe, but Meta incurred a greater revenue decline — in absolute terms and, especially, in percentage terms — in Europe. Meta was still posting year-over-year gains in both those regions until this most recent quarter, even though ATT rolled out over a year ago.
there are those who believe highly-targeted advertisements are a fair trade-off because they offer businesses a more accurate means of finding their customers, and the behavioural data collected from all of us is valuable only in the aggregate. That is, as I understand it, the view of analysts like Seufert, Benedict Evans, and Ben Thompson. Frequent readers will not be surprised to know I disagree with this premise. Regardless of how many user agreements we sign and privacy policies we read, we cannot know the full extent of the data economy. Personal information about us is being collected, shared, combined, and repackaged. It may only be profitable in aggregate, but it is useful with finer granularity, so it is unsurprising that it is indefinitely warehoused in detail.
Seufert asked, rhetorically, “what happens when ads aren’t personalized?”, answering “digital ads resemble TV ads: jarring distractions from core content experience. Non-personalized is another way of saying irrelevant, or at best, randomly relevant.”
Opinion in support of personalized ads
does it make sense to build the internet’s economy on the backs of a few hundred brokers none of us have heard of, trading and merging our personal information in the hope of generating a slightly better click-through rate?
Then there is the much bigger question of whether people should even be able to opt into such widespread tracking. We simply cannot be informed consumers in every aspect of our lives, and we cannot foresee how this information will be used and abused in the full extent of time. It sounds boring, but what is so wrong with requiring data minimization at every turn, permitting only the most relevant personal data to be collected, and restricting the ability for this information to be shared or combined?
Does ATT really “[deprive] consumers of widespread ad relevancy and advertisers and publishers of commercial opportunity”? Even if it does — which I doubt — has that commercial opportunity really existed with meaningful consumer awareness and choice? Or is this entire market illegitimate, artificially inflated by our inability to avoid becoming its subjects?
I've thought this too. Do click-through rates really improve so much from targeting that the internet industry's obsession with this practice is justified?
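Here is the back-of-envelope version of that question, with purely illustrative numbers: the click-through rates and revenue-per-click below are assumptions made up for the arithmetic, not measured industry figures. The case for behavioural targeting rests entirely on the size of this uplift relative to what the data collection costs.

```python
# Illustrative arithmetic only; none of these numbers are measured figures.
IMPRESSIONS = 1_000_000
REVENUE_PER_CLICK = 0.50   # assumed publisher revenue per click, in dollars

def publisher_revenue(ctr: float) -> float:
    return IMPRESSIONS * ctr * REVENUE_PER_CLICK

untargeted = publisher_revenue(ctr=0.001)  # 0.1% CTR: contextual / random ads
targeted = publisher_revenue(ctr=0.003)    # 0.3% CTR: behaviourally targeted

print(f"untargeted: ${untargeted:,.0f}, targeted: ${targeted:,.0f}, "
      f"uplift: ${targeted - untargeted:,.0f} per million impressions")
# Whether that uplift justifies the brokered data collection depends on what
# the targeting costs to run and who captures the difference.
```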
Conflicts like these are one of many reasons why privacy rights should be established by regulators, not individual companies. Privacy must not be a luxury good, or something you opt into, and it should not be a radical position to say so. We all value different degrees of privacy, but it should not be possible for businesses to be built on whether we have rights at all. The digital economy should not be built on such rickety and obviously flawed foundations.
Great and succinct summary of points on user privacy
·pxlnv.com·
The Age of Algorithmic Anxiety
“I’ve been on the internet for the last 10 years and I don’t know if I like what I like or what an algorithm wants me to like,” Peter wrote. She’d come to see social networks’ algorithmic recommendations as a kind of psychic intrusion, surreptitiously reshaping what she’s shown online and, thus, her understanding of her own inclinations and tastes.
Besieged by automated recommendations, we are left to guess exactly how they are influencing us, feeling in some moments misperceived or misled and in other moments clocked with eerie precision.
·newyorker.com·
Security of iCloud Backup
When Messages in iCloud is enabled, iMessage, Apple Messages for Business, text (SMS), and MMS messages are removed from the user’s existing iCloud Backup and are instead stored in an end-to-end encrypted CloudKit container for Messages. The user’s iCloud Backup retains a key to that container. If the user later disables iCloud Backup, that container’s key is rolled, the new key is stored only in iCloud Keychain (inaccessible to Apple and any third parties), and new data written to the container can’t be decrypted with the old container key.
So technically there's a security loophole: if a user has Messages in iCloud enabled alongside iCloud Backup, the backup retains a key to the otherwise end-to-end encrypted container for Messages, giving it special access until iCloud Backup is disabled and the key is rolled.
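A minimal sketch of the key-rolling scheme the passage describes, for illustration only: this is not Apple's implementation, and the class and field names are invented. It uses AES-GCM from the Python `cryptography` package to show why a stale copy of the old key (the one a backup retained) stops working on anything written after the roll.

```python
# Sketch of the key-rolling idea, not Apple's implementation. Names invented.
import os
from cryptography.exceptions import InvalidTag
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class MessagesContainer:
    def __init__(self):
        self.key = AESGCM.generate_key(bit_length=256)   # container key
        self.backup_copy_of_key = self.key               # iCloud Backup retains a key
        self.keychain_copy_of_key = self.key             # iCloud Keychain holds it too

    def write(self, message: bytes) -> tuple[bytes, bytes]:
        nonce = os.urandom(12)
        return nonce, AESGCM(self.key).encrypt(nonce, message, None)

    def disable_icloud_backup(self):
        # Roll the key: the new key lives only in the end-to-end encrypted
        # keychain, so data written from now on is out of the backup's reach.
        self.key = AESGCM.generate_key(bit_length=256)
        self.keychain_copy_of_key = self.key
        # self.backup_copy_of_key is deliberately NOT updated.

container = MessagesContainer()
nonce_old, ct_old = container.write(b"sent while backup was on")
container.disable_icloud_backup()
nonce_new, ct_new = container.write(b"sent after backup was disabled")

# The key retained by the old backup still opens the old message...
print(AESGCM(container.backup_copy_of_key).decrypt(nonce_old, ct_old, None))
# ...but fails on anything written after the key was rolled.
try:
    AESGCM(container.backup_copy_of_key).decrypt(nonce_new, ct_new, None)
except InvalidTag:
    print("old backup key cannot decrypt data written after the roll")
```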
·support.apple.com·