John Gruber - Auteur Theory of Design - Macworld Pulse
Habits, UI changes, and OS stagnation | Riccardo Mori
We have been secretly, for the last 18 months, been designing a completely new user interface. And that user interface builds on Apple’s legacy and carries it into the next century. And we call that new user interface Aqua, because it’s liquid. One of the design goals was that when you saw it you wanted to lick it.
But it’s important to remember that this part came several minutes after outlining Mac OS X’s underlying architecture. Jobs began talking about Mac OS X by stating its goals, then the architecture used to attain those goals, and only then how the new OS looked.
Sure, a lot has changed in the technology landscape over the past twenty years, but the Mac OS X introduction in 2000 is almost disarming in how clearly and precisely focused it is. It is framed in such a way that you understand Jobs is talking about a new powerful tool. Sure, it also looks cool, but it feels as if it’s simply a consequence of a grander scheme. A tool can be powerful in itself, but making it attractive and user-friendly is a crucial extension of its power.
But over the years (and to be fair, this started to happen when Jobs was still CEO), I’ve noticed that, iteration after iteration, the focus of each introduction of a new version of Mac OS X shifted towards more superficial features and the general look of the system. As if users were more interested in stopping and admiring just how gorgeous Mac OS looks, rather than having a versatile, robust and reliable foundation with which to operate their computers and be productive.
What some geeks may be shocked to know is that most regular people don’t really care about these changes in the way an application or operating system looks. What matters to them is continuity and reliability. Again, this isn’t being change-averse. Regular users typically welcome change if it brings something interesting to the table and, most of all, if it improves functionality in meaningful ways. Like saving mouse clicks or making a multi-step workflow more intuitive and streamlined.
But making previous features or UI elements less discoverable because you want them to appear only when needed (and who decides when I need something out of the way? Maybe I like to see it all the time) — that’s not progress. It’s change for change’s sake. It’s rearranging the shelves in your supermarket in a way that seems cool and marketable to you but leaves your customers baffled and bewildered.
This yearly cycle forces Apple engineers — and worse, Apple designers — to come up with ‘new stuff’, and this diverts focus from fixing underlying bugs and UI friction that inevitably accumulate over time.
Microsoft may leave entire layers of legacy code in Windows, turning Windows into a mastodontic operating system with a clean surface and decades of baggage underneath. Apple has been cleaning and rearranging the surface for a while now, and has been getting rid of so much baggage that they went to the other extreme. They’ve thrown the baby out with the bathwater, and Mac OS’s user interface has become more brittle after all the changes and inconsistent applications of those Human Interface Guidelines that have informed good UI design in Apple software for so long.
Meanwhile the system hasn’t really gone anywhere. On mobile, iOS started out excitingly, and admittedly still seems to be on an evolving trajectory, but on the iPad front there has been a lot of wheel-reinventing to make the device behave more like a traditional computer, instead of setting both the device and its operating system on a journey of revolution and redefinition of the tablet experience in order to truly start a ‘Post-PC era’.
An operating system is something that shouldn’t be treated as an ‘app’, or as something people should stop and admire for its æsthetic elegance, or as a product whose updates should be marketed as if it’s the next iPhone iteration. An operating system is something that needs a separate, tailored development cycle. Something that needs time so that you can devise an evolution plan for it; so that you can keep working on its robustness by correcting bugs that have been unaddressed for years, and present features that really improve workflows and productivity while building organically on what came before. This way, user-facing UI changes will look reasonable, predictable, intuitive, easily assimilable, and not just arbitrary, cosmetic, and of questionable usefulness.
More stray observations — on Liquid Glass, on Apple’s lack of direction, then zooming out, on technological progress | Riccardo Mori
This Apple has been dismantling Mac OS, as if it’s a foreign tool to them. They’ve bashed its UI around. And they seem to have done that not for the purpose of improving it, but simply for the purpose of changing it; adapting it to their (mostly misguided) idea of unifying the interface of different devices to bring it down to the simplest common denominator.
If we look at Mac OS as a metro railway line, it’s like Apple has stopped extending it and creating new stations. What they’ve been doing for a while now has been routine maintenance, and giving the stations a fresh coat of paint every year. Only basic and cosmetic concerns, yet sometimes mixing things up to show that more work has gone into it, a process that invariably results in inexplicable and arbitrary choices like moving station entrances around, shutting down facilities, making the train timetables less legible, making the passages that lead to emergency exits more convoluted and longer to traverse, and so on — hopefully you know what I mean here.
When you self-impose timelines and cadences that are essentially marketing-driven and do not really reflect technological research and development, then you become a prisoner in a prison of your own making. Your goal and your priorities start becoming narrower in scope. You reduce your freedom of movement because you stop thinking in terms of creating the next technological breakthrough or innovative device; you just look at the calendar and you have to come up with something by the end of the next quarter, while you also have to take care of fixing bugs that are the result of the previous rush job… which keep accumulating on top of the bugs of the rush job that came before, and so forth.
From what I’ve understood by examining the evolution of computer science and computer history, scientists and technologists of past decades seemed to have an approach that could be described as, ‘ideas & concepts first, technology later’. Many figures in the history of computing are rightly considered visionaries because they had visions — sometimes very detailed ones — of what they wanted computers to become, of applications where computers could make a difference, of ways in which a computer could improve a process, or could help solve a real problem.
What I’m seeing today is more like the opposite approach — ‘technology first, ideas & concepts later’: a laser focus on profit-driven technological advancements to hopefully extract some good ideas and use cases from.
Where there are some ideas, or sparks, they seem hopelessly limited in scope or unimaginatively iterative, short-sightedly anchored to the previous incarnation or design. The questions are something like, How can we make this look better, sleeker, more polished?
Steve Jobs once said, There’s an old Wayne Gretzky quote that I love. ‘I skate to where the puck is going to be, not where it has been.’ And we’ve always tried to do that at Apple. Since the very, very beginning. And we always will. If I may take that image, I’d say that today a lot of tech companies seem more concerned with the skating itself and with continuing to hit the puck in profitable ways.
you are what you launch: how software became a lifestyle brand
opening notion or obsidian feels less like launching software and more like putting on your favorite jacket. it says something about you. aligns you with a tribe, becomes part of your identity. software isn’t just functional anymore. it’s quietly turned into a lifestyle brand, a digital prosthetic we use to signal who we are, or who we wish we were.
somewhere along the way, software stopped being invisible. it started meaning things. your browser, your calendar, your to-do list, these are not just tools anymore. they are taste. alignment. self-expression.
Though many people definitely still see software as just software, i.e. people who only use defaults
suddenly your app stack said something about you. not in a loud, obvious way but like the kind of shoes you wear when you don’t want people to notice, but still want them to know. margiela replica. new balance 992. arcteryx. stuff that whispers instead of shouts, it’s all about signaling to the right people.
I guess someone only using default software / being 'unopinionated' about what software choices they make is itself a kind of statement along these lines?
notion might be one of the most unopinionated tools out there. you can build practically anything with it. databases, journals, dashboards, even websites. but for a tool so open-ended, it’s surprisingly curated. only three fonts, ten colors.
if notion is a sleek apartment in seoul, obsidian is a cluttered home lab. markdown files. local folders. keyboard shortcuts. graph views. it doesn’t care how it looks, it cares that it works. it’s functional first, aesthetic maybe never. there’s no onboarding flow, no emoji illustrations, no soft gradients telling you everything’s going to be okay. just an empty vault and the quiet suggestion: you figure it out.
obsidian is built for tinkerers. not in the modern, drag and drop sense but in the old way. the “i wanna see how this thing works under the hood way”. it’s a tool that rewards curiosity and exploration. everything in obsidian feels like it was made by someone who didn’t just want to take notes, they wanted to build the system that takes notes. it’s messy, it’s endless, and that’s the point. it’s a playground for people who believe that the best tools are the ones you shape yourself.
notion is for people who want a beautiful space to live in, obsidian is for people who want to wire the whole building from scratch. both offer freedom, but one is curated and the other is raw.
obsidian and notion don’t just attract different users.
they attract different lifestyles.
the whole obsidian ecosystem runs on a kind of quiet technical fluency.
the fact that people think obsidian is open source matters more than whether it actually is. because open source, in this context, isn’t just a licence, it’s a vibe. it signals independence. self-reliance. a kind of technical purity. using obsidian says: i care about local files. i care about control. i care enough to make things harder on myself. and that is a lifestyle.
now, there’s a “premium” version of everything. superhuman for email. cron (i don’t wanna call it notion calendar) for calendars. arc for browsing. raycast for spotlight. even perplexity, somehow, for search.
these apps aren’t solving new problems. they’re solving old ones with better fonts. tighter animations, cleaner onboarding. they’re selling taste.
chrome gets the job done, but arc gets you. the onboarding feels like a guided meditation. it’s not about speed or performance. it’s about posture.
arc makes you learn new gestures. it hides familiar things. it’s not trying to be invisible, it wants to be felt. same with linear. same with superhuman. these apps add friction on purpose. like doc martens or raw denim that needs breaking in.
linear even has a “work with linear” page, a curated list of companies that use their tool. it’s a perfect example of companies not just acknowledging their lifestyle brand status, but actively leaning into it as a recruiting and signaling mechanism.
Taste is Eating Silicon Valley.
The lines between technology and culture are blurring. And so, it’s no longer enough to build great tech.
Whether expressed via product design, brand, or user experience, taste now defines how a product is perceived and felt, as well as how it is adopted, i.e. distributed — whether it’s software or hardware or both.
Technology has become deeply intertwined with culture. People now engage with technology as part of their lives, no matter their location, career, or status.
founders are realizing they have to do more than code, more than just be technical. Utility is always key, but founders also need to calibrate design, brand, experience, storytelling, community — and cultural relevance.
The likes of Steve Jobs and Elon Musk are admired not just for their technical innovations but for the way they turned their products, and themselves, into cultural icons.
The elevation of taste invites a melting pot of experiences and perspectives into the arena — challenging “legacy” Silicon Valley from inside and outside.
B2C sectors that once prioritized functionality and even B2B software now feel the pull of user experience, design, aesthetics, and storytelling.
Arc is taking on legacy web browsers with design and brand as core selling points. Tools like Linear, a project management tool for software teams, are just as known for their principled approach to company building and their heavily-copied landing page design as they are known for their product’s functionality. Companies like Arc and Linear build an entire aesthetic ecosystem that invites users and advocates to be part of their version of the world, and to generate massive digital and literal word-of-mouth. (Their stories are still unfinished but they stand out among this sector in Silicon Valley.)
Any attempt to give examples of taste will inevitably be controversial, since taste is hard to define and ever elusive. These examples are pointing at narratives around taste within a community.
So how do they compete? On how they look, feel, and how they make users feel. The subtleties of interaction (how intuitive, friendly, or seamless the interface feels) and the brand aesthetic (from playful websites to marketing messages) are now differentiators, where users favor tools aligned with their personal values. All of this should be intertwined in a product, yet it’s still a noteworthy distinction.
Investors can no longer just fund the best engineering teams and wait either.
They’re looking for teams that can capture cultural relevance and reflect the values, aesthetics, and tastes of their increasingly diverse markets.
How do investors position themselves in this new landscape? They bet on taste-driven founders who can capture the cultural zeitgeist. They build their own personal and firm brands too. They redesign their websites, write manifestos, launch podcasts, and join forces with cultural juggernauts.
Code is cheap. Money now chases utility wrapped in taste, function sculpted with beautiful form, and technology framed in artistry.
The dictionary says it’s the ability to discern what is of good quality or of a high aesthetic standard. Taste bridges personal choice (identity), societal standards (culture), and the pursuit of validation (attention).
But who sets that standard? Taste is subjective at an individual level — everyone has their own personal interpretation of taste — but it is calibrated from within a given culture and community.
Taste manifests as a combination of history, design, user experience, and embedded values that creates emotional resonance — that defines how a product connects with people as individuals and aligns with their identity. None of the tactical things alone are taste; they’re mere artifacts or effects of expressing one’s taste. At a minimum, taste isn’t bland — it’s opinionated.
The most compelling startups will be those that marry great tech with great taste. Even the pursuit of unlocking technological breakthroughs must be done with taste and cultural resonance in mind, not just for the sake of the technology itself. Taste alone won’t win, but you won’t win without taste playing a major role.
Founders must now master cultural resonance alongside technical innovation.
In some sectors—like frontier AI, deep tech, cybersecurity, industrial automation—taste is still less relevant, and technical innovation remains the main focus. But the footprint of sectors where taste doesn’t play a big role is shrinking. The most successful companies now blend both. Even companies aiming to be mainstream monopolies need to start with a novel opinionated approach.
I think we should leave it at “taste” which captures the artistic and cultural expressions that traditional business language can’t fully convey, reflecting the deep-rooted and intuitive aspects essential for product dev
Something Is Rotten in the State of Cupertino
Who decided these features should go in the WWDC keynote, with a promise they’d arrive in the coming year, when, at the time, they were in such an unfinished state they could not be demoed to the media even in a controlled environment? Three months later, who decided Apple should double down and advertise these features in a TV commercial, and promote them as a selling point of the iPhone 16 lineup — not just any products, but the very crown jewels of the company and the envy of the entire industry — when those features still remained in such an unfinished or perhaps even downright non-functional state that they still could not be demoed to the press? Not just couldn’t be shipped as beta software. Not just couldn’t be used by members of the press in a hands-on experience, but could not even be shown to work by Apple employees on Apple-controlled devices in an Apple-controlled environment? But yet they advertised them in a commercial for the iPhone 16, when it turns out they won’t ship, in the best case scenario, until months after the iPhone 17 lineup is unveiled?
“Can anyone tell me what MobileMe is supposed to do?” Having received a satisfactory answer, he continued, “So why the fuck doesn’t it do that?”
For the next half-hour Jobs berated the group. “You’ve tarnished Apple’s reputation,” he told them. “You should hate each other for having let each other down.” The public humiliation particularly infuriated Jobs. Walt Mossberg, the influential Wall Street Journal gadget columnist, had panned MobileMe. “Mossberg, our friend, is no longer writing good things about us,” Jobs said. On the spot, Jobs named a new executive to run the group.
Tim Cook should have already held a meeting like that to address and rectify this Siri and Apple Intelligence debacle. If such a meeting hasn’t yet occurred or doesn’t happen soon, then, I fear, that’s all she wrote. The ride is over. When mediocrity, excuses, and bullshit take root, they take over. A culture of excellence, accountability, and integrity cannot abide the acceptance of any of those things, and will quickly collapse upon itself with the acceptance of all three.
Ask HN: Can I really create a company around my open-source software? | Hacker News
I get that you've worked on this for months, that you're burned out generally, and now unemployed. So this comment is not meant as "mean" but rather offered in the spirit of encouragement.
Firstly, building a business (especially in a crowded space) is stressful. It's not a place to recover from burnout. It's not a place that reduces anxiety. So my first recommendation is to relax a bit, put this on the back burner, and when you're ready go look for your next job.
Secondly, treat this project as an education. You had an idea and spent months implementing it. That's the easy part. The hard part is finding a market willing to pay money for something.
So for your next project do the hard part first. First find a market, find out what they will spend, ideally collect a small deposit (to prove they're serious) and then go from there.
In my business we have 3 main product lines. The first 2 happened because the market paid us to build a solution. We iterated on those for 30 years, and we now are big players (in very niche spaces.)
The 3rd happened as a take-over of a project by another retiring developer. He had a few customers, and a good product, but in a crowded space where there's lots of reasons not to change. It's taken many years to build it out, despite being clearly better than the competition, and it's still barely profitable (if you ignore a bunch of expenses paid by the whole business).
The lesson being to follow the money, not the idea. (Aside, early on we followed some ideas, all those projects died, most without generating any revenue.)
So congratulations to seeing something through to release. But turning a product into a business is really hard. Turning a commodity like this into a business is almost impossible.
I wish you well in your future endeavors.
For a major commercial product I visited similar markets to ours, knocked on the doors of distributors, tried to find people who wanted to integrate our product into their market. I failed a lot but succeeded twice, and those 2 have been paying us lots of money every year for 20 years as they make sales.
Your approach may vary. Start locally. Talk to shop keepers, restaurants, businesses, charities, schools and so on. Look for markets that are not serviced (which is different to where the person is just too cheap, or averse to tech for other reasons.)
Of course it's a LOT harder now to find unserviced markets. There's a lot more software out there now than there was when I started out. Ultimately though it's about connecting with people - real people not just sending out spam emails. And so meeting the right person at the right time is "lucky". But if you're not out there luck can't work with you. You need to give luck a chance.
A.I. Is the New Annoying Ad You Will See Everywhere – Pixel Envy
Ever since software updates became distributed regularly as part of the SaaS business model, it has become the vendors’ priority to show how clever they are through callouts, balloons, dialogs, toasts, and other in-product advertising. I understand why vendors want users to know about new features. But these promotions are way too much and way too often. Respecting users has long been deprioritized in favour of whatever new thing leads to promotions and bonuses.
Local-first software: You own your data, in spite of the cloud
While cloud apps have become dominant due to their collaborative features, they often compromise user ownership and data longevity. Local-first software seeks to provide a better alternative by prioritizing local storage and networks while still enabling seamless collaboration. The article outlines seven ideals for local-first software, discusses existing technologies, and proposes Conflict-free Replicated Data Types (CRDTs) as a promising foundation for realizing these ideals.
Cloud apps like Google Docs and Trello are popular because they enable real-time collaboration with colleagues, and they make it easy for us to access our work from all of our devices. However, by centralizing data storage on servers, cloud apps also take away ownership and agency from users. If a service shuts down, the software stops functioning, and data created with that software is lost.
In this article we propose “local-first software”: a set of principles for software that enables both collaboration and ownership for users. Local-first ideals include the ability to work offline and collaborate across multiple devices, while also improving the security, privacy, long-term preservation, and user control of data.
This article has also been published in PDF format in the proceedings of the Onward! 2019 conference. Please cite it as:
Martin Kleppmann, Adam Wiggins, Peter van Hardenberg, and Mark McGranaghan. Local-first software: you own your data, in spite of the cloud. 2019 ACM SIGPLAN International Symposium on New Ideas, New Paradigms, and Reflections on Programming and Software (Onward!), October 2019, pages 154–178. doi:10.1145/3359591.3359737
To sum up: the cloud gives us collaboration, but old-fashioned apps give us ownership. Can’t we have the best of both worlds?
We would like both the convenient cross-device access and real-time collaboration provided by cloud apps, and also the personal ownership of your own data embodied by “old-fashioned” software.
In old-fashioned apps, the data lives in files on your local disk, so you have full agency and ownership of that data: you can do anything you like, including long-term archiving, making backups, manipulating the files using other programs, or deleting the files if you no longer want them. You don’t need anybody’s permission to access your files, since they are yours. You don’t have to depend on servers operated by another company.
In cloud apps, the data on the server is treated as the primary, authoritative copy of the data; if a client has a copy of the data, it is merely a cache that is subordinate to the server. Any data modification must be sent to the server, otherwise it “didn’t happen.” In local-first applications we swap these roles: we treat the copy of the data on your local device — your laptop, tablet, or phone — as the primary copy. Servers still exist, but they hold secondary copies of your data in order to assist with access from multiple devices. As we shall see, this change in perspective has profound implications.
For several years the Offline First movement has been encouraging developers of web and mobile apps to improve offline support, but in practice it has been difficult to retrofit offline support to cloud apps, because tools and libraries designed for a server-centric model do not easily adapt to situations in which users make edits while offline.
In local-first apps, our ideal is to support real-time collaboration that is on par with the best cloud apps today, or better. Achieving this goal is one of the biggest challenges in realizing local-first software, but we believe it is possible.
Some file formats (such as plain text, JPEG, and PDF) are so ubiquitous that they will probably be readable for centuries to come. The US Library of Congress also recommends XML, JSON, or SQLite as archival formats for datasets. However, in order to read less common file formats and to preserve interactivity, you need to be able to run the original software (if necessary, in a virtual machine or emulator). Local-first software enables this.
Of these, email attachments are probably the most common sharing mechanism, especially among users who are not technical experts. Attachments are easy to understand and trustworthy. Once you have a copy of a document, it does not spontaneously change: if you view an email six months later, the attachments are still there in their original form. Unlike a web app, an attachment can be opened without any additional login process.
The weakest point of email attachments is collaboration. Generally, only one person at a time can make changes to a file, otherwise a difficult manual merge is required. File versioning quickly becomes messy: a back-and-forth email thread with attachments often leads to filenames such as Budget draft 2 (Jane's version) final final 3.xls.
Web apps have set the standard for real-time collaboration. As a user you can trust that when you open a document on any device, you are seeing the most current and up-to-date version. This is so overwhelmingly useful for team work that these applications have become dominant.
The flip side to this is a total loss of ownership and control: the data on the server is what counts, and any data on your client device is unimportant — it is merely a cache.
We think the Git model points the way toward a future for local-first software. However, as it currently stands, Git has two major weaknesses:
Git is excellent for asynchronous collaboration, especially using pull requests, which take a coarse-grained set of changes and allow them to be discussed and amended before merging them into the shared master branch. But Git has no capability for real-time, fine-grained collaboration, such as the automatic, instantaneous merging that occurs in tools like Google Docs, Trello, and Figma.
Git is highly optimized for code and similar line-based text files; other file formats are treated as binary blobs that cannot meaningfully be edited or merged. Despite GitHub’s efforts to display and compare images, prose, and CAD files, non-textual file formats remain second-class in Git.
A web app in its purest form is usually a Rails, Django, PHP, or Node.js program running on a server, storing its data in a SQL or NoSQL database, and serving web pages over HTTPS. All of the data is on the server, and the user’s web browser is only a thin client.
This architecture offers many benefits: zero installation (just visit a URL), and nothing for the user to manage, as all data is stored and managed in one place by the engineering and DevOps professionals who deploy the application. Users can access the application from all of their devices, and colleagues can easily collaborate by logging in to the same application.
JavaScript frameworks such as Meteor and ShareDB, and services such as Pusher and Ably, make it easier to add real-time collaboration features to web applications, building on top of lower-level protocols such as WebSocket.
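To make that layering concrete, here is a minimal sketch of the kind of relay such frameworks and services wrap around WebSocket. It is my own illustration, assuming the Node "ws" package; the port and message handling are placeholders. Every edit a client sends is simply broadcast to the other connected clients, which is the bare skeleton of real-time collaboration before presence, persistence, or merge logic is layered on top.

```typescript
// Minimal WebSocket relay: whatever one client sends (e.g. a serialized edit)
// is forwarded to every other connected client. Assumes the Node "ws" package.
import { WebSocketServer, WebSocket } from "ws";

const wss = new WebSocketServer({ port: 8080 }); // placeholder port

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    for (const client of wss.clients) {
      // Skip the sender; only relay to peers that are still connected.
      if (client !== socket && client.readyState === WebSocket.OPEN) {
        client.send(data.toString());
      }
    }
  });
});
```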
On the other hand, a web app that needs to perform a request to a server for every user action is going to be slow. It is possible to hide the round-trip times in some cases by using client-side JavaScript, but these approaches quickly break down if the user’s internet connection is unstable.
As we have shown, none of the existing data layers for application development fully satisfy the local-first ideals. Thus, three years ago, our lab set out to search for a solution that gives seven green checkmarks.
Namely: fast, multi-device, offline, collaboration, longevity, privacy, and user control.
Thus, CRDTs have some similarity to version control systems like Git, except that they operate on richer data types than text files. CRDTs can sync their state via any communication channel (e.g. via a server, over a peer-to-peer connection, by Bluetooth between local devices, or even on a USB stick). The changes tracked by a CRDT can be as small as a single keystroke, enabling Google Docs-style real-time collaboration. But you could also collect a larger set of changes and send them to collaborators as a batch, more like a pull request in Git. Because the data structures are general-purpose, we can develop general-purpose tools for storage, communication, and management of CRDTs, saving us from having to re-implement those things in every single app.
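To make the idea of a general-purpose, mergeable data structure concrete, here is a minimal sketch of one of the simplest CRDTs, a grow-only counter. This is my own illustration rather than the paper's Automerge: each replica increments only its own slot, and merging takes the per-replica maximum, so replicas converge to the same value no matter the order, or the channel, over which they exchange state.

```typescript
// Grow-only counter (G-Counter) CRDT sketch. Merging is commutative, associative,
// and idempotent, so state can be exchanged over any channel, in any order,
// and replicas still converge.

type GCounter = Record<string, number>; // replicaId -> count contributed by that replica

function increment(state: GCounter, replicaId: string, by = 1): GCounter {
  return { ...state, [replicaId]: (state[replicaId] ?? 0) + by };
}

function merge(a: GCounter, b: GCounter): GCounter {
  const out: GCounter = { ...a };
  for (const [id, count] of Object.entries(b)) {
    out[id] = Math.max(out[id] ?? 0, count);
  }
  return out;
}

function value(state: GCounter): number {
  return Object.values(state).reduce((sum, n) => sum + n, 0);
}

// Two devices edit independently while offline...
let laptop: GCounter = {};
let phone: GCounter = {};
laptop = increment(laptop, "laptop");
laptop = increment(laptop, "laptop");
phone = increment(phone, "phone");

// ...and converge to the same total regardless of merge order.
console.log(value(merge(laptop, phone))); // 3
console.log(value(merge(phone, laptop))); // 3
```

Automerge applies the same merge discipline to richer types such as maps, lists, and text, which is what allows changes as small as a single keystroke to be synced immediately or batched, as described above.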
we believe that CRDTs have the potential to be a foundation for a new generation of software. Just as packet switching was an enabling technology for the Internet and the web, or as capacitive touchscreens were an enabling technology for smartphones, so we think CRDTs may be the foundation for collaborative software that gives users full ownership of their data.
We are often asked about the effectiveness of automatic merging, and many people assume that application-specific conflict resolution mechanisms are required. However, we found that users surprisingly rarely encounter conflicts in their work when collaborating with others, and that generic resolution mechanisms work well. The reasons for this are:
Automerge tracks changes at a fine-grained level, and takes datatype semantics into account. For example, if two users concurrently insert items at the same position into an array, Automerge combines these changes by positioning the two new items in a deterministic order. In contrast, a textual version control system like Git would treat this situation as a conflict requiring manual resolution.
Users have an intuitive sense of human collaboration and avoid creating conflicts with their collaborators. For example, when users are collaboratively editing an article, they may agree in advance who will be working on which section for a period of time, and avoid concurrently modifying the same section.
Conflicts arise only if users concurrently modify the same property of the same object: for example, if two users concurrently change the position of the same image object on a canvas. In such cases, it is often arbitrary how they are resolved, and either outcome is satisfactory.
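As a rough sketch of how a generic resolution mechanism can be deterministic without being application-specific, consider a last-writer-wins register. This is my own illustration, not Automerge's actual algorithm: concurrent writes to the same property are ordered by a logical counter, with the actor ID as a tie-breaker, so every replica picks the same winner even though the choice itself is arbitrary.

```typescript
// Last-writer-wins (LWW) register sketch. Concurrent writes to the same property
// are ordered by a logical timestamp, with the actor ID as a deterministic
// tie-breaker, so all replicas converge on the same (arbitrary but consistent) value.

interface Write<T> {
  value: T;
  counter: number; // logical clock, e.g. a Lamport timestamp
  actorId: string; // unique per device or user
}

function newer<T>(a: Write<T>, b: Write<T>): Write<T> {
  if (a.counter !== b.counter) return a.counter > b.counter ? a : b;
  return a.actorId > b.actorId ? a : b; // deterministic tie-break
}

// Two users concurrently move the same image on a canvas (same logical time).
const fromAlice: Write<{ x: number; y: number }> = { value: { x: 10, y: 40 }, counter: 7, actorId: "alice" };
const fromBob: Write<{ x: number; y: number }> = { value: { x: 90, y: 15 }, counter: 7, actorId: "bob" };

// Both replicas compute the same winner, regardless of arrival order.
console.log(newer(fromAlice, fromBob).value); // { x: 90, y: 15 }
console.log(newer(fromBob, fromAlice).value); // { x: 90, y: 15 }
```

Roughly the same tie-breaking idea is what lets concurrent inserts at the same list position be placed in a deterministic order, as described above, rather than surfacing as a Git-style merge conflict.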
We experimented with a number of mechanisms for sharing documents with other users, and found that a URL model, inspired by the web, makes the most sense to users and developers. URLs can be copied and pasted, and shared via communication channels such as email or chat. Access permissions for documents beyond secret URLs remain an open research question.
As with a Git repository, what a particular user sees in the “master” branch is a function of the last time they communicated with other users. Newly arriving changes might unexpectedly modify parts of the document you are working on, but manually merging every change from every user is tedious. Decentralized documents enable users to be in control over their own data, but further study is needed to understand what this means in practical user-interface terms.
Performance and memory/disk usage quickly became a problem because CRDTs store all history, including character-by-character text edits. These pile up, but can’t easily be truncated because it’s impossible to know when someone might reconnect to your shared document after six months away and need to merge changes from that point forward.
Servers thus have a role to play in the local-first world — not as central authorities, but as “cloud peers” that support client applications without being on the critical path. For example, a cloud peer that stores a copy of the document, and forwards it to other peers when they come online, could solve the closed-laptop problem above.
These experiments suggest that local-first software is possible. Collaboration and ownership are not at odds with each other — we can get the best of both worlds, and users can benefit.
However, the underlying technologies are still a work in progress. They are good for developing prototypes, and we hope that they will evolve and stabilize in the coming years, but realistically, it is not yet advisable to replace a proven product like Firebase with an experimental project like Automerge in a production setting today.
Most CRDT research operates in a model where all collaborators immediately apply their edits to a single version of a document. However, practical local-first applications require more flexibility: users must have the freedom to reject edits made by another collaborator, or to make private changes to a version of the document that is not shared with others. A user might want to apply changes speculatively or reformat their change history. These concepts are well understood in the distributed source control world as “branches,” “forks,” “rebasing,” and so on. There is little work to date on understanding the algorithms and programming models for collaboration in situations where multiple document versions and branches exist side-by-side.
Different collaborators may be using different versions of an application, potentially with different features. As there is no central database server, there is no authoritative “current” schema for the data. How can we write software so that varying application versions can safely interoperate, even as data formats evolve? This question has analogues in cloud-based API design, but a local-first setting provides additional challenges.
When every document can develop a complex version history, simply through daily operation, an acute problem arises: how do we communicate this version history to users? How should users think about versioning, share and accept changes, and understand how their documents came to be a certain way when there is no central source of truth? Today there are two mainstream models for change management: a source-code model of diffs and patches, and a Google Docs model of suggestions and comments. Are these the best we can do? How do we generalize these ideas to data formats that are not text?
We believe that the assumption of centralization is deeply ingrained in our user experiences today, and we are only beginning to discover the consequences of changing that assumption. We hope these open questions will inspire researchers to explore what we believe is an untapped area.
Some strategies for improving each area:
Fast. Aggressive caching and downloading resources ahead of time can be a way to prevent the user from seeing spinners when they open your app or a document they previously had open. Trust the local cache by default instead of making the user wait for a network fetch.
Multi-device. Syncing infrastructure like Firebase and iCloud makes multi-device support relatively painless, although such services do introduce longevity and privacy concerns. Self-hosted infrastructure like Realm Object Server provides an alternative trade-off.
Offline. In the web world, Progressive Web Apps offer features like Service Workers and app manifests that can help (a cache-first sketch follows this list of strategies). In the mobile world, be aware of WebKit frames and other network-dependent components. Test your app by turning off your WiFi, or using traffic shapers such as the Chrome Dev Tools network condition simulator or the iOS network link conditioner.
Collaboration. Besides CRDTs, the more established technology for real-time collaboration is Operational Transformation (OT), as implemented e.g. in ShareDB.
Longevity. Make sure your software can easily export to flattened, standard formats like JSON or PDF. For example: mass export such as Google Takeout; continuous backup into stable file formats such as in GoodNotes; and JSON download of documents such as in Trello.
Privacy. Cloud apps are fundamentally non-private, with employees of the company and governments able to peek at user data at any time. But for mobile or desktop applications, try to make clear to users when the data is stored only on their device versus being transmitted to a backend.
User control. Can users easily back up, duplicate, or delete some or all of their documents within your application? Often this involves re-implementing all the basic filesystem operations, as Google Docs has done with Google Drive.
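As one concrete way to act on the Fast and Offline suggestions above (trusting the local cache by default so the app opens and works without the network), here is a minimal cache-first Service Worker sketch. The cache name and asset list are placeholders, and a real app would also need an update and cache-invalidation strategy.

```typescript
// sw.ts: minimal cache-first Service Worker sketch. In a real project this file
// would be compiled with TypeScript's "webworker" lib for precise event types;
// the `any` annotations keep the sketch self-contained.

const CACHE = "app-shell-v1"; // placeholder cache name
const ASSETS = ["/", "/index.html", "/app.js", "/styles.css"]; // placeholder asset list

self.addEventListener("install", (event: any) => {
  // Download the app shell ahead of time so the first offline launch still works.
  event.waitUntil(caches.open(CACHE).then((cache) => cache.addAll(ASSETS)));
});

self.addEventListener("fetch", (event: any) => {
  // Cache-first: answer from the local cache immediately and fall back to the
  // network only on a miss, so the user is not left waiting on a fetch.
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```

Testing it is exactly as suggested above: turn off WiFi or use a network conditioner and check that nothing on the app's critical path is left waiting on the network.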
If you are an entrepreneur interested in building developer infrastructure, all of the above suggests an interesting market opportunity: “Firebase for CRDTs.”
Such a startup would need to offer a great developer experience and a local persistence library (something like SQLite or Realm). It would need to be available for mobile platforms (iOS, Android), native desktop (Windows, Mac, Linux), and web technologies (Electron, Progressive Web Apps).
User control, privacy, multi-device support, and collaboration would all be baked in. Application developers could focus on building their app, knowing that the easiest implementation path would also give them top marks on the local-first scorecard. As a litmus test to see if you have succeeded, we suggest: do all your customers’ apps continue working in perpetuity, even if all servers are shut down?
We believe the “Firebase for CRDTs” opportunity will be huge as CRDTs come of age.
In the pursuit of better tools we moved many applications to the cloud. Cloud software is in many regards superior to “old-fashioned” software: it offers collaborative, always-up-to-date applications, accessible from anywhere in the world. We no longer worry about what software version we are running, or what machine a file lives on.
However, in the cloud, ownership of data is vested in the servers, not the users, and so we became borrowers of our own data. The documents created in cloud apps are destined to disappear when the creators of those services cease to maintain them. Cloud services defy long-term preservation. No Wayback Machine can restore a sunsetted web application. The Internet Archive cannot preserve your Google Docs.
In this article we explored a new way forward for software of the future. We have shown that it is possible for users to retain ownership and control of their data, while also benefiting from the features we associate with the cloud: seamless collaboration and access from anywhere. It is possible to get the best of both worlds.
But more work is needed to realize the local-first approach in practice. Application developers can take incremental steps, such as improving offline support and making better use of on-device storage. Researchers can continue improving the algorithms, programming models, and user interfaces for local-first software. Entrepreneurs can develop foundational technologies such as CRDTs and peer-to-peer networking into mature products able to power the next generation of applications.
Today it is easy to create a web application in which the server takes ownership of all the data. But it is too hard to build collaborative software that respects users’ ownership and agency. In order to shift the balance, we need to improve the tools for developing local-first software. We hope that you will join us.
New Apple Stuff and the Regular People
"Will it be different?" is the key question the regular people ask. They don't want there to be extra steps or new procedures. They sure as hell don't want the icons to look different or, God forbid, be moved to a new place.
These bright and capable people who will one day help you through knee replacement surgery all bought a Mac when they were college freshmen and then they never updated it. Almost all of them had the default programs still in the dock. They are regular users. You with all your fancy calendars, note-taking apps and your customized terminal are an outlier. Never forget.
The majority of iPhone users and Mac owners have no idea what's coming though. They are going to wake up on Monday to an unwelcome notification that there is an update available. Many of them will ask their techie friends (like you) if there is a way to make the update notification go away. They will want to know if they have to install it.
Spreadsheet Assassins | Matthew King
The real key to SaaS success is often less about innovative software and more about locking in customers and extracting maximum value. Many SaaS products simply digitize spreadsheet workflows into proprietary systems, making it difficult for customers to switch. As SaaS proliferates into every corner of the economy, it imposes a growing "software tax" on businesses and consumers alike. While spreadsheets remain a flexible, interoperable stalwart, the trajectory of SaaS points to an increasingly extractive model prioritizing rent-seeking over genuine productivity gains.
As a SaaS startup scales, sales and customer support staff pay for themselves, and the marginal cost to serve your one-thousandth versus one-millionth user is near-zero. The result? Some SaaS companies achieve gross profit margins of 75 to 90 percent, rivaling Windows in its monopolistic heyday.
Rent-seeking has become an explicit playbook for many shameless SaaS investors. Private equity shop Thoma Bravo has acquired over four hundred software companies, repeatedly mashing products together to amplify lock-in effects so it can slash costs and boost prices—before selling the ravaged Franken-platform to the highest bidder.
In the Kafkaesque realm of health care, software giant Epic’s 1990s-era UI is still widely used for electronic medical records, a nuisance that arguably puts millions of lives at risk, even as it accrues billions in annual revenue and actively resists system interoperability. SAP, the antiquated granddaddy of enterprise resource planning software, has endured for decades within frustrated finance and supply chain teams, even as thousands of SaaS startups try to chip away at its dominance. Salesforce continues to grow at a rapid clip, despite a clunky UI that users say is “absolutely terrible” and “stuck in the 80s”—hence, the hundreds of “SalesTech” startups that simplify a single platform workflow (and pray for a billion-dollar acquihire to Benioff’s mothership). What these SaaS overlords might laud as an ecosystem of startup innovation is actually a reflection of their own technical shortcomings and bloated inertia.
Over 1,500 software startups are focused on billing and invoicing alone. The glut of tools extends to sectors without any clear need for complex software: no fewer than 378 hair salon platforms, 166 parking management solutions, and 70 operating systems for funeral homes and cemeteries are currently on the market. Billions of public pension and university endowment dollars are being burned on what amounts to hackathon curiosities, driven by the machinations of venture capital and private equity. To visit a much-hyped “demo day” at a startup incubator like Y Combinator or Techstars is to enter a realm akin to a high-end art fair—except the objects being admired are not texts or sculptures or paintings but slightly nicer faces for the drudgery of corporate productivity.
As popular as SaaS has become, much of the modern economy still runs on the humble, unfashionable spreadsheet. For all its downsides, there are virtues. Spreadsheets are highly interoperable between firms, partly because of another monopoly (Excel) but also because the generic .csv format is recognized by countless applications. They offer greater autonomy and flexibility, with tabular cells and formulas that can be shaped into workflows, processes, calculators, databases, dashboards, calendars, to-do lists, bug trackers, accounting workbooks—the list goes on. Spreadsheets are arguably the most popular programming language on Earth.
The most hated workplace software on the planet
LinkedIn, Reddit, and Blind abound with enraged job applicants and employees sharing tales of how difficult it is to book paid leave, how Kafkaesque it is to file an expense, how nerve-racking it is to close out a project. "I simply hate Workday. Fuck them and those who insist on using it for recruitment," one Reddit user wrote. "Everything is non-intuitive, so even the simplest tasks leave me scratching my head," wrote another. "Keeping notes on index cards would be more effective." Every HR professional and hiring manager I spoke with — whose lives are supposedly made easier by Workday — described Workday with a sense of cosmic exasperation.
If candidates hate Workday, if employees hate Workday, if HR people and managers processing and assessing those candidates and employees through Workday hate Workday — if Workday is the most annoying part of so many workers' workdays — how is Workday everywhere? How did a software provider so widely loathed become a mainstay of the modern workplace?
This is a saying in systems thinking: The purpose of a system is what it does (POSIWID), not what it fails to do. And the reality is that what Workday — and its many despised competitors — does for organizations is far more important than the anguish it causes everyone else.
In 1988, PeopleSoft, backed by IBM, built the first fully fledged Human Resources Information System. In 2004, Oracle acquired PeopleSoft for $10.3 billion. One of its founders, David Duffield, then started a new company that upgraded PeopleSoft's model to near limitless cloud-based storage — giving birth to Workday, the intractable nepo baby of HR software.
Workday is indifferent to our suffering in a job hunt, because we aren't Workday's clients, companies are. And these companies — from AT&T to Bank of America to Teladoc — have little incentive to care about your application experience, because if you didn't get the job, you're not their responsibility. For a company hiring and onboarding on a global scale, it is simply easier to screen fewer candidates if the result is still a single hire.
A search on a job board can return hundreds of listings for in-house Workday consultants: IT and engineering professionals hired to fix the software promising to fix processes.
For recruiters, Workday also lacks basic user-interface flexibility. When you promise ease-of-use and simplicity, you must deliver on the most basic user interactions. And yet: Sometimes searching for a candidate, or locating a candidate's status feels impossible. This happens outside of recruiting, too, where locating or attaching a boss's email to approve an expense sheet is complicated by the process, not streamlined. Bureaucratic hell is always about one person's ease coming at the cost of someone else's frustration, time wasted, and busy work. Workday makes no exceptions.
Workday touts its ability to track employee performance by collecting data and marking results, but it is employees who must spend time inputting this data. A creative director at a Fortune 500 company told me how in less than two years his company went "from annual reviews to twice-annual reviews to quarterly reviews to quarterly reviews plus separate twice-annual reviews." At each interval higher-ups pressed HR for more data, because they wanted what they'd paid for with Workday: more work product. With a press of a button, HR could provide that, but the entire company suffered thousands more hours of busy work. Automation made it too easy to do too much. (Workday's "customers choose the frequency at which they conduct reviews, not Workday," said the spokesperson.)
At the scale of a large company, this is simply too much work to expect a few people to do and far too user-specific to expect automation to handle well. It's why Workday can be the worst while still allowing that Paychex is the worst, Paycom is the worst, Paycor is the worst, and Dayforce is the worst. "HR software sucking" is a big tent.
Workday finds itself between enshittification steps two and three. The platform once made things faster and simpler for workers. But today it abuses workers by cutting corners on job-application and reimbursement procedures. In the process, it provides the value of a one-stop HR shop to its paying customers. It seems it's only a matter of time before Workday and its competitors try to split the difference and cut those same corners with the accounts that pay their bills.
Workday reveals what's important to the people who run Fortune 500 companies: easily and conveniently distributing busy work across large workforces. This is done with the arbitrary and perfunctory performance of work tasks (like excessive reviews) and with the throttling of momentum by making finance and HR tasks difficult. If your expenses and reimbursements are difficult to file, that's OK, because the people above you don't actually care if you get reimbursed. If it takes applicants 128% longer to apply, the people who implemented Workday don't really care. Throttling applicants is perhaps not intentional, but it's good for the company.
The Mac Turns Forty – Pixel Envy
As for a Hall of Shame thing? That would be the slow but steady encroachment of single-window applications in MacOS, especially via Catalyst and Electron. The reason I gravitated toward MacOS in the first place is the same reason I continue to use it: it fits my mental model of how an operating system ought to work.
Quality software deserves your hard‑earned cash
Quality software from independent makers is like quality food from the farmer’s market. A jar of handmade organic jam is not the same as mass-produced corn syrup-laden jam from the supermarket.
Industrial fruit jam is filled with cheap ingredients and shelf stabilizers. Industrial software is filled with privacy-invasive trackers and proprietary formats.
Google, Apple, and Microsoft make industrial software. Like industrial jam, industrial software has its benefits — it’s cheap, fairly reliable, widely available, and often gets the job done.
Big tech companies have the ability to make their software cheap by subsidizing costs in a variety of ways:
Google sells highly profitable advertising and makes its apps free, but you are subjected to ads and privacy-invasive tracking.
Apple sells highly profitable devices and makes its apps free, but locks you into a proprietary ecosystem.
Microsoft sells highly profitable enterprise contracts using a bundling strategy, and makes its apps cheap, also locking you into a proprietary ecosystem.
I’m not saying these companies are evil. But their subsidies create the illusion that all software should be cheap or free.
Independent makers of quality software go out of their way to make apps that are better for you. They take a principled approach to making tools that don’t compromise your privacy, and don’t lock you in.
Independent software makers are people you can talk to. Like quality jam from the farmer’s market, you might become friends with the person who made it — they’ll listen to your suggestions and your complaints.
Big tech companies earn hundreds of billions of dollars and employ hundreds of thousands of people. When they make a new app, they can market it to their billions of customers easily. They have unbeatable leverage over the cost of developing and maintaining their apps.
Making Our Hearts Sing - Discussion on Hacker News
A lot of people see software as a list of features, hardware as a list of specs. But when you think about how much time we spend with these things, maybe they just aren’t that utilitarian. We think of buildings not just as volumes of conditioned air — but also as something architected, as something that can have a profound effect on how you feel, something that can have value in itself (historical buildings and such).
How Google Docs Proved the Power of Less | WIRED
r/MacOSBeta - Comment by u/adh1003 on “Is it just me or did macOS already peak?”
Apple’s use of AppKit, Mac Catalyst and SwiftUI in macOS
Compilation of good macOS Software
Keyboard Shortcuts for Everything