OUTSIDERS social signals 2024
Muse retrospective by Adam Wiggins
- Wiggins focused on storytelling and brand-building for Muse, achieving early success with an email newsletter, which helped engage potential users and refine the product's value proposition.
- Muse aspired to a "small giants" business model, emphasizing quality, autonomy, and a healthy work environment over rapid growth. They sought to avoid additional funding rounds by charging a prosumer price early on.
- Short demo videos on Twitter showcasing the app in action proved to be the most effective method for attracting new users.
Muse as a brand and a product represented something aspirational. People want to be deeper thinkers, to be more strategic, and to use cool, status-quo-challenging software made by small passionate teams. These kinds of aspirations are easier to indulge in times of plenty. But once you're getting laid off from your high-paying tech job, or struggling to raise your next financing round, or scrambling to protect your kids' college fund from runaway inflation and uncertain markets... I guess you don't have time to be excited about cool demos on Twitter and thoughtful podcasts on product design.
I’d speculate that another factor is the half-life of cool new productivity software. Evernote, Slack, Notion, Roam, Craft, and many others seem to get pretty far on community excitement for their first few years. After that, I think you have to be left with software that serves a deep and hard-to-replace purpose in people’s lives. Muse got there for a few thousand people, but the economics of prosumer software means that just isn’t enough. You need tens of thousands, hundreds of thousands, to make the cost of development sustainable.
We envisioned Muse as the perfect combination of the freeform elements of a whiteboard, the structured text-heavy style of Notion or Google Docs, and the sense of place you get from a “virtual office” à la group chat. As a way to asynchronously trade ideas and inspiration, sketch out project ideas, and explore possibilities, the multiplayer Muse experience is, in my honest opinion, unparalleled for small creative teams working remotely.
But friction began almost immediately. The team lead or organizer was usually the one bringing Muse to the team, and they were already a fan of its approach. But the other team members were generally a little annoyed at having to learn any new tool, and Muse’s steeper learning curve only made that worse. Those team members would push the problem back to the team lead, treating them as customer support (rather than contacting us directly for help). The team lead often felt like too much of the burden of pushing Muse adoption was on their shoulders.
This was in addition to the obvious product gaps, like: no support for the web or Windows; minimal or no integration with other key tools like Notion and Google Docs; and no permissions or support for multiple workspaces. Had we raised $10M back during the cash party of 2020–2021, we could have hired the 15+ person team that would have been necessary to build all of that. But with only seven people (we had added two more people to the team in 2021–2022), it just wasn’t feasible.
We focused on neither a particular vertical (academics, designers, authors...) nor a narrow use case (PDF reading/annotation, collaborative whiteboarding, design sketching...). That meant we were always spread pretty thin in terms of feature development, and marketing was difficult even over and above the problem of explaining canvas software and digital thinking tools.
being general-purpose was in its blood from birth. Part of it was maker's hubris: don't we always dream of general-purpose tools that will be everything to everyone? And part of it was that Muse truly excels at combining so many different related knowledge tasks and media types into a single, minimal, powerful canvas. Not sure what I would do differently here, even with the benefit of hindsight.
Muse built a lot of its reputation on being principled, but we were maybe too cautious to do the mercenary things that help you succeed. A good example here is asking users for ratings; I felt this offered no benefit to users and was a distraction while they were trying to get something done in the app. Our App Store rating was on the low side (~3.9 stars) for most of our existence. When we finally added the standard prompt-for-rating dialog, it instantly shot up to ~4.7 stars. This was a small example of being too principled about doing good for the user, and not thinking about what would benefit our business.
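For context, the "standard prompt-for-rating dialog" on iOS is a one-line StoreKit request. Here's a minimal sketch of how an app might wire it up — the view name and trigger threshold are hypothetical, and Apple ultimately decides whether (and how often) the dialog appears:

```swift
import StoreKit
import SwiftUI

// Minimal sketch of the standard rating prompt (iOS 16+).
// The trigger condition is hypothetical; Apple rate-limits the
// dialog to a few displays per year and may suppress it entirely.
struct BoardListView: View {
    @Environment(\.requestReview) private var requestReview
    @AppStorage("boardsCreated") private var boardsCreated = 0

    var body: some View {
        Button("New Board") {
            boardsCreated += 1
            if boardsCreated == 5 {  // hypothetical "engaged user" moment
                requestReview()      // asks StoreKit to show the rating dialog
            }
        }
    }
}
```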
Growing the team slowly was a delight. At several previous ventures, I've onboarded people in the hiring-is-job-one environment of a growth startup. At Muse, we started with three founders and then hired roughly one person per year. This was absolutely fantastic for being able to really take our time to find the perfect person for the role, and then for that person to have tons of time to onboard and find their footing on the team before anyone new showed up. The resulting team was the best I've ever worked on, with minimal deadweight or emotional baggage.
ultimately your product does have to have some web presence. My biggest regret is not building a simple share-to-web function early on, which could have created some virality and a great deal of utility for users as well.
In terms of development speed, quality of the resulting product, hardware integration, and a million other things: native app development wins.
After decades working in product development, being on the marketing/brand/growth/storytelling side was a huge personal challenge for me. But I feel like I managed to grow into the role and find my own approach (podcasting, demo videos, etc) to create a beacon to attract potential customers to our product.
when it comes time for an individual or a team to sit down and sketch out the beginnings of a new business, a new book, a new piece of art—this almost never happens at a computer. Or if it does, it’s a cobbled-together collection of tools like Google Docs and Zoom which aren’t really made for this critical part of the creative lifecycle.
any given business will find a small number of highly-effective channels, and the rest don't matter. For Heroku, that was attending developer conferences and getting blog posts on Hacker News. For another business it might be YouTube influencer sponsorships and print ads in a niche magazine. So I set about systematically testing many channels.
Even After 40 Years, the Macintosh’s Spirit is the Same
The Mac Turns Forty – Pixel Envy
As for a Hall of Shame thing? That would be the slow but steady encroachment of single-window applications in macOS, especially via Catalyst and Electron. The reason I gravitated toward macOS in the first place is the same reason I continue to use it: it fits my mental model of how an operating system ought to work.
Elegy for the Native Mac App
Tracing a trendline from the start of the Mac app platform to the future of visionOS
In recent years Sketch’s Mac-ness has become a liability. Requiring every person in a large design organization to use a Mac is not an easy sell. Plus, a new generation of “internet native” users expect different things from their software than old-school Mac connoisseurs: Multiplayer editing, inline commenting, and cloud sync are now table-stakes for any successful creative app.
At the time of Sketch’s launch most UX designers were using Photoshop or Illustrator. Both were expensive and overwrought, and neither were actually created for UX design. Sketch’s innovation wasn’t any particular feature — if anything it was the lack of features. It did a few things really well, and those were exactly the things UX designers wanted. In that way it really embodied the Mac ethos: simple, single-purpose, and fun to use.
Apple pushed hard to attract artists, filmmakers, musicians, and other creative professionals. It started a virtuous cycle. More creatives using Macs meant more potential customers for creative Mac software, which meant more developers started building that software, which in turn attracted even more customers to the platform. And so the Mac ended up with an abundance of improbably-good creative tools. Usually these apps weren’t as feature-rich or powerful as their PC counterparts, but were faster and easier and cheaper and just overall more conducive to the creative process.
Apple is still very interested in selling Macs — precision-milled aluminum computers with custom-designed chips and “XDR” screens. But they no longer care much about The Mac: The operating system, the software platform, its design sensibilities, its unique features, its vibes.
The term-of-art for this style is “skeuomorphism”: modern designs inspired by their antecedents — calculator apps that look like calculators, password-entry fields that look like bank vaults, reminders that look like sticky notes, etc. This skeuomorphic playfulness made downloading a new Mac app delightful. The discomfort of opening a new unfamiliar piece of software was totally offset by the joy of seeing a glossy pixel-perfect rendition of a bookshelf or a bodega or a poker table, complete with surprising little animations.
There are literally dozens of ways to develop cross-platform apps, including Apple’s own Catalyst — but so far, none of these tools can create anything quite as polished as native implementations. So it comes down to user preference: Would you rather have the absolute best app experience, or do you want the ability to use an acceptably-functional app from any of your devices? It seems that users have shifted to prefer the latter.
Unfortunately the appeal of native Mac software was, at its core, driven by brand strategy. Mac users were sold on the idea that they were buying not just a device but an ecosystem, an experience. Apple extended this branding for third-party developers with its yearly Apple Design Awards.
for the first time since the introduction of the original Mac, they’re just computers. Yes, they were technically always “just computers”, but they used to feel like something bigger. Now Macs have become just another way, perhaps the best way, to use Slack or VSCode or Figma or Chrome or Excel.
visionOS’s story diverges from that of the Mac. Apple is no longer a scrappy upstart. Rather, they’re the largest company in the world by market cap. It’s not so much that Apple doesn’t care about indie developers anymore, it’s just that indie developers often end up as the ants crushed beneath Apple’s giant corporate feet.
I think we’ll see a lot of cool indie software for visionOS, but also I think most of it will be small utilities or toys. It takes a lot of effort to build and support apps that people rely on for their productivity or creativity. If even the wildly-popular Mac platform can’t support those kinds of projects anymore, what chance does a luxury headset have?
Michael Tsai - Blog - Elegy for the Native Mac App
Reddit API AMA and User Revolt
good roundup of comments about the Reddit API debacle caused by CEO Steve Huffman
Reddit is rumored to have plans to go public, but they need better leadership than the current team. Huffman has shown no leadership skills. He doesn’t know how to read the room. Most importantly, he lacks the social empathy to lead a social platform. Even more disappointing is the lack of comment or intervention from the always-chatty Reddit co-founder Alexis Ohanian, who seems to have advice for every other founder except his own co-founder.
[…]
In an attempt to monetize the content generated by the community, Huffman forgot that it is the people who make the platform. The community is the platform. It is something the owners of social media platforms forget.
[…]
It happened with MySpace. It has happened with Twitter. It is now happening with Reddit. They never learn from past mistakes. They assume that because they own the platform, they own the community. Every time they forget that important thing, they erode the community’s trust. And once that trust goes, so does the unfettered loyalty. People start looking for options.
I have zero faith in Steve Huffman’s ability to lead Reddit.
What kind of chief executive officer posts this comment after a massive community backlash?
closing off third-party API access mostly serves an IPO, not a defense against OpenAI. If Reddit merely wanted to restrict the ability to scrape its data, they could have done so without killing off clients – e.g. via licensing deals. However, if access to training data is seen as an elbows-out brawl, I could see how Reddit would be extremely protective of its data. I mean, lyrics websites, map makers, and dictionaries go to great lengths to protect their data. It would not be a giant stretch for Reddit to do so as well.
Huffman is right that, in the end, the whole situation reflects a product problem: the native Reddit apps, both on desktop and on mobile, are ugly and difficult to use. (In particular, I find the nested comments under each post bizarrely difficult to expand or collapse; the tap targets for your fingers are microscopic.) Reddit didn’t really navigate the transition to mobile devices so much as it endured it; it’s little wonder that millions of the service’s power users have sought refuge in third-party apps with more modern designs.
Design with SwiftUI - WWDC23 - Videos - Apple Developer
The products that we build contain complex flows and highly interactive elements. As a result, there are so many important decisions that we need to make. SwiftUI helps by quickly surfacing all of those important details that need your attention, for example, how an image should look when it's loading or how a button appears when it's pressed. These are the types of things that make a product feel complete. They're easily hidden in static design tools but are quickly surfaced when working in a dynamic tool like SwiftUI. That's because SwiftUI makes it easy to build your designs on device. In doing this, you gain a more complete understanding of what you're making. Separate parts now interact together, and you can begin to evaluate the experience as a whole. This process quickly reveals what's working in your design and what still needs attention or polish. On Maps, we've found this to be tremendously helpful.
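A minimal sketch of the two states the session calls out — an image's loading appearance and a button's pressed appearance. The view and style names here are illustrative, not from the talk:

```swift
import SwiftUI

// Two "easily hidden" states made explicit: the image's loading
// placeholder and the button's pressed appearance.
struct PlaceCard: View {
    let imageURL: URL

    var body: some View {
        VStack(spacing: 12) {
            // AsyncImage forces an up-front decision about the loading state.
            AsyncImage(url: imageURL) { image in
                image.resizable().scaledToFit()
            } placeholder: {
                ProgressView()  // what the user sees before the image arrives
            }

            Button("Get Directions") { /* navigate */ }
                .buttonStyle(PressFeedbackStyle())
        }
        .padding()
    }
}

// A custom ButtonStyle surfaces the pressed state explicitly.
struct PressFeedbackStyle: ButtonStyle {
    func makeBody(configuration: Configuration) -> some View {
        configuration.label
            .padding(.horizontal, 16)
            .padding(.vertical, 10)
            .background(configuration.isPressed ? Color.blue.opacity(0.6) : Color.blue)
            .foregroundStyle(.white)
            .clipShape(Capsule())
            .scaleEffect(configuration.isPressed ? 0.97 : 1.0)
    }
}
```

Running this on device is exactly the talk's point: the pressed-state scale effect and the loading spinner only reveal themselves in a live, dynamic tool, not in a static mockup.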
The Shocking State of Enthusiast Apps on Android
Meanwhile, Over in Androidtown
Making Our Hearts Sing
One thing I learned long ago is that people who prioritize design, UI, and UX in the software they prefer can empathize with and understand the choices made by people who prioritize other factors (e.g. raw feature count, or the ability to tinker with their software at the system level, or software being free-of-charge). But it doesn’t work the other way: most people who prioritize other things can’t fathom why anyone cares deeply about design/UI/UX because they don’t perceive it. Thus they chalk up iOS and native Mac-app enthusiasm to being hypnotized by marketing, Pied Piper style.
Those who see and value the artistry in software and interface design have overwhelmingly wound up on iOS; those who don’t have wound up on Android. Of course there are exceptions. Of course there are iOS users and developers who are envious of Android’s more open nature. Of course there are Android users and developers who do see how crude the UIs are for that platform’s best-of-breed apps. But we’re left with two entirely different ecosystems with entirely different cultural values — nothing like (to re-use my example from yesterday) the Coke-vs.-Pepsi state of affairs in console gaming platforms.
Rebuilding Society on Meaning (Improved version) - YouTube
Arc is the best web browser to come out in the last decade
Folk Interfaces
You can look at an interface and see it as a clearly signposted user journey you should follow. Or you can see it as a collection of functions and affordances to repurpose. As raw material, rather than a guided path.
Instagram, TikTok, and the Three Trends
In other words, when Kylie Jenner posts a petition demanding that Meta “Make Instagram Instagram again”, the honest answer is that changing Instagram is the most Instagram-like behavior possible.
The first trend is the shift towards ever more immersive mediums. Facebook, for example, started with text but exploded with the addition of photos. Instagram started with photos and expanded into video. Gaming was the first to make this progression, and is well into the 3D era. The next step is full immersion — virtual reality — and while the format has yet to penetrate the mainstream, this progression in mediums is perhaps the most obvious reason to be bullish about the possibility.
The second trend is the increase in artificial intelligence. I’m using the term colloquially to refer to the overall trend of computers getting smarter and more useful, even if those smarts are a function of simple algorithms, machine learning, or, perhaps someday, something approaching general intelligence.
The third trend is the change in interaction models from user-directed to computer-controlled. The first version of Facebook relied on users clicking on links to visit different profiles; the News Feed changed the interaction model to scrolling. Stories reduced that to tapping, and Reels/TikTok is about swiping. YouTube has gone further than anyone here: Autoplay simply plays the next video without any interaction required at all.
Rewilding your attention
our truly quirky dimensions are never really grasped by these recommendation algorithms. They have all the dullness of a Demographics 101 curriculum; they sketch our personalities with the crudity of crime-scene chalk-outlines. They’re not wrong about us; but they’re woefully incomplete.
The metaphor suggests precisely what to do: If you want to have wilder, curiouser thoughts, you have to avoid the industrial monocropping of big-tech feeds. You want an intellectual forest, overgrown with mushrooms and towering weeds and a massive dead log where a family of raccoons has taken up residence.
For me, it’s meant slowly — over the last few years — building up a big, rangy collection of RSS feeds that let me check up on hundreds of eclectic blogs and publications and people. (I use Feedly.) I’ve also started using Fraidycat, a niftily quixotic feed-reader that lets you sort sources into buckets by “how often should I check this source”, which is a cool heuristic; some people/sites you want to check every day, and others, twice a year.
Other times I spend an hour or two simply prospecting — I pick a subject almost at random, then check to see if there’s a hobbyist or interest-group discussion board devoted to it. (There usually is, running on free warez like phpBB.) Then I’ll just trawl through the forum to find out: what does this community care about?
On the Internet, We’re Always Famous - The New Yorker
I’ve come to believe that, in the Internet age, the psychologically destabilizing experience of fame is coming for everyone. Everyone is losing their minds online because the combination of mass fame and mass surveillance increasingly channels our most basic impulses—toward loving and being loved, caring for and being cared for, getting the people we know to laugh at our jokes—into the project of impressing strangers, a project that cannot, by definition, sate our desires but feels close enough to real human connection that we cannot but pursue it in ever more compulsive ways.
It seems distant now, but once upon a time the Internet was going to save us from the menace of TV. Since the late fifties, TV has had a special role, both as the country’s dominant medium, in audience and influence, and as a bête noire for a certain strain of American intellectuals, who view it as the root of all evil. In “Amusing Ourselves to Death,” from 1985, Neil Postman argues that, for its first hundred and fifty years, the U.S. was a culture of readers and writers, and that the print medium—in the form of pamphlets, broadsheets, newspapers, and written speeches and sermons—structured not only public discourse but also modes of thought and the institutions of democracy itself. According to Postman, TV destroyed all that, replacing our written culture with a culture of images that was, in a very literal sense, meaningless. “Americans no longer talk to each other, they entertain each other,” he writes. “They do not exchange ideas; they exchange images. They do not argue with propositions; they argue with good looks, celebrities and commercials.”
What comes after smartphones? — Benedict Evans
Mainframes were followed by PCs, and then the web, and then smartphones. Each of these new models started out looking limited and insignificant, but each of them unlocked a new market that was so much bigger that it pulled in all of the investment, innovation and company creation and so grew to overtake the old one. Meanwhile, the old models didn’t go away, and neither, mostly, did the companies that had been created by them. Mainframes are still a big business and so is IBM; PCs are still a big business and so is Microsoft. But they don’t set the agenda anymore - no-one is afraid of them.
We’ve spent the last few decades getting to the point that we can now give everyone on earth a cheap, reliable, easy-to-use pocket computer with access to a global information network. But so far, though over 4bn people have one of these things, we’ve only just scratched the surface of what we can do with them.
There’s an old saying that the first fifty years of the car industry were about creating car companies and working out what cars should look like, and the second fifty years were about what happened once everyone had a car - they were about McDonalds and Walmart, suburbs and the remaking of the world around the car, for good and of course bad. The innovation in cars became everything around the car. One could suggest the same today about smartphones - now the innovation comes from everything else that happens around them.
LinkedIn’s Alternate Universe - Divinations
Every platform has its royalty. On Instagram it's influencers, foodies, and photographers. Twitter belongs to the founders, journalists, celebrities, and comedians. On LinkedIn, it’s hiring managers, recruiters, and business owners who hold power on the platform and have the ear of the people.
On a job site, they’re the provisioners of positions and never miss the chance to regale their audience with their professional deeds: hiring a teenager with no experience, giving a stressed single mother a chance to provide for her family, or seeing past a candidate’s imperfections to give them a once-in-a-lifetime opportunity. These stories are relayed dramatically in what’s now recognizable as LinkedIn-style storytelling, one spaced sentence at a time, told by job-givers with a savior complex.
Wikipedia Is the Last Best Place on the Internet | WIRED