The GOP's Blisteringly Hypocritical Road From Whining About Net Neutrality To Supporting Trump's Idiotic Attack On Social Media
37 Easy Ways to Spice Up Your UI Designs
Thoughtless
Art + tech
What is a feed? (a.k.a. RSS) Explanation
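(For context: a feed is just an XML document a site publishes so readers can poll it for new items. A minimal illustrative sketch, with invented feed contents, using only the Python standard library:)

```python
import xml.etree.ElementTree as ET

# A minimal, hand-written RSS 2.0 feed (invented example data).
RSS = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <link>https://example.com</link>
    <item>
      <title>First post</title>
      <link>https://example.com/first</link>
      <pubDate>Mon, 01 Jan 2024 00:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

def items(feed_xml):
    """Return (title, link) pairs for each <item> in an RSS feed."""
    root = ET.fromstring(feed_xml)
    return [(i.findtext("title"), i.findtext("link"))
            for i in root.iter("item")]

print(items(RSS))
```

A feed reader does nothing more exotic than this: fetch the XML on a schedule, diff the items against what it has seen, and surface the new ones.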
Tailwind CSS Tutorial and Examples for Beginners | Codete Blog
Data-Driven Design is Killing Our Instincts
It creates more generic-looking interfaces that may perform well in numbers but fall short of appealing to our senses.
It’s easy to make data-driven design decisions, but relying on data alone ignores that some goals are difficult to measure. Data is very useful for incremental, tactical changes, but only if it’s checked and balanced by our instincts and common sense.
It became clear to the team in that moment that we cared about more than just clicks. We had other goals for this design: It needed to set expectations about what happens next, it needed to communicate quality, and we wanted it to build familiarity and trust in our brand. We could have easily measured how many customers clicked one button versus another, and used that data to pick an optimal button. But that approach would have ignored the big picture and other important goals.
Not everything that can be counted counts. Not everything that counts can be counted. Data is good at measuring things that are easy to measure. Some goals are less tangible, but that doesn’t make them less important. While you’re chasing a 2% increase in conversion rate you may be suffering a 10% decrease in brand trustworthiness. You’ve optimized for something that’s objectively measured, at the cost of goals that aren’t so easily codified.
Design instinct is a lot more than innate creative ability and cultural guesswork. It’s your wealth of experience. It’s familiarity with industry standards and best practices.
Overreliance on data to drive design decisions can be just as harmful as ignoring it. Data only tells one kind of story. But your project goals are often more complex than that. Goals can’t always be objectively measured.
The vanishing designer
Ventusky - Detailed Weather Forecast Maps
Giving a Shit as a Service
Critical Attrition | The Editors
The main problem with the book review today is not that its practitioners live in New York, as some contend. It is not that the critics are in cahoots with the authors under review, embroiled in a shadow economy of social obligation and quid pro quo favor trading. The problem is not that book reviews are too mean or too nice, too long or too short, though they may be those things, too. The main problem is that the contemporary American book review is first and foremost an audition — for another job, another opportunity, another day in the content mine, hopefully with better lighting and tools, but at the very least with better pay. What kind of job or opportunity for the reviewer depends on her ambitions.
He wants honest reviews of novels; instead he gets hype and a dizzying, outrageous, stultifying profusion of adjectives. He wants serious literary criticism of novels; instead he gets withering assessments of the era in which said novels are written, which, by endlessly discussing the same five novels, only confirm his fears that literature has reached the unsustainably small gene-pool era of its long, slow slide to extinction. The contemporary reviewer is unhappy too: she works too hard, and still everything she does is wrong and insufficiently compensated. Her careful reviews end up reading like stenography, and when she swings for the fences her actual readers — unlike the trigger-happy tweeters — complain that she has swung too far.
Kevin Kelly on Why Technology Has a Will
The game is that every time we create a new technology, we’re creating new possibilities, new choices that didn’t exist before. Those choices themselves—even the choice to do harm—are a good, they’re a plus.
We want an economy that’s growing in the second sense: unlimited betterment, unlimited increase in wisdom, and complexity, and choices. I don’t see any limit there. We don’t want an economy that’s just getting fatter and fatter, and bigger and bigger, in terms of its size. Can we imagine such a system? That’s hard, but I don’t think it’s impossible.
From social skills to sleep patterns: all the self-help advice that actually works
in terms of innovation and economic output, the people in these regions are about eight times more productive than the average person.
These regions in 2008 were: (1) Greater Tokyo (2) Boston-Washington corridor (3) Chicago to Pittsburgh (4) Amsterdam-Brussels-Antwerp (5) Osaka-Nagoya (6) London and South East England (7) Milan to Turin (8) Charlotte to Atlanta (9) Southern California (LA to San Diego) (10) Frankfurt to Mannheim. Silicon Valley, Paris, Berlin, and Denver-Boulder also deserve a mention as having some of the highest rates of innovation per person.
On the Social Media Ideology
Social networking is much more than just a dominant discourse. We need to go beyond text and images and include its software, interfaces, and networks that depend on a technical infrastructure consisting of offices and their consultants and cleaners, cables and data centers, working in close concert with the movements and habits of the connected billions. Academic internet studies circles have shifted their attention from utopian promises, impulses, and critiques to “mapping” the network’s impact. From digital humanities to data science we see a shift in network-oriented inquiry from Whether and Why, What and Who, to (merely) How. From a sociality of causes to a sociality of net effects. A new generation of humanistic researchers is lured into the “big data” trap, and kept busy capturing user behavior whilst producing seductive eye candy for an image-hungry audience (and vice versa).
We need to politicize the New Electricity, the privately owned utilities of our century, before they disappear into the background.
What remains particularly unexplained is the apparent paradox between the hyper-individualized subject and the herd mentality of the social.
Before we enter the social media sphere, everyone first fills out a profile and chooses a username and password in order to create an account. Minutes later, you’re part of the game and you start sharing, creating, playing, as if it has always been like that. The profile is the a priori part and the profiling and targeted advertising cannot operate without it. The platforms present themselves as self-evident. They just are—facilitating our feature-rich lives. Everyone that counts is there. It is through the gate of the profile that we become its subject.
We pull in updates, 24/7, in a real-time global economy of interdependencies, having been taught to read news feeds as interpersonal indicators of the planetary condition.
Treating social media as ideology means observing how it binds together media, culture, and identity into an ever-growing cultural performance (and related “cultural studies”) of gender, lifestyle, fashion, brands, celebrity, and news from radio, television, magazines, and the web—all of this imbricated with the entrepreneurial values of venture capital and start-up culture, with their underside of declining livelihoods and growing inequality.
Software, or perhaps more precisely operating systems, offer us an imaginary relationship to our hardware: they do not represent transistors but rather desktops and recycling bins. Software produces users. Without an operating system (OS) there would be no access to hardware; without an OS, no actions, no practices, and thus no user. Each OS, through its advertisements, interpellates a “user”: calls it and offers it a name or image with which to identify.
We could say that social media performs the same function, and is even more powerful.
In the age of social media we seem to confess less what we think. It’s considered too risky, too private. We share what we do, and see, in a staged manner. Yes, we share judgments and opinions, but no thoughts. Our Self is too busy for that, always on the move, flexible, open, sporty, sexy, and always ready to connect and express.
Platforms are not stages; they bring together and synthesize (multimedia) data, yes, but what is lacking here is the (curatorial) element of human labor. That’s why there is no media in social media. The platforms operate because of their software, automated procedures, algorithms, and filters, not because of their large staff of editors and designers. Their lack of employees is what makes current debates in terms of racism, anti-Semitism, and jihadism so timely, as social media platforms are currently forced by politicians to employ editors who will have to do the all-too-human monitoring work (filtering out ancient ideologies that refuse to disappear).
The international guide to gender-inclusive writing • UX Content Collective
How Technocrats Triumphed at Apple
Simultaneous Translation in HTML
How Can We Make Presentations Better? – iA
Social Mistakes Intellectual People Can Make | www.succeedsocially.com
Social Skills Guide
His PTSD, and My Struggle to Live With It
America’s unending horizon of mass shootings
MDLIVE Telehealth - Dashboard
Deep Laziness
Imagine a person who is very lazy at work, yet whose customers are (along with everyone else concerned) quite satisfied. It could be a slow-talking rural shop proprietor from an old movie, or some kind of Taoist fisherman – perhaps a bit of a buffoon, but definitely deeply content. In order to be this way, he must be reasonably organized: stock must be ordered, and tackle squared away, in order to afford worry-free, deep-breathing laziness.
Consider this imaginary person as a kind of ideal or archetype. Now consider that the universe might have this personality.
Info — Pomp&Clout
Ryan Staake
One startup's quest to take on Chrome and reinvent the web browser
Miller is the CEO of a new startup called The Browser Company, and he wants to change the way people think about browsers altogether. He sees browsers as operating systems, and likes to wonder aloud what "iOS for the web" might look like. What if your browser could build you a personalized news feed because it knows the sites you go to? What if every web app felt like a native app, and the browser itself was just the app launcher? What if you could drag a file from one tab to another, and it just worked? What if the web browser was a shareable, synced, multiplayer experience?
Miller became convinced that the next big platform was right in front of his face: the open web. The underlying infrastructure worked, the apps were great, there were no tech giants in the way imposing rules and extracting huge commissions. The only thing missing was a tool to bring it all together in a user-friendly way, and make the web more than the sum of its parts.
Browser's team instead spent its time thinking about how to solve things like tab overload, that all-too-familiar feeling of not being able to find anything in a sea of tiny icons at the top of the screen. That's something Nate Parrott, a designer on the team, had been thinking about for a long time. "Before I met Josh," he said, "I had this fascination with browsers, because it's the window through which you experience so much of the web, and yet it feels like no one is working on web browsers." Outside of his day job at Snap, he was also building a web browser with some new interaction ideas. "A big one for me was that I wanted to get rid of the distinction between open and closed tabs," he said. "I wanted to encourage tab-hoarding behavior, where you can open as many tabs as you want and organize them so you're not constantly overwhelmed seeing them all at the same time."
One of Arc's most immediately noticeable features is that it combines bookmarks and tabs. Clicking an icon in the sidebar opens the app, just like on iOS or Android. When users navigate somewhere else, they don't have to close the tab; it just waits in the background until it's needed again, and Arc manages its background performance so it doesn't use too much memory. Instead of opening Gmail in a tab, users just … open Gmail.
Everyone at The Browser Company swears there's no Master Plan, or much of a roadmap. What they have is a lot of ideas, a base on which they can develop really quickly, and a deep affinity for prototypes. "You can't just think really hard and design the best web browser," Parrott said. "You have to feel it and put it in front of people and get them to react to it."
The Browser Company could become an R&D shop, full of interesting ideas but unable to build a browser that anyone actually uses. The company does have plenty of runway: it recently raised more than $13 million in funding from investors including Jeff Weiner, Eric Yuan, Patrick Collison, Fidji Simo and a number of other people with long experience building for the internet, in a round that values The Browser Company at $100 million. Still, Agrawal said, "We're paranoid that we could end up in this world of just having a Bell Labs kind of situation, where you have a lot of interesting stuff, but it's not monetizable, it's not sticky, any of that." That's why they're religious about talking to users all the time, getting feedback on everything, making sure that the stuff they're building is genuinely useful. And when it's not, they pivot fast.
merely
Folk (Browser) Interfaces
For the layman to build their own Folk Interfaces, jigs to wield the media they care about, we must offer simple primitives. A designer in Blender thinks in terms of lighting, camera movements, and materials. An editor in Premiere, in sequences, transitions, titles, and colors. Critically, this is different from automating existing patterns, e.g. making it easy to create a website, simulate the visuals of film photography, or 3D-scan one's room. Instead, it's about building a playground in which those novel computational artifacts can be tinkered with and composed, via a grammar native to their own domain, to produce the fruits of the users' own vision.
The goal of the computational tool-maker then is not to teach the layman about recursion, abstraction, or composition, but to provide meaningful primitives (i.e. a system) with which the user can do real work. End-user programming is a red herring: We need to focus on materiality, what some disparage as mere "side effects." The goal is to enable others to feel the agency and power that comes when the world ceases to be immutable.
This feels strongly related to another quote about software as ideology / a system of metaphors that influence the way we assign value to digital actions and content.
I hope this mode can paint the picture of software, not as a teleological instrument careening towards automation and ease, but as a medium for intimacy with the matter of our time (images, audio, video), yielding a sense of agency with what, to most, feels like an indelible substrate.
Ask HN: Is there a site popular with Gen Z where users can write HTML and CSS? | Hacker News