The Vogue Archive - Google Arts & Culture
Ego, Fear and Money: How the A.I. Fuse Was Lit
Why Did I Leave Google Or, Why Did I Stay So Long? - LinkedIn
If I had to summarize it, I would say that the signal-to-noise ratio is what wore me down. We start companies to build products that serve people, not to sit in meetings with lawyers. To be successful in Corp-Tech, you need to be able to answer the "what have I done for our users today?" question with "not much, but I got promoted" and be happy with that answer.
Being part of a corporation means that the signal-to-noise ratio changes dramatically. The amount of time and effort spent on Legal, Policy, and Privacy, on features that have not even shipped to users yet, meant a significant waste of resources and focus. After the acquisition, we had an extremely long project that consumed many of our best engineers to align our data retention policies and tools with Google's. I am not saying this is unimportant, BUT it had zero value to our users. An ever-increasing percentage of our time went to tasks that created no user value, and that quickly changes the DNA of the company from customer-focused to corporate-guidelines-focused.
The salaries are so high and the options so valuable that they create many misalignments. The impact of any individual product on the Corp-Tech stock price is minimal, so equity is basically free money. Regardless of your individual performance or your product's performance, your equity grows significantly, so nothing you do has a real economic impact on your family. The only lever you have to increase your economic returns is whether you get promoted, since that drives your equity and salary. This breaks the traditional tech model of risk and reward.
What I learned getting acquired by Google
While there were undoubtedly people who came in for the food, worked 3 hours a day, and enjoyed their early retirements, all the people I met were earnest, hard-working, and wanted to do great work.
What beat them down was the gauntlet of reviews, the frequent re-orgs, the institutional scar tissue from past failures, and the complexity of doing even simple things on the world stage. Startups can afford to ignore many concerns; Googlers rarely can.
What also got in the way were the people themselves: all the smart people who could argue against anything but not for something, all the leaders who lacked the courage to speak the uncomfortable truth, and all the people who were hired without a clear project to work on but still had to be retained through promotion-worthy made-up work.
Another blocker to progress that I saw up close was the imbalance of a top heavy team. A team with multiple successful co-founders and 10-20 year Google veterans might sound like a recipe for great things, but it’s also a recipe for gridlock.
This structure might work if there are multiple areas to explore, clear goals, and strong autonomy to pursue those paths.
Good teams regularly pay down debt by cleaning things up on quieter days.
Just as real is process debt. A review added because of a launch gone wrong. A new legal check to guard against possible litigation. A section added to a document template. Layers accumulate over the years until you find yourself unable to release a new feature for months after it is ready, because it is stuck between reviews with no clear path out.
Fake It ’Til You Fake It
On the long history of photo manipulation, dating back to the origins of photography: while new technologies have made manipulation much easier, the core questions around trust and authenticity remain the same and have been asked for over a century.
The criticisms I have been seeing about the features of the Pixel 8, however, feel like we are only repeating fears that are nearly two hundred years old. We have not been able to wholly trust photographs pretty much since they were invented. The only things that have changed in that time are the ease with which manipulations can happen and how widely available they are.
We all live with a growing sense that everything around us is fraudulent. It is striking to me how these tools have been introduced just as confidence in institutions has declined. It feels like a death spiral of trust: not only are we expected to separate facts from their potentially misleading context, we increasingly doubt that any experts are able to help us, yet we keep inventing new ways to distort reality.
The questions being asked of the Pixel 8's image manipulation capabilities are good and necessary, because there are real ethical implications. But I think they need to be more fully contextualized. There is a long trail of exactly the same concerns and, to avoid repeating ourselves yet again, we should be asking these questions with that history in mind. If this era feels different, I think we should be asking more precisely why that is.
The questions we ask about generative technologies should acknowledge that we already have plenty of ways to lie, and that much of the information we see is suspect. That does not mean we should not believe anything, but it does mean we ought to be asking what changes when tools like these become more widespread and easier to use.
How Google Docs Proved the Power of Less | WIRED