It’s Time to Stop Taking Sam Altman at His Word
Understand AI for what it is, not what it might become.
The technologies never quite work out like the Altmans of the world promise, but the stories keep regulators and regular people sidelined while the entrepreneurs, engineers, and investors build empires.
We’re in a race to the bottom that everyone saw coming and no one is happy with. Meanwhile, the search for product-market fit at a scale that would justify all the inflated tech-company valuations keeps coming up short. Even OpenAI’s latest release, o1, was accompanied by a caveat from Altman that “it still seems more impressive on first use than it does after you spend more time with it.”
The project of techno-optimism, for decades now, has been to insist that if we just have faith in technological progress and free the inventors and investors from pesky regulations such as copyright law and deceptive marketing, then the marketplace will work its magic and everyone will be better off.
Altman’s entire job is to keep us all fixated on an imagined AI future so we don’t get too caught up in the underwhelming details of the present.
A Camera, Not an Engine
Modern AI puts us firmly into an age of exploration of computational reality.
But why stop with datasets that induce languages with “grammars” that can be rendered legible to us? Could you make a “Large Solar Flares and Sunspots Model” (LSFASM) and learn to talk to the Sun and ask it where it might flare up next? How about a Large Oceanic Model that allows ships to talk to ocean currents? Or a Large History Model that works as a Prime Radiant for Asimovian psychohistory? Maybe a Large Climate Model constructed out of weather data can talk to us and supply strategies for climate change?
One reason it is hard is, once again, our tendency to mistake discoveries for inventions, or equivalently, cameras for engines. Instruments of discovery measure more than they are measured. Yes, there are a number of ways you can measure a telescope (mirror diameter or focal length for example), but the interesting measuring going on is what the telescope is doing to what it’s turned towards (the analogy to AI here is perhaps to things like floating-point precision — that’s closer to mirror diameter).