AI Sucks


65 bookmarks
I need AI
I need AI to waste energy. I need it to deprive vulnerable communities of water so that it can be used to cool new data centers. I need AI to make up answers to my questions.
·coryd.dev·
The Internet Is Full of AI Dogshit - Aftermath
The Internet used to be so simple to use that people collectively coined the term “let me Google that for you” to make fun of people who had the audacity of asking other people questions online. In the future I fear that people will have no other choice but to ask people for information from the Internet, because right now it’s all full of AI dogshit.
The people who hold the purse strings for Sports Illustrated are more interested in gaming Google search results and the resultant ad revenue from that practice than actually serving their readers.
·aftermath.site·
Google Researchers’ Attack Prompts ChatGPT to Reveal Its Training Data
ChatGPT is full of sensitive private information and spits out verbatim text from CNN, Goodreads, WordPress blogs, fandom wikis, Terms of Service agreements, Stack Overflow source code, Wikipedia pages, news blogs, random internet comments, and much more.
This paper should serve as yet another reminder that the world’s most important and most valuable AI company has been built on the backs of the collective work of humanity, often without permission, and without compensation to those who created it.
·404media.co·
Losing the imitation game
AI cannot develop software for you, but that's not going to stop people from trying to make it happen anyway. And that is going to turn all of the easy software development problems into hard problems.
The relationships between these tokens span a large number of parameters. In fact, that's much of what's being referenced when we call a model large. Those parameters represent grammar rules, stylistic patterns, and literally millions of other things. What those parameters don't represent is anything like knowledge or understanding. That's just not what LLMs do. The model doesn't know what those tokens mean. I want to say it only knows how they're used, but even that is overstating the case, because it doesn't know things. It models how those tokens are used.
The fundamental task of software development is not writing out the syntax that will execute a program. The task is to build a mental model of that complex system, make sense of it, and manage it over time.
_Writing_ code was never the problem. Reading it, understanding it, and knowing how to change it are the problems. All the LLMs have done is automate away the easy part and turn it into the hard part.
The hard part of programming is building and maintaining a useful mental model of a complex system. The easy part is writing code. They're positioning this tool as a universal solution, but it's only capable of doing the easy part. And even then, it's not able to do that part reliably. Human engineers will still have to evaluate and review the code that an AI writes. But they'll now have to do it without the benefit of having anyone who understands it. No one can explain it. No one can explain what they were thinking when they wrote it.
Moderating the output of these models depends on armies of low-paid and precariously employed human reviewers, mostly in Kenya. They're subjected to the raw, unfiltered linguistic sewage that results from training a language model on uncurated text found on the public internet. If ChatGPT doesn't wantonly repeat the very worst of the things you can find on Reddit, 4chan, or Kiwi Farms, that's because it's being dumped on Kenyan gig workers instead.
·jenniferplusplus.com·