AI bots hallucinate software packages and devs download them
Not only that, but someone, having spotted this recurring hallucination, had turned that made-up dependency into a real one, which was subsequently downloaded and installed thousands of times by developers as a result of the AI's bad advice, we've learned. Had the package been laced with actual malware, rather than being a benign test, the results could have been disastrous.
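The practical defence is to verify that any dependency an AI assistant suggests is actually registered on the official index before installing it. A minimal sketch of such a check, assuming Python and PyPI's public JSON API (the package names passed on the command line are whatever your assistant suggested, and are hypothetical here):

```python
import sys
import urllib.error
import urllib.request

# PyPI's public JSON API: returns 200 for registered packages, 404 otherwise.
PYPI_URL = "https://pypi.org/pypi/{name}/json"


def exists_on_pypi(name: str) -> bool:
    """Return True if a package with this exact name is registered on PyPI."""
    try:
        with urllib.request.urlopen(PYPI_URL.format(name=name), timeout=10) as resp:
            return resp.status == 200
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # no such package -- a squatter could still claim the name
        raise


if __name__ == "__main__":
    # Usage: python check_pkg.py <package-name> [<package-name> ...]
    for pkg in sys.argv[1:]:
        verdict = "registered on PyPI" if exists_on_pypi(pkg) else "NOT registered"
        print(f"{pkg}: {verdict}")
```

A registered name is not automatically safe, of course; the incident above involved a name that *was* registered, just not by anyone trustworthy, so the check only flags suggestions that point at nothing at all and still leaves reviewing the package itself to the developer.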
Lasso Security's recent research on AI package hallucinations extends the attack technique to GPT-3.5-Turbo, GPT-4, Gemini Pro (formerly Bard), and Coral (Cohere).