- GenAI can hallucinate open source package names, experts warn
- The same hallucinated names often recur rather than changing every time
- Cybercriminals could register malicious packages under those names
Security researchers have warned of a new method by which Generative AI (GenAI) can be abused in cybercrime, known as 'slopsquatting'.
It starts with the fact that GenAI tools such as ChatGPT, Copilot, and others hallucinate. In the context of AI, “hallucination” is when the model simply makes things up. It can invent a quote that a person never said, an event that never happened, or, in software development, an open source software package that was never created.
Now, according to Sarah Gooding from Socket, many software developers rely heavily on GenAI when writing code. The tool could write the lines itself, or it could suggest packages for the developer to download and include in the product.
Hallucinating malware
The report adds that the AI doesn’t always hallucinate a different name or a different package each time; some of them repeat.
“When re-running the same hallucination-triggering prompt ten times, 43% of hallucinated packages were repeated every time, while 39% never reappeared at all,” it says.
“Overall, 58% of hallucinated packages were repeated more than once across ten runs, indicating that a majority of hallucinations are not just random noise, but repeatable artifacts of how the models respond to certain prompts.”
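To make the repeatability finding concrete, here is a minimal sketch of how such an experiment might be scripted in Python. The `suggest_packages` function is a hypothetical placeholder for whatever GenAI call a researcher would use; the counting logic simply mirrors the kind of statistics quoted above and does not reproduce Socket's methodology.

```python
from collections import Counter

def suggest_packages(prompt: str) -> set[str]:
    """Hypothetical stand-in for a GenAI coding assistant call that
    returns the set of package names it recommends for a prompt."""
    raise NotImplementedError("replace with a real model call")

def measure_repeatability(prompt: str, runs: int = 10) -> None:
    # Re-run the same prompt several times and collect the suggested names.
    per_run = [suggest_packages(prompt) for _ in range(runs)]

    # Count how many runs each name appeared in (sets, so at most once per run).
    counts = Counter(name for names in per_run for name in names)
    total = len(counts)
    if not total:
        print("no package names suggested")
        return

    every_time = sum(1 for c in counts.values() if c == runs)
    repeated = sum(1 for c in counts.values() if c > 1)

    print(f"{every_time / total:.0%} of names appeared in every run")
    print(f"{repeated / total:.0%} of names appeared more than once")
```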
This is purely theoretical at this point, but apparently, cybercriminals could map out the packages AI tools are hallucinating and register those names on open source platforms.
Therefore, when a developer gets a suggestion and visits GitHub, PyPI, or a similar repository, they will find the package and happily install it, not knowing that it is malicious.
Luckily, there were no confirmed cases of slopsquatting in the wild at press time, but it’s safe to say it is only a matter of time. Given that the hallucinated names can be mapped out, we can assume security researchers will eventually spot such packages in the wild.
The best way to protect against these attacks is to be careful when accepting suggestions from anyone, living or otherwise, and to verify that a suggested package actually exists and has a credible history before installing it.
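As one illustration of what that care could look like in practice, the sketch below uses PyPI’s public JSON endpoint to check that a suggested package actually exists and has more than a handful of releases before anything is installed. The release-count threshold is an illustrative assumption, not a rule from the report.

```python
import requests

def vet_pypi_package(name: str) -> bool:
    """Rough sanity check before running `pip install <name>`.

    A real vetting process would also review maintainers, download
    counts, and the package source; this only catches the obvious cases.
    """
    resp = requests.get(f"https://pypi.org/pypi/{name}/json", timeout=10)
    if resp.status_code == 404:
        print(f"{name}: not on PyPI - possibly a hallucinated name")
        return False
    resp.raise_for_status()

    releases = resp.json().get("releases", {})
    if len(releases) < 3:  # illustrative threshold, not an official rule
        print(f"{name}: only {len(releases)} release(s) - inspect it manually first")
        return False

    print(f"{name}: exists on PyPI with {len(releases)} releases")
    return True

# Example: vet_pypi_package("requests")  # long-established, should pass
```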
You might also like
- What are AI Hallucinations? When AI goes wrong
- Take a look at our guide to the best authenticator app
- We've rounded up the best password managers