
"Slopsquatting is an attack method in which hackers exploit common AI hallucinations to trick engineers into mistakenly installing malicious packages. In short, hackers track non-existent packages hallucinated by AI coding tools and then publish malicious packages under these names on public repositories such as . The seemingly legitimate packages are then installed by victims who trust their AI code suggestions."
"I would say an old school human coder, especially in the open source world where I've spent my entire career, every line of code was typically reviewed and approved by a human maintainer. And now the ability of an AI to generate literally gigabytes of code, millions, if not billions of lines of code, starts to put this out of the reach of even some of the most prolific maintainers."
Slopsquatting leverages AI-generated hallucinations by monitoring and registering package names that coding assistants invent, then publishing malicious packages under those names to public repositories. Developers who accept AI suggestions can unintentionally install these seemingly legitimate packages, introducing malware and supply-chain risks. The technique adapts long-standing typosquatting tactics to the AI era and exploits easy package registration in ecosystems like Python and Java. Rapid, large-scale AI code generation reduces human review, enabling malicious packages to proliferate and increasing the potential impact on business-critical systems and open source software maintenance.
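Because the attack succeeds the moment an unverified AI-suggested name is installed, one common defensive pattern is to screen dependency names against a curated allowlist before running the package manager. The sketch below is a minimal, hypothetical illustration of that idea; the allowlist contents, function name, and the example hallucinated package name are all assumptions, not part of the reporting above.

```python
# Hypothetical pre-install screen: flag dependency names that are not on a
# curated allowlist, so hallucinated or unknown names are surfaced for
# review before `pip install` ever runs. Allowlist contents are illustrative.
ALLOWED_PACKAGES = {"requests", "numpy", "flask"}

def screen_dependencies(requested, allowed=ALLOWED_PACKAGES):
    """Return the sorted list of requested names absent from the allowlist."""
    return sorted(set(requested) - set(allowed))

# "flask-graphql-auth-helper" stands in for a plausible-looking
# hallucinated name; any name returned here should be vetted by a human
# before installation.
suspicious = screen_dependencies(["requests", "flask-graphql-auth-helper"])
print(suspicious)
```

An allowlist is deliberately conservative: unlike a blocklist, it fails closed when an AI assistant invents a name no one has seen before, which is exactly the gap slopsquatting exploits.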
Read at IT Pro