AI Coding Tools Create New 'Slopsquatting' Security Threat
Researchers from three US universities discovered that AI coding assistants frequently recommend software packages that don't exist, a problem they've dubbed "slopsquatting." Attackers can exploit this by registering malicious packages under these hallucinated names, tricking developers into downloading compromised code.
The study tested 16 popular AI models and found none were immune. Commercial models hallucinated packages 5.2% of the time, while open-source models hit 21.7%. Out of 2.23 million generated packages, nearly 20% were fake.
This opens a dangerous supply-chain attack vector: a single compromised dependency can infect an entire software project. The researchers suggest better prompt engineering and model training to address the issue.
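The attack only succeeds if a hallucinated name reaches an install command unchecked. A minimal defensive sketch, assuming a project keeps an allowlist of vetted dependencies (the allowlist and package names below are hypothetical; a real project would derive the list from its lockfile or an internal registry):

```python
# Vet AI-suggested package names against an allowlist before they
# ever reach `pip install`. Names not on the list are flagged for
# manual review rather than installed blindly.

KNOWN_PACKAGES = {"requests", "numpy", "pandas", "flask"}  # hypothetical allowlist

def vet_suggestions(suggested):
    """Split suggested package names into approved and suspect lists."""
    approved = [name for name in suggested if name.lower() in KNOWN_PACKAGES]
    suspect = [name for name in suggested if name.lower() not in KNOWN_PACKAGES]
    return approved, suspect

# "fastjsonparse" is a made-up, plausible-sounding name of the kind
# the study describes models hallucinating.
approved, suspect = vet_suggestions(["requests", "fastjsonparse"])
print(approved)  # ['requests']
print(suspect)   # ['fastjsonparse']
```

Pinning dependencies in a lockfile and installing only from it achieves the same effect without custom tooling.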
Source: Security Week