
AI Coding Tools Create New 'Slopsquatting' Security Threat

AI coding assistants often suggest non-existent software packages, posing security risks. Learn how this affects development.
Content Team

Researchers from three US universities discovered that AI coding assistants frequently recommend fake software packages that don't exist - a problem they've dubbed "slopsquatting." Hackers can exploit this by creating malicious packages with these hallucinated names, tricking developers into downloading compromised code.

The study tested 16 popular AI models and found none were immune. Commercial models hallucinated package names 5.2% of the time, while open-source models did so 21.7% of the time. Of the 2.23 million package names the models generated, nearly 20% did not exist.

This opens the door to supply chain attacks, in which a single compromised dependency can infect entire software projects. The researchers suggest better prompt engineering and model training as ways to reduce hallucinations.
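On the developer side, one simple defense is to check any AI-suggested dependency against your project's pinned requirements before installing it. The sketch below is a minimal, hypothetical illustration of that idea (the package name `fake-ml-helper` is invented for the example); it normalizes names using PyPI's PEP 503 rule so that spelling variants like `Fake_Pkg` and `fake-pkg` compare equal.

```python
import re

def canonicalize(name: str) -> str:
    """Normalize a package name per PEP 503 (PyPI's rule):
    lowercase, with runs of '-', '_', '.' collapsed to a single '-'."""
    return re.sub(r"[-_.]+", "-", name).lower()

def load_allowlist(requirements_text: str) -> set[str]:
    """Build an allowlist from the contents of a pinned requirements file."""
    allowed = set()
    for line in requirements_text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and blanks
        if not line:
            continue
        # keep only the name portion, before any version specifier or extras
        name = re.split(r"[<>=!~\[; ]", line, maxsplit=1)[0]
        allowed.add(canonicalize(name))
    return allowed

def is_allowlisted(package: str, allowlist: set[str]) -> bool:
    """True if an AI-suggested package is already a known dependency."""
    return canonicalize(package) in allowlist

# Hypothetical usage: vet two suggestions against a pinned requirements file.
reqs = "requests==2.31.0\nnumpy>=1.26  # pinned\n"
allow = load_allowlist(reqs)
print(is_allowlisted("Requests", allow))        # known dependency
print(is_allowlisted("fake-ml-helper", allow))  # unknown: verify before installing
```

A check like this does not prove a package is safe, but it flags any name that is not already part of the project so a human can verify it exists and is legitimate before it reaches `pip install`.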

Source: Security Week
