
"The repository reached the #1 trending position on Hugging Face within 18 hours, highlighting how public AI repositories are becoming a new software supply chain attack vector. A malicious Hugging Face repository posing as an OpenAI release delivered infostealer malware to Windows systems and logged 244,000 downloads before being removed, raising fresh concerns about how enterprises source and validate AI models from public repositories."
"The repository, named Open-OSS/privacy-filter, impersonated OpenAI's legitimate Privacy Filter release, copied its model card almost word-for-word, and included a malicious loader.py file that fetched and executed credential-stealing malware on Windows hosts, AI security firm HiddenLayer said in a research advisory. "The repository reached the #1 trending position on Hugging Face with approximately 244K downloads and 667 likes in under 18 hours, numbers that were almost certainly artificially inflated to make the repository appear legitimate," the advisory added."
"The README accompanying the fake model diverged from the legitimate project in one key area, instructing users to run start.bat on Windows or execute python loader.py on Linux and macOS. Researchers have previously found malicious code hidden inside Pickle-serialised model files on Hugging Face that bypassed the platform's scanners. They have also warned that the AI supply chain is lagging behind traditional software in oversight and tooling."
Read at InfoWorld