
    ReversingLabs Report: Malicious Machine Learning Models Bypass Security on Hugging Face

    By EchoCraft AI | February 10, 2025

    Hugging Face, the widely used platform for hosting AI and machine learning models, recently faced cybersecurity concerns after researchers discovered hosted models embedded with malicious code.

    A report by cybersecurity firm ReversingLabs highlighted that attackers exploited gaps in the platform’s security scanning to distribute harmful code, potentially affecting developers and organizations relying on open-source AI solutions.

    Exploitation via Pickle File Serialization

    According to ReversingLabs, the malicious models leveraged Pickle file serialization, a method that enables Python code execution during model loading.

    While this serialization technique is efficient, it is often flagged as insecure because it allows arbitrary code execution, making it susceptible to misuse in open-source environments.
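    To make the risk concrete, here is a minimal, self-contained sketch of why Pickle deserialization is considered unsafe. The payload below is a benign echo command standing in for attacker-controlled code; it illustrates the general technique, not the actual payload described in the report.

```python
import os
import pickle


class Payload:
    """Any class can dictate what runs at unpickling time via __reduce__."""

    def __reduce__(self):
        # During deserialization, pickle calls os.system with this argument.
        # A harmless echo stands in for arbitrary attacker-controlled code.
        return (os.system, ("echo 'code executed during unpickling'",))


malicious_bytes = pickle.dumps(Payload())

# Merely loading the bytes runs the embedded command; no attribute access
# or method call on the resulting object is needed.
pickle.loads(malicious_bytes)
```

    In practice, this means loading a Pickle-based model file is equivalent to running code written by whoever produced the file.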

    Because Hugging Face gives the community broad access to publish and download ML models, attackers were able to upload malware-laden files that initially evaded detection by the platform’s security tools.

    How the Attack Worked

    The exploit involved compressing models using the 7z format, which interfered with Hugging Face’s Picklescan security tool.

    Typically, models stored in the PyTorch format use ZIP compression, allowing Picklescan to scan for harmful content. However, the use of the less common 7z format rendered these scans ineffective.
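    The format mismatch can be shown with a small sketch. It assumes only that a scanner’s ZIP-handling path relies on a check like Python’s zipfile.is_zipfile; it is not Picklescan’s actual implementation, and the in-memory “checkpoints” below are stand-ins.

```python
import io
import zipfile

# A PyTorch checkpoint written by torch.save() is normally a standard ZIP
# archive, so a ZIP-aware scanner can enumerate and inspect the pickled
# members inside it.
zip_checkpoint = io.BytesIO()
with zipfile.ZipFile(zip_checkpoint, "w") as archive:
    archive.writestr("archive/data.pkl", b"...serialized model data...")

# A 7z archive starts with a different magic number and is not a ZIP file,
# so a ZIP-only scanning path finds nothing it can open.
sevenz_checkpoint = io.BytesIO(b"7z\xbc\xaf\x27\x1c" + b"...opaque 7z payload...")

for name, buf in (("ZIP-style checkpoint", zip_checkpoint),
                  ("7z-style checkpoint", sevenz_checkpoint)):
    buf.seek(0)
    status = "scannable as ZIP" if zipfile.is_zipfile(buf) else "skipped by a ZIP-only scanner"
    print(f"{name}: {status}")
```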

    ReversingLabs described the exploitation technique as “nullifAI,” emphasizing its ability to bypass existing security protocols. The cybersecurity firm warned that developers who downloaded the compromised models might have unknowingly introduced malware into their systems.

    Technical Observations

    • Bypassing Security Tools: Because the PyTorch models were compressed with 7z rather than the expected ZIP container, Picklescan could not inspect them for malicious payloads.
    • Broken Pickle Files: The Pickle streams were deliberately “broken,” so deserialization aborted shortly after the malicious code had already run, which helped the payloads slip past security checks (see the sketch after this list).
    • Malicious Payloads: The embedded code reportedly enabled reverse shell connections, allowing attackers to establish communication with affected systems and execute commands remotely.
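
    The “broken” Pickle observation can be reproduced with a harmless sketch: if the stream is truncated after the opcode that triggers the payload but before the terminating STOP opcode, loading fails with an error, yet the payload has already executed. The print call below stands in for the malicious code described in the report.

```python
import pickle


class EarlyPayload:
    def __reduce__(self):
        # Runs as soon as the unpickler reaches the REDUCE opcode, well
        # before the end-of-stream STOP opcode is ever checked.
        return (print, ("payload ran before the stream broke",))


stream = pickle.dumps(EarlyPayload())

# Drop the trailing STOP opcode so the stream is "broken".
broken_stream = stream[:-1]

try:
    pickle.loads(broken_stream)
except Exception as exc:
    # Deserialization aborts, but only after the payload has executed.
    print(f"loads() failed ({exc!r}), yet the payload above already ran")
```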

    Hugging Face’s Response

    Upon receiving a report from ReversingLabs on January 20, Hugging Face acted swiftly to address the issue.

    The malicious models were removed within 24 hours. Additionally, the platform updated its Picklescan tool to better identify threats, including those present in compressed or broken Pickle files.

    Security Recommendations

    This incident highlights the challenges of maintaining security on collaborative open-source platforms. ReversingLabs urged developers to be cautious when downloading third-party models and to adopt safer practices, including the following (a sketch of one such safeguard appears after the list):

    • Avoiding reliance on inherently insecure serialization formats like Pickle.
    • Keeping custom loading functions separate from serialized model data.
    • Regularly reviewing and documenting loading procedures.
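
    The article does not name specific tooling, but one way to constrain what deserialization may execute, in line with the recommendations above, is PyTorch’s weights_only=True option for torch.load, which restricts unpickling to tensor data and a small allowlist of types. The checkpoint path below is hypothetical, and this is a sketch of the idea rather than a complete hardening strategy.

```python
import torch

# Hypothetical path to a model downloaded from a third-party source.
checkpoint_path = "third_party_model.pt"

try:
    # weights_only=True confines deserialization to tensors and a small
    # allowlist of safe types instead of executing arbitrary Pickle code.
    state_dict = torch.load(checkpoint_path, map_location="cpu", weights_only=True)
except Exception as exc:
    # Anything outside the allowlist (including injected callables) is
    # rejected here rather than silently executed.
    raise RuntimeError(f"Refusing to load untrusted checkpoint: {exc}") from exc

# model.load_state_dict(state_dict)  # apply to a locally defined architecture
```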

    Frequently Asked Questions

    What is Pickle file serialization and why is it risky?

    Pickle file serialization is a Python method for converting objects into byte streams. Its risk lies in its ability to execute arbitrary code during deserialization, which attackers can exploit by embedding malicious payloads.

    How did attackers bypass Hugging Face’s security tools?

    Attackers compressed the malicious ML models with the 7z format instead of the typical ZIP container. This unconventional format prevented Picklescan from detecting the harmful code embedded within the models.

    What measures has Hugging Face taken to secure the platform?

    Once the malicious models were discovered, Hugging Face swiftly removed them and updated its Picklescan tool to better detect threats, including those hidden in compressed or broken Pickle files.

    What precautions should developers take when using third-party ML models?

    Developers are advised to verify the integrity and provenance of models, avoid relying solely on Pickle serialization, implement custom loading functions with strict validation, and monitor for any unusual behavior when integrating third-party models.
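
    As a simple illustration of the integrity-checking advice, the sketch below compares a downloaded file’s SHA-256 digest against a value published by the model’s maintainer before the file is ever deserialized. Both the file name and the expected digest are placeholders.

```python
import hashlib
from pathlib import Path

MODEL_PATH = Path("downloaded_model.pt")                        # placeholder file name
EXPECTED_SHA256 = "<digest published by the model maintainer>"  # placeholder value


def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large checkpoints do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if sha256_of(MODEL_PATH) != EXPECTED_SHA256:
    raise SystemExit("Digest mismatch: do not load this model file.")
```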

    What is a reverse shell and why is it significant here?

    A reverse shell is a technique that allows an attacker to gain remote access to a compromised system by initiating an outbound connection. In this context, it enabled attackers to execute commands on affected systems, emphasizing the severe security risks involved.
