News from the AI & ML world

DeeperML

Pierluigi Paganini@Security Affairs //
Researchers have uncovered a new attack technique targeting AI code editors like GitHub Copilot and Cursor. Dubbed the "Rules File Backdoor," this method allows attackers to inject malicious code into AI-generated code, leading to silent compromise through a supply chain vulnerability. By manipulating the rules files that guide AI coding assistants, hackers can circumvent security checks and generate code that exposes sensitive information.

This involves embedding crafted prompts within seemingly benign rule files, causing the AI tool to generate code containing vulnerabilities or backdoors. The attackers can also use zero-width joiners, bidirectional text markers, and other invisible characters to conceal malicious instructions, tricking the AI into overriding ethical and safety constraints. Successful exploitation could expose database credentials, API keys, and other sensitive details.

GitHub and Cursor have emphasized that users are responsible for reviewing AI-generated code. Experts urge developers to carefully evaluate rules files for malicious injections, bolster examination of AI configuration files and AI-generated code, and leverage automated detection tools. Once a poisoned rule file is incorporated into a project repository, it affects all future code-generation sessions by team members and also survive project forking, creating a vector for supply chain attacks that can affect downstream dependencies and end users.
Original img attribution: https://securityaffairs.com/wp-content/uploads/2025/03/image-39.png
ImgSrc: securityaffairs

Share: bluesky twitterx--v2 facebook--v1 threads


References :
  • securityaffairs.com: Rules File Backdoor: AI Code Editors exploited for silent supply chain attacks
  • The Hacker News: New ‘Rules File Backdoor’ Attack Lets Hackers Inject Malicious Code via AI Code Editors
  • MSSP feed for Latest: Novel Attack Technique Weaponizes AI Code Editors
Classification:
  • HashTags: #AISecurity #SupplyChain #CodeEditors
  • Company: Microsoft
  • Target: AI code editor users
  • Attacker: Cybercriminals
  • Product: GitHub Copilot, Cursor
  • Feature: malicious code injection
  • Malware: Rules File Backdoor
  • Type: Hack
  • Severity: High