Close Menu

    Subscribe to Updates

    Get the latest creative news from FooBar about art, design and business.

    What's Hot

    Papablic Launches World’s First Portable Bottle Warmer with Built-in Sterilization

    HotSpot Therapeutics Presents Preclinical Data from Small Molecule CBM Signalosome Inhibitor Program at ESMO Gastrointestinal Cancers Congress 2025

    Angels vs. Blue Jays Highlights | MLB on FOX

    Facebook X (Twitter) Instagram
    Facebook X (Twitter) Instagram Pinterest VKontakte
    Sg Latest NewsSg Latest News
    • Home
    • Politics
    • Business
    • Technology
    • Entertainment
    • Health
    • Sports
    Sg Latest NewsSg Latest News
    Home»Technology»Researchers cause GitLab AI developer assistant to turn safe code malicious
    Technology

    Researchers cause GitLab AI developer assistant to turn safe code malicious

    AdminBy AdminNo Comments2 Mins Read
    Facebook Twitter Pinterest LinkedIn Tumblr Email
    Share
    Facebook Twitter LinkedIn Pinterest Email



    Marketers promote AI-assisted developer tools as workhorses that are essential for today’s software engineer. Developer platform GitLab, for instance, claims its Duo chatbot can “instantly generate a to-do list” that eliminates the burden of “wading through weeks of commits.” What these companies don’t say is that these tools are, by temperament if not default, easily tricked by malicious actors into performing hostile actions against their users.

    Researchers from security firm Legit on Thursday demonstrated an attack that induced Duo into inserting malicious code into a script it had been instructed to write. The attack could also leak private code and confidential issue data, such as zero-day vulnerability details. All that’s required is for the user to instruct the chatbot to interact with a merge request or similar content from an outside source.

    AI assistants’ double-edged blade

    The mechanism for triggering the attacks is, of course, prompt injections. Among the most common forms of chatbot exploits, prompt injections are embedded into content a chatbot is asked to work with, such as an email to be answered, a calendar to consult, or a webpage to summarize. Large language model-based assistants are so eager to follow instructions that they’ll take orders from just about anywhere, including sources that can be controlled by malicious actors.

    The attacks targeting Duo came from various resources that are commonly used by developers. Examples include merge requests, commits, bug descriptions and comments, and source code. The researchers demonstrated how instructions embedded inside these sources can lead Duo astray.

    “This vulnerability highlights the double-edged nature of AI assistants like GitLab Duo: when deeply integrated into development workflows, they inherit not just context—but risk,” Legit researcher Omer Mayraz wrote. “By embedding hidden instructions in seemingly harmless project content, we were able to manipulate Duo’s behavior, exfiltrate private source code, and demonstrate how AI responses can be leveraged for unintended and harmful outcomes.”

    Share. Facebook Twitter Pinterest LinkedIn Tumblr Email
    Admin
    • Website

    Related Posts

    Football Fans In The UK Will Be Able To Watch Every Match Of This Summer’s FIFA Club World Cup FREE On DAZN

    Draft proposal looks to put EHR reform measures back on the table

    Airbus’ HTeaming gives helicopter crews in-flight UAS control   

    Get ready for watchOS 26 with $100 off a brand new Apple Watch Series 10

    Add A Comment
    Leave A Reply Cancel Reply

    Editors Picks

    Microsoft’s Singapore office neither confirms nor denies local layoffs following global job cuts announcement

    Google reveals “material 3 expressive” design – Research Snipers

    Trump’s fast-tracked deal for a copper mine heightens existential fight for Apache

    Top Reviews
    9.1

    Review: Mi 10 Mobile with Qualcomm Snapdragon 870 Mobile Platform

    By Admin
    8.9

    Comparison of Mobile Phone Providers: 4G Connectivity & Speed

    By Admin
    8.9

    Which LED Lights for Nail Salon Safe? Comparison of Major Brands

    By Admin
    Sg Latest News
    Facebook X (Twitter) Instagram Pinterest Vimeo YouTube
    • Get In Touch
    © 2025 SglatestNews. All rights reserved.

    Type above and press Enter to search. Press Esc to cancel.