ylai@lemmy.ml to AI Infosec@infosec.pub · English · 9 months ago
AI hallucinates software packages and devs download them – even if potentially poisoned with malware (www.theregister.com)
cross-posted to: technology@lemmy.world, cybersecurity@infosec.pub, opensource@lemmy.ml, technology@beehaw.org, technology@lemmy.zip, artificial_intel@lemmy.ml
Syd@lemm.ee · 9 months ago
So could a bad actor train LLMs to inject malware into code in a way that wouldn't be easily caught?
BlazeDaley@lemmy.world · 9 months ago
Yes.
https://www.anthropic.com/news/sleeper-agents-training-deceptive-llms-that-persist-through-safety-training
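For context on the attack the article describes: a model hallucinates a plausible-sounding package name, an attacker registers that name on a public registry with a malicious payload, and developers install it on the model's say-so. A minimal defensive sketch is below, assuming Python and the public PyPI JSON API (https://pypi.org/pypi/<name>/json, which returns 404 for unregistered names); the package names on the command line are whatever you were about to install. This only catches names that don't resolve at all. A name that *does* exist can still be a freshly squatted hallucination, which is the article's point, so existence is a necessary check, not a safety guarantee.

```python
import json
import sys
import urllib.error
import urllib.request

PYPI_JSON = "https://pypi.org/pypi/{name}/json"


def package_exists(name: str) -> bool:
    """Return True if `name` is a registered project on PyPI."""
    try:
        with urllib.request.urlopen(PYPI_JSON.format(name=name), timeout=10) as resp:
            json.load(resp)  # parse the body to confirm it's a real metadata payload
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False  # unregistered name -- possibly hallucinated by a model
        raise  # other HTTP errors (rate limiting, outages) shouldn't read as "safe"


if __name__ == "__main__":
    # Usage: python check_pkgs.py <package> [<package> ...]
    for name in sys.argv[1:]:
        verdict = "found on PyPI" if package_exists(name) else "NOT on PyPI (possibly hallucinated)"
        print(f"{name}: {verdict}")
```

For names that do exist, the same JSON response carries maintainer, release history, and upload dates, so a natural extension is to flag very young packages with a single release before trusting an AI-suggested dependency.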