GoldDigger Malware Using Deep Fake AI Photos To Hijack Bank Accounts

Hackers use deepfake AI photos to impersonate individuals online, enabling them to deceive victims, manipulate targets, and gain unauthorized access to sensitive information and systems.

Cybersecurity researchers at Infoblox recently analyzed GoldFamily, an evolved variant of the GoldDigger trojan that targets iOS devices to steal facial recognition data and bank access, using AI-driven deepfakes for biometric authentication attacks.

As a proactive defense, Infoblox's DNS Early Detection Program identifies potentially malicious domains before they appear on conventional threat feeds, enabling defenders to block them early and stop attacks before the kill chain unfolds.

GoldDigger Malware & Deep Fake

GoldFamily is an advanced version of GoldDigger that tricks victims into handing over facial recognition data and personal identification documents. It targets both Android users (via malicious app installs) and iOS users (via an MDM profile install that directs them to a TestFlight URL). Once installed, it steals facial data, intercepts SMS messages, requests images of ID documents, and acts as a network proxy.
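Early domain blocking of the kind described above can be illustrated with a minimal sketch: check each queried domain (and its parent domains) against a locally cached early-detection feed before the query is resolved. The blocklist contents and function names here are illustrative assumptions, not Infoblox's actual implementation.

```python
# Minimal sketch of DNS-level blocking against an early-detection feed.
# The domains and lookup logic are illustrative assumptions only.

# Hypothetical feed of domains flagged before mainstream threat feeds list them.
EARLY_DETECTION_BLOCKLIST = {
    "malicious-example.test",
    "fake-bank-login.test",
}

def is_blocked(domain: str) -> bool:
    """Return True if the domain or any parent domain is on the blocklist."""
    labels = domain.lower().rstrip(".").split(".")
    # Check the domain itself and every parent (e.g. a.b.c -> a.b.c, b.c, c),
    # so subdomains of a flagged domain are also caught.
    for i in range(len(labels)):
        if ".".join(labels[i:]) in EARLY_DETECTION_BLOCKLIST:
            return True
    return False

print(is_blocked("login.fake-bank-login.test"))  # True: subdomain of a flagged domain
print(is_blocked("example.org"))                 # False: not on the feed
```

In practice this lookup would sit inside a resolver (e.g. as a response policy zone) rather than application code, but the parent-domain walk shown here is the core idea.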

Source: GBHackers
