In April 2026, hackers stole 4 TB of voice samples belonging to 40,000 AI contractors at Mercor, pairing clean voice recordings with government-issued IDs. That combination lets attackers build high-quality voice clones for bank fraud, romance scams, and workplace impersonation. Because a voiceprint, unlike a password, cannot truly be rotated, victims should ask their banks to disable or re-enroll voice authentication, agree on verbal codewords with family, and use forensic tools to vet suspicious audio that claims to come from known contacts.
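A static family codeword can itself leak after repeated use over tapped or recorded calls. As an illustrative hardening option (not part of the reporting above, and purely a sketch), parties sharing a secret could derive a short time-based verbal code, in the spirit of TOTP, so an overheard code expires within minutes. The secret, word list, and time step here are all hypothetical choices:

```python
import hmac
import hashlib
import struct
import time

# Small, easy-to-say word list (hypothetical; any agreed list works).
WORDS = ["apple", "birch", "cedar", "delta", "ember", "fjord", "grape", "harbor"]

def verbal_code(secret: bytes, t=None, step: int = 300) -> str:
    """Derive a two-word verbal code from a shared secret.

    The code changes every `step` seconds (default 5 minutes), so an
    attacker who overhears one call cannot replay the words later.
    """
    counter = int((time.time() if t is None else t) // step)
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha256).digest()
    # Pick two words using the first two digest bytes.
    return f"{WORDS[digest[0] % len(WORDS)]}-{WORDS[digest[1] % len(WORDS)]}"

# Both parties holding the same secret and a synced clock speak the same code:
print(verbal_code(b"family-shared-secret"))
```

Both sides compute the code independently and say it aloud at the start of a call; a cloned voice without the shared secret cannot produce the current pair of words.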