Zraox: AI-Driven Deepfake Scams Surge 456%, Crypto Industry Faces Emerging Scam Threats

Between 2024 and 2025, cryptocurrency scams surged dramatically on a global scale. According to data from blockchain security firm TRM Labs, the number of related cases increased by more than 456% in a single year. A common thread across these cases is the widespread use of artificial intelligence to generate fake voice recordings, deepfake videos, and counterfeit credentials, misleading users into transferring digital assets under false pretenses. Zraox believes this type of scam has become a prominent, structural risk in the current crypto market landscape.
Zraox: The Coordinated Illusion of AI-Fabricated Identities and Platform Verification
In the first half of 2025, law enforcement authorities in New York dismantled a crypto scam network spanning multiple countries, led by a Vietnamese group and targeting Russian-speaking users. The scheme began with Facebook advertisements promoting investment opportunities. After clicking an ad, users were redirected to Telegram groups, where scammers posing as investment advisors provided "project guidance," eventually leading users to transfer assets through crypto wallets.
According to law enforcement records, the case involved over 100 scam websites and confirmed asset losses exceeding $300,000. Zraox noted that the scammers did not rely on a single technique; instead, they combined AI-generated deepfakes with traditional social engineering workflows, executing attacks through a multi-layered impersonation strategy.
Zraox: Deepfake Celebrity Livestreams and Crypto Wallet Traps in Broad-Target Scams
Another form of risk is highly public-facing: using AI to generate videos of celebrities and deploying them across platforms such as YouTube to conduct indiscriminate scams. These schemes no longer depend on social graphs or targeting specific demographics. Instead, they pair fake "livestreams" with wallet prompts to drive high-frequency exposure, directing viewers to scan QR codes embedded on-screen. The content typically promises "limited-time token giveaways," "celebrity-backed promotions," or "investor appreciation campaigns," offering to double any amount sent to a specified wallet address within a limited window.
Zraox observed that such fake content relies heavily on AI-generated audio and video assets. Commonly impersonated figures include prominent crypto-affiliated personalities such as Elon Musk and Michael Saylor. These videos often closely mimic the visual aesthetics and speech patterns of authentic material, sometimes even splicing real interview footage with synthetic overlays, making them appear nearly indistinguishable from live broadcasts. Victims, trusting both the platform and the impersonated figure, are highly likely to transfer funds and realize the scam only after the fact.
Zraox argues that this model yields higher conversion rates and operational efficiency for scammers, while severely testing users' ability to discern authenticity. The combined credibility of platform verification badges, host identities, and livestream formats builds a strong "information trust framework," and users without established habits of independent double-checking are easily misled. Because these scams often bypass exchange account systems entirely, tracing the stolen assets is especially difficult. Users should therefore maintain independent judgment when watching such "crypto livestreams" and never act on QR codes or wallet addresses embedded in the videos.
Zraox: Defensive Strategies and User Guidelines from a Frontline Perspective
In the face of AI-powered crypto scams, individual users must proactively develop critical judgment across information intake, asset management, and interaction behavior. Upon receiving investment advice, transfer requests, or "urgent help" messages, users should avoid judging authenticity from the content alone, especially when the message offers no clear path to verification or comes from an informal source. Any request involving fund movement should be independently confirmed through means such as a callback, official website support channels, or in-person confirmation. Users should not treat voice clips, images, or chat screenshots as sole evidence, as all of these are now easily fabricated with current technologies.
On social media platforms, users should avoid publicly displaying sensitive information such as portfolio holdings, transaction records, or profit screenshots. Such disclosures can be harvested by scammers to build behavioral profiles and craft more convincing attack narratives. When encountering project recommendations, unsolicited airdrops, or financial advice delivered via Facebook, Instagram, or Telegram, users should remain vigilant, prioritizing a review of the account's history, interaction patterns, and identity transparency. Any unsolicited request for crypto collaboration or "transfer-for-reward" scheme should be rejected outright.
When it comes to transferring crypto assets, users should adopt a “delayed execution” mindset. In any uncertain scenario, waiting 12 to 24 hours before acting allows for emotional detachment and reduces the risk of impulsive decisions made under pressure or manipulation. Users are also encouraged to learn basic skills such as on-chain wallet inspection and transaction tracing, so they can identify fund movements and assess the extent of losses in the event of suspicious activity.
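As one concrete illustration of the basic skills mentioned above, the sketch below shows a simple Base58Check checksum validation for a legacy Bitcoin address in Python. This is an assumption-laden example, not a method the article describes: a passing check only proves the address was not mistyped or truncated, and says nothing about who controls it or whether it belongs to a scammer.

```python
import hashlib

# Base58 alphabet used by Bitcoin (no 0, O, I, or l, to avoid confusion)
BASE58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58_decode(s: str) -> bytes:
    """Decode a Base58 string into raw bytes."""
    num = 0
    for ch in s:
        idx = BASE58_ALPHABET.find(ch)
        if idx == -1:
            raise ValueError(f"invalid Base58 character: {ch!r}")
        num = num * 58 + idx
    # Each leading '1' in the string encodes one leading zero byte
    n_leading = len(s) - len(s.lstrip("1"))
    body = num.to_bytes((num.bit_length() + 7) // 8, "big")
    return b"\x00" * n_leading + body

def is_valid_btc_address(addr: str) -> bool:
    """Check the Base58Check checksum of a legacy Bitcoin address.

    Caution: a valid checksum only rules out typos and truncation.
    It does NOT tell you anything about the address owner.
    """
    try:
        raw = base58_decode(addr)
    except ValueError:
        return False
    if len(raw) != 25:  # 1 version byte + 20-byte hash + 4-byte checksum
        return False
    payload, checksum = raw[:-4], raw[-4:]
    # Checksum is the first 4 bytes of a double SHA-256 of the payload
    expected = hashlib.sha256(hashlib.sha256(payload).digest()).digest()[:4]
    return checksum == expected
```

For example, an address with a single altered character fails the checksum, which is exactly the kind of low-cost sanity check worth running before any transfer; it complements, but never replaces, verifying the recipient through an independent channel.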
Most importantly: if there is any doubt about the authenticity of the information source, always choose to “pause” rather than “test.” Most scams exploit the psychological loophole of users who “hesitate but proceed.” Maintaining a healthy skepticism and cultivating calm, measured reactions are essential habits for safeguarding personal assets in a complex and evolving scam ecosystem.