The Crypto Crime Report: How AI-Driven Scams Are Evolving Faster Than Security
Cryptocurrency fraud has entered a new era. It is no longer about crude phishing emails with broken grammar or obvious fake websites. Today, criminals deploy artificial intelligence to clone voices, manufacture convincing personas, and run automated fraud pipelines that can target thousands of victims simultaneously. The results are staggering.
According to the Chainalysis 2026 Crypto Crime Report, cryptocurrency scams received at least $14 billion on-chain in 2025. Chainalysis projects that figure could exceed $17 billion as more illicit wallet addresses are identified. That represents a sharp jump from the $9.9 billion recorded in 2024. Furthermore, impersonation scams alone grew by 1,400% year over year.
Perhaps most alarming is the efficiency gap. AI-enabled scams proved 4.5 times more profitable than traditional methods, according to CoinDesk’s analysis of the Chainalysis findings. Fraudsters no longer need to be technically skilled. Instead, they purchase ready-made scam kits, rent AI voice-cloning tools, and follow professionally written scripts. The barrier to entry for crypto crime has never been lower.
This post breaks down how these AI-driven scams actually work, who is being targeted, and crucially, what you can do to protect yourself and your assets in 2026.
The Scale of the Problem: $17 Billion and Rising
Numbers alone rarely convey the human cost of financial crime. Behind every stolen bitcoin is a real person who lost savings, retirement funds, or money they trusted a supposed professional to manage. The aggregate figures in the 2026 crypto crime landscape are therefore worth pausing on.
Chainalysis estimates that 2025 on-chain scam receipts reached at least $14 billion. Based on their historical methodology, which accounts for newly identified illicit wallet addresses discovered after initial reporting, the final figure is projected to exceed $17 billion. For context, that would make crypto scam losses in 2025 larger than the GDP of many small nations.
Meanwhile, traditional cyberattacks, such as exchange hacks and smart contract exploits, accounted for roughly $2.2 billion in stolen funds during 2024. Scams now substantially outpace hacks as the primary driver of crypto losses. This matters because it fundamentally shifts where the weakest point in the system lies. Code vulnerabilities can be patched. Human trust, unfortunately, cannot.
The growth trajectory shows no sign of reversing. Each year, scammers deploy more sophisticated tools, build more convincing fraudulent identities, and tap into an ever-larger pool of cryptocurrency holders. As adoption grows, so does the attack surface.
| Year | Estimated Crypto Scam Losses | Dominant Scam Type | Notes |
|------|------------------------------|--------------------|-------|
| 2022 | ~$5.9 billion | Pig butchering / romance scams | Post-bull-market peak |
| 2023 | ~$4.6 billion | Investment fraud / rug pulls | Bear market; lower activity |
| 2024 | $9.9B (revised to $12B) | Phishing / pig butchering | Recovery phase; AI tools emerging |
| 2025 | $14B+ (projected $17B+) | AI impersonation / deepfake scams | AI tooling fully industrialised |
Source: Chainalysis 2026 Crypto Crime Report [1]. Past data is subject to revision as new illicit addresses are identified.
AI-Powered Impersonation: The 1,400% Surge
Impersonation scams are not new. For years, fraudsters have posed as customer support agents, tax authorities, or government officials to extract money or personal details from victims. What changed in 2025 is the quality and scale of that impersonation, driven almost entirely by generative AI tools.
Chainalysis documented a 1,400% year-over-year growth in impersonation scam receipts. That is not a gradual trend. It is a near-vertical spike that reflects the sudden industrialisation of fraud. AI tools now allow criminals to generate convincing fake support chat conversations, produce deepfake video call footage of trusted executives or celebrities, and clone the voices of real people using just a few seconds of audio.
Consider the mechanics. A scammer targeting a victim who holds funds on a well-known exchange can now do the following. First, they send a realistic-looking phishing SMS or email. Then, when the victim calls the support number provided, they are connected to an AI-generated voice that sounds exactly like a real customer service agent. The AI guides the victim through ‘security steps’ that are actually fund transfer instructions.
According to Ledger’s 2026 scam analysis, social engineering has evolved from simple email phishing into a precise, personalised approach. Modern social engineering uses three coordinated fronts: digital identity spoofing, emotional manipulation through urgency or fear, and multi-channel pressure campaigns across SMS, chat apps, and fraudulent websites.
How Deepfakes Are Changing the Game
Deepfake technology has matured at a pace that has outrun most people’s awareness of it. Video and audio deepfakes that once required expensive hardware and specialist skills can now be generated in minutes using consumer-grade tools. Deepfake fraud now appears across every stage of a scam lifecycle.
At the top of the funnel, deepfake videos of celebrities endorsing fraudulent investment platforms flood social media. These videos are often indistinguishable from genuine content, especially when watched on mobile screens at a quick scroll. High-profile names in technology, finance, and entertainment have all had their likenesses hijacked for crypto fraud in 2025.
Further into the scam, deepfakes are used to maintain the deception. Investment scammers running so-called pig butchering operations, where they build trust with a victim over weeks or months before eventually stealing their funds, can now use video deepfakes to simulate face-to-face meetings. A victim who believes they have been video-calling a real relationship partner or financial adviser may be talking to a synthetic persona entirely.
As Trend Micro’s 2026 predictions report notes, AI chatbots, deepfake companions, and manipulated imagery are blurring the line between real and synthetic interactions. Increasingly, multi-channel scams, where victims are pulled from social media into encrypted chat apps and then onto fraudulent payment pages, have become the dominant attack pattern in 2026.
Pig Butchering: The Long Con Goes Industrial
Pig butchering, known in Chinese as sha zhu pan, is a particularly devastating form of cryptocurrency fraud. The name refers to the process of fattening a pig before slaughter. Scammers invest weeks or months building a romantic or platonic relationship with a target before eventually directing them toward a fraudulent investment platform that drains their funds.
Originally associated with large-scale scam compounds in Southeast Asia, pig butchering operations have evolved dramatically. AI tools now allow a single scammer to maintain convincing conversations with dozens of victims simultaneously. AI chat systems handle the routine emotional maintenance of the relationship, flagging escalation points to human operators when it is time to push toward an investment.
The financial damage is severe. Average payment sizes have risen sharply as criminals shifted away from spray-and-pray tactics toward fewer but more lucrative targets. Victims who have been carefully cultivated over weeks tend to invest far larger sums than those who are targeted cold. In one widely reported UK case, a man lost nearly $2.5 million in a single bitcoin scam that police described as exploiting ‘fear’ and ‘panic’ through sophisticated social engineering, as CoinDesk reported.
Furthermore, the geographic distribution of pig butchering operations is widening. While Southeast Asia remains a major hub, operations have been identified in Eastern Europe, West Africa, and increasingly within Western nations. The scam-as-a-service model means that local criminal networks can rent the infrastructure without building it themselves.
Phishing-as-a-Service: The Professionalisation of Fraud
One of the most significant structural shifts in crypto crime over the past two years is the rise of phishing-as-a-service platforms. These are commercial criminal enterprises that provide end-to-end fraud infrastructure to paying customers. A would-be scammer no longer needs technical skills. They simply subscribe to a service, choose their target type, and follow the provided playbook.
These platforms typically include customisable phishing page templates that mimic major exchanges and wallets, automated SMS sending tools with localised messaging, victim tracking dashboards, cryptocurrency wallet drainer scripts, and money laundering routing instructions. The level of sophistication rivals legitimate software-as-a-service businesses, complete with customer support and regular feature updates.
Wallet drainer scripts are a particularly dangerous component. According to Chainalysis, AI-enabled scams are characterised by higher incoming transfer rates and higher daily USD volumes because drainer tools can extract funds without requiring extended victim engagement. A single click on a malicious link can be enough to authorise a smart contract that empties a victim’s entire wallet.
The FBI’s Internet Crime Complaint Center (IC3) received over 69,000 complaints related to cryptocurrency fraud in 2024, with losses exceeding $5.6 billion in the US alone. Phishing and impersonation were among the most frequently cited attack vectors, confirming that the service-based model has made fraud accessible to a much wider pool of criminals.
| Scam Type | 2025 Growth Rate | Avg. Victim Loss | Primary AI Tool Used | Primary Target Profile |
|-----------|------------------|------------------|----------------------|------------------------|
| Impersonation scams | +1,400% YoY | $8,000 – $50,000+ | Voice cloning, chatbots | Retail crypto holders |
| Pig butchering | Steady high volume | $100,000 – $2.5M+ | AI chat automation | Affluent individuals, age 35-65 |
| Wallet drainer scripts | Rapidly growing | Full wallet balance | Smart contract automation | DeFi / NFT users |
| Deepfake investment fraud | Surging in 2025-26 | $10,000 – $500,000+ | Video/audio deepfakes | Retail investors via social media |
| High-volume impersonation (SMS) | Dominant volume | $500 – $5,000 | Smishing kits, AI copywriting | General population |
Sources: Chainalysis 2026 Crypto Crime Report [1], CoinDesk [2], Trend Micro [3], FBI IC3 [4].
Why AI Makes Crypto Scams 4.5x More Profitable
The 4.5x profitability multiplier that Chainalysis attributes to AI-enabled scams is not simply about automation. It reflects a fundamental shift in how attackers allocate their effort. Traditional scams were limited by human bandwidth. Each scammer could maintain only so many conversations, craft only so many messages, and pursue only so many leads at once.
AI removes that constraint almost entirely. A well-configured AI system can simultaneously manage hundreds of ongoing victim relationships, generate personalised messages that incorporate details the victim has previously mentioned, simulate emotional connection with persuasive consistency, and detect when a victim is wavering so it can flag them for human intervention. The conversion rate on a well-targeted, AI-assisted scam is therefore substantially higher than any human-operated equivalent.
Additionally, AI allows scammers to dramatically improve the quality of their fake materials. Previously, phishing pages were often identifiable by poor English, generic branding, or obvious template layouts. Today, AI-generated phishing pages are indistinguishable from the real platforms they mimic. AI-written messages are grammatically perfect, culturally calibrated, and emotionally resonant.
From a cost perspective, AI tools are also remarkably cheap relative to the returns they generate. Voice-cloning services are available for under $20 per month. Large language model APIs can process thousands of victim conversations for a few hundred dollars. The return on investment for a successful pig butchering operation running on AI infrastructure can be astronomical. Consequently, more criminal groups are investing in AI capabilities, creating a self-reinforcing cycle of escalation.
Address Poisoning and Technical Attack Vectors
Not all AI-driven crypto fraud targets human psychology. Some attacks exploit the technical habits of crypto users. Address poisoning is one of the most insidious examples. This technique exploits a common behaviour: copying and pasting wallet addresses from transaction history rather than entering them manually.
In an address poisoning attack, the scammer sends a tiny transaction, often worth fractions of a cent, from a wallet address that closely resembles one the victim regularly transacts with. The fake address typically shares the first several and last several characters with the legitimate one. When the victim copies what they believe is a recent transaction partner’s address from their history, they may copy the poisoned address instead.
As Ledger’s security guide explains, this attack requires no social engineering. It simply exploits a practical human habit. AI tools make address poisoning more scalable by automating the generation of similar-looking addresses and the broadcasting of poisoning transactions across many potential victims simultaneously.
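The weakness that address poisoning exploits can be shown in a few lines. The following sketch uses hypothetical addresses (not real wallets) to illustrate why checking only the first and last characters of an address is unsafe:

```python
def ends_match(a: str, b: str, n: int = 4) -> bool:
    """The risky shortcut: compare only the first and last n characters."""
    return a[:n] == b[:n] and a[-n:] == b[-n:]

def full_match(a: str, b: str) -> bool:
    """The safe check: the entire string must be identical."""
    return a == b

# Hypothetical addresses. The poisoned address is crafted to share the
# first and last characters of the legitimate one, differing in the middle.
legitimate = "0x1a2bc3d4e5f60718293a4b5c6d7e8f9012345678"
poisoned = "0x1a2b" + "0" * 30 + "345678"

print(ends_match(legitimate, poisoned))  # True  -> passes the lazy spot-check
print(full_match(legitimate, poisoned))  # False -> fails the real check
```

A poisoned address passing the partial check while failing the full comparison is exactly the scenario the attack relies on.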
Similarly, clipboard hijacking malware can silently replace a copied wallet address with a scammer’s address the moment a user hits paste. Users who do not verify the destination address before confirming a transaction may send funds directly to a criminal without any interaction with the scammer at all. This class of attack is particularly dangerous because it requires no trust-building phase.
Money Laundering Networks: The Infrastructure Behind the Fraud
Stealing cryptocurrency is only half the challenge for criminals. The other half is converting those stolen funds into usable money without triggering law enforcement action. The sophistication of crypto money laundering networks has grown in step with the scams they service.
According to Chainalysis, professional money laundering networks play a central role in the current scam ecosystem. These networks accept stolen crypto from multiple scam operations simultaneously, mixing funds from different sources to obscure their origins. They route funds through mixers and tumblers, decentralised exchanges with weak KYC requirements, cross-chain bridges, and peer-to-peer trading platforms.
The 2025 law enforcement actions highlighted in the Chainalysis report reflect this complexity. Two of the largest-ever crypto-related enforcement actions targeted scam-connected money laundering operations rather than the scammers themselves. Cases involving Jian Wen and Yadi Zhang demonstrated that even sophisticated multi-stage laundering operations leave on-chain traces that analytics firms can follow.
Nevertheless, the volume of illicit funds being laundered continues to grow faster than enforcement can respond. FATF (the Financial Action Task Force) has repeatedly flagged the gaps in virtual asset regulation across jurisdictions. Where one country tightens its rules, illicit flows tend to shift to less regulated markets. The global patchwork of crypto AML regulation remains a significant vulnerability.
Who Gets Targeted? Victim Profiles in 2026
Understanding who scammers target helps both individuals and institutions build better defences. Fraud is not random. Criminals research their targets, select the most promising ones, and tailor their approach accordingly. The data from 2025 reveals some clear patterns.
High-net-worth crypto holders are the primary targets for pig butchering and deepfake investment fraud. Scammers typically source their leads from data broker databases, social media activity, and leaked data from previous breaches. Someone who posts publicly about cryptocurrency gains, attends crypto conferences, or belongs to relevant LinkedIn groups is providing scammers with a ready-made target profile.
Older retail investors are disproportionately targeted by impersonation scams. Research consistently shows that older adults are more likely to trust official-sounding communications and less likely to be aware of current scam techniques. The emotional manipulation tactics used by AI-powered scammers, specifically exploiting fear of financial loss or the excitement of a once-in-a-lifetime opportunity, are highly effective against this demographic.
Meanwhile, DeFi and NFT users face distinct technical risks. Wallet drainer scripts and address poisoning attacks specifically target people who regularly interact with decentralised finance protocols and NFT marketplaces. These users are technically sophisticated enough to use self-custody wallets, which makes them prime targets for attacks that bypass human judgment entirely.
The Role of Social Media in Enabling Crypto Fraud
Social media platforms have become the primary hunting ground for crypto scammers. The combination of pseudonymity, algorithmic amplification, and weak content moderation creates an ideal environment for fraud. Moreover, the economics of social media advertising make it relatively cheap to reach large audiences with fraudulent content.
Deepfake celebrity endorsement videos are among the most prevalent forms of social media crypto fraud. These videos place well-known figures, from tech billionaires to popular financial commentators, in apparently genuine promotional content for fraudulent platforms. By the time platforms remove these videos, they have often been viewed millions of times and redirected thousands of potential victims to scam websites.
Telegram and WhatsApp channels play a different but equally important role. After an initial contact on a mainstream platform, scammers quickly move victims to encrypted messaging apps where there is less oversight and more privacy. Fake investment groups on these platforms create social proof by flooding channels with fabricated testimonials, fake profit screenshots, and AI-generated community activity.
Furthermore, LinkedIn has become a significant vector for professional-grade impersonation. Scammers create convincing fake profiles of financial advisers, wealth managers, and crypto fund executives. They target professionals through connection requests and investment opportunities that appear entirely plausible within a business networking context.
Regulatory Responses: AML, KYC, and the January 2026 Mandates
Regulators globally have been scrambling to keep pace with the evolution of crypto crime. Several significant regulatory developments in early 2026 are reshaping the compliance landscape for legitimate businesses and creating new friction for criminal operations.
January 2026 marked a significant enforcement threshold for AML and KYC requirements across multiple jurisdictions. Global banks in key markets are now required to disclose digital asset exposure. New CCPA rules in California target shady digital data brokers, cutting off a key source of victim contact information for scammers. These regulatory changes collectively increase the compliance burden on all participants in the crypto ecosystem.
The EU’s Markets in Crypto-Assets (MiCA) regulation, which came into full effect in late 2024, introduced comprehensive licensing requirements for crypto asset service providers across Europe. MiCA requires robust AML controls, consumer protection disclosures, and regular regulatory reporting. Compliance-focused exchanges operating under MiCA have substantially less exposure to money laundering than unregulated alternatives.
Despite these positive developments, significant gaps remain. Peer-to-peer crypto transactions, decentralised exchanges, and cross-border fund flows still operate largely outside the regulatory perimeter. Scammers are adept at routing funds through these unregulated channels, effectively side-stepping the compliance controls that apply to licensed businesses. Truly closing these loopholes requires coordinated international action that has so far proved elusive.
| Regulation / Body | Jurisdiction | Key Requirement | Impact on Crypto Crime |
|-------------------|--------------|-----------------|------------------------|
| MiCA | European Union | Licensing of crypto asset service providers; AML controls | Reduces laundering via EU-regulated exchanges |
| Travel Rule (FATF) | Global (partial) | Sharing of sender/recipient data for transfers above the threshold | Creates a data trail for law enforcement |
| Bank Secrecy Act / FinCEN rules | United States | SAR filing; MSB registration for crypto businesses | Increases compliance costs for scam operations |
| CCPA amendments (2026) | California, US | Stricter data broker rules; consumer data rights | Limits scammer access to victim contact data |
| FCA crypto registration | United Kingdom | Registration required for crypto firms; AML compliance | Creates accountability for UK-based platforms |
Sources: ESMA, FinCEN, FCA, California AG. Regulatory landscape evolving rapidly; verify current requirements with legal counsel.
Law Enforcement Fights Back: Record Seizures in 2025
Despite the scale of the problem, 2025 also brought some of the most significant crypto crime enforcement actions in history. Law enforcement agencies in multiple countries demonstrated that on-chain transparency, while not foolproof, does provide meaningful investigative tools.
Chainalysis documented two of the largest-ever crypto-related enforcement actions in 2025, both directly connected to scam money laundering operations. These cases illustrate a strategic shift in enforcement priorities. Targeting the money laundering layer, rather than individual scammers, disrupts entire criminal ecosystems at once. A single successful action against a professional laundering network can cut off revenue flows from dozens of active scam operations.
The US Department of Justice has significantly expanded its dedicated cryptocurrency fraud units. Similarly, Europol’s Joint Cybercrime Action Taskforce has increased its focus on crypto-enabled fraud networks. International cooperation on evidence-sharing and asset freezing is improving, though it remains hampered by jurisdictional complexity.
Importantly, blockchain analytics have become central to most major crypto crime investigations. Tools from firms like Chainalysis, Elliptic, and TRM Labs allow investigators to trace fund flows across hundreds of wallet hops, identify clustering patterns that reveal criminal network structures, and ultimately link on-chain activity to real-world identities. This analytical capability is one of the most powerful deterrents available to law enforcement.
How to Protect Yourself From AI-Driven Crypto Scams
Awareness is your first line of defence. Understanding how these scams work significantly reduces your vulnerability. Below are the most important protective measures for anyone who holds or trades cryptocurrency in 2026.
Verify Before You Trust
Never act on unsolicited contact. Whether it arrives via text, email, social media, or phone call, treat any unexpected approach involving cryptocurrency with extreme scepticism. Legitimate exchanges, government agencies, and financial institutions do not contact you demanding immediate action. If you receive such a message, contact the organisation directly using contact details from their official website, not details provided in the message itself.
Furthermore, always verify wallet addresses character by character before confirming any transaction. Never rely solely on copying from your transaction history. Use address book features in trusted wallets and double-check the full string, not just the first and last few characters, since address poisoning attacks specifically exploit partial checking.
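The address book habit can be reduced to a simple exact-match rule. The sketch below is illustrative only; the label and address are hypothetical placeholders, not real destinations:

```python
# Hypothetical address book mapping trusted labels to saved addresses.
ADDRESS_BOOK = {
    "exchange_deposit": "0x9f8e7d6c5b4a39281706f5e4d3c2b1a098765432",
}

def safe_to_send(label: str, pasted_address: str) -> bool:
    """Allow a transfer only if the pasted address exactly matches the
    saved entry, guarding against clipboard swaps and poisoned history."""
    saved = ADDRESS_BOOK.get(label)
    return saved is not None and pasted_address.strip().lower() == saved.lower()

print(safe_to_send("exchange_deposit",
                   "0x9f8e7d6c5b4a39281706f5e4d3c2b1a098765432"))  # True
print(safe_to_send("exchange_deposit",
                   "0x9f8e000000000000000000000000000098765432"))  # False
```

The point is the exact full-string comparison: anything short of it leaves room for a look-alike address to slip through.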
Secure Your Devices and Accounts
Enable two-factor authentication (2FA) on every exchange account and email address. Use an authenticator app rather than SMS-based 2FA, as SIM-swapping attacks can compromise SMS codes. Consider a hardware security key for high-value accounts. Keep your devices updated, use reputable antivirus software, and be alert to clipboard hijacking by always reading the pasted address after pasting, before confirming any transaction.
For significant cryptocurrency holdings, a hardware wallet is not optional; it is essential. Hardware wallets keep private keys offline and require physical confirmation of every transaction. No remote attacker, no matter how sophisticated their AI tools, can drain a hardware wallet without physical access to the device.
Be Sceptical of Investment Opportunities
If an investment opportunity finds you rather than you finding it, be highly suspicious. Legitimate investment returns are never guaranteed. Any platform or individual promising consistent high returns, special access to exclusive deals, or recovery of previous scam losses is almost certainly fraudulent. The FCA’s ScamSmart tool allows UK residents to check whether a firm is authorised and whether it has been flagged for fraudulent activity.
Similarly, be cautious of online relationships that quickly move toward financial topics. Whether the contact presents as a romantic interest, a new business partner, or a fellow crypto enthusiast, any relationship that progresses toward investment advice or fund transfers within a short period should raise immediate red flags. Consult a trusted friend, family member, or qualified financial adviser before making any significant financial decision based on an online relationship.
What Exchanges and Platforms Are Doing
The responsibility for fighting crypto crime does not rest solely with individual users. Exchanges, wallet providers, and DeFi platforms all have a role to play. In 2025 and 2026, several industry initiatives have shown meaningful progress.
Major exchanges have invested heavily in transaction-monitoring systems that flag suspicious wallet addresses, unusual transfer patterns, and known scam-linked addresses in real time. Some platforms now include warning prompts when users attempt to send funds to newly created wallets or wallets associated with previous fraud reports. While these controls are not foolproof, they do provide a meaningful speed bump that prevents some transactions from completing.
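A rule layer of the kind described above can be sketched in a few lines. The blacklist entry and thresholds here are illustrative placeholders, not any exchange's actual policy:

```python
# Illustrative blacklist; real systems consume curated fraud-report feeds.
KNOWN_SCAM_ADDRESSES = {"0xdeadbeef00000000000000000000000000000000"}

def flag_withdrawal(dest, amount_usd, dest_wallet_age_days,
                    large_threshold_usd=10_000):
    """Return the list of reasons a withdrawal should trigger a warning."""
    reasons = []
    if dest in KNOWN_SCAM_ADDRESSES:
        reasons.append("destination linked to prior fraud reports")
    if dest_wallet_age_days < 1:
        reasons.append("destination wallet newly created")
    if amount_usd >= large_threshold_usd:
        reasons.append("unusually large transfer")
    return reasons

# A large first-time transfer to a brand-new, blacklisted wallet
# trips all three rules at once.
print(flag_withdrawal("0xdeadbeef00000000000000000000000000000000", 15_000, 0))
```

Production systems layer many more signals on top (velocity, device fingerprints, behavioural models), but the shape is the same: accumulate reasons, then warn or delay rather than silently block.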
Proactive user education campaigns have also expanded. Coinbase, Binance, Kraken, and other major platforms regularly publish guidance on current scam techniques, maintain blacklists of known fraudulent addresses, and operate dedicated fraud support teams. Some exchanges have introduced cooling-off periods for large first-time withdrawals to high-risk destinations, giving victims time to reconsider.
At the protocol level, some DeFi projects have introduced simulation tools that allow users to preview the outcome of a smart contract interaction before signing it. This directly addresses the wallet drainer vulnerability by making the consequences of signing a malicious contract visible before any funds are at risk. Wider adoption of transaction simulation across the DeFi ecosystem could substantially reduce losses from technical attack vectors.
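Simulation tools of this kind generally work by executing the contract call against a fork of current chain state and then diffing the resulting balances. A simplified sketch of that diffing step, with hypothetical balance figures:

```python
def diff_balances(before: dict, after: dict) -> dict:
    """Net change per token that signing the transaction would cause."""
    tokens = set(before) | set(after)
    return {t: after.get(t, 0) - before.get(t, 0) for t in tokens}

def looks_like_drainer(before: dict, delta: dict, threshold: float = 0.9) -> bool:
    """Warn if the transaction would remove most of any token balance."""
    return any(
        before.get(t, 0) > 0 and -change >= threshold * before[t]
        for t, change in delta.items()
    )

# Hypothetical simulated state: the user expects a small swap, but the
# simulation shows the contract would empty the wallet.
before = {"ETH": 5.0, "USDC": 12_000.0}
after = {"ETH": 0.01, "USDC": 0.0}

delta = diff_balances(before, after)
print(looks_like_drainer(before, delta))  # True -> show a warning before signing
```

The value of the approach is that the warning appears before the signature is produced, which is the last moment at which the user can still walk away.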
The Road Ahead: Will Security Ever Catch Up?
The central tension in the 2026 crypto crime landscape is a familiar one from other technological domains: offence moves faster than defence. Scammers adopt new AI tools immediately, unencumbered by compliance requirements, ethical constraints, or organisational bureaucracy. Defence responses, by contrast, require coordination across regulators, platforms, law enforcement, and users, all of which take time.
However, there are meaningful reasons for cautious optimism. Blockchain analytics capabilities continue to improve rapidly. On-chain transparency remains a fundamental property of public blockchains that no amount of mixing or routing can fully eliminate. As analytics tooling matures, the traceability of stolen funds will only increase.
Additionally, AI-based fraud detection is advancing on the defensive side, too. Platforms are deploying machine learning models that can identify suspicious transaction patterns, flag AI-generated communication content, and detect behavioural anomalies that signal an account may have been compromised. The same technology that empowers scammers can, properly applied, be used to identify and stop them.
The regulatory trajectory is also broadly positive. As major jurisdictions implement comprehensive crypto AML frameworks, the compliance cost for criminal operations rises while their operational flexibility decreases. Ultimately, however, no amount of technology or regulation can fully substitute for informed, sceptical users who understand the risks they face. Education remains the most scalable defence available.
Key Takeaways for Crypto Holders in 2026
1. The scale is unprecedented. Crypto scam losses are projected to exceed $17 billion in 2025, driven primarily by AI-enabled fraud.
2. AI has industrialised fraud. Deepfakes, voice cloning, and automated chat tools make scams far more convincing and scalable than anything seen before.
3. Impersonation is the primary attack vector. A 1,400% growth rate in impersonation scams means the most dangerous threat is not a hack but a convincing fake.
4. Technical attacks bypass human judgment. Address poisoning and wallet drainers can steal funds without any sustained victim interaction.
5. Hardware wallets are essential for significant holdings. They represent the most reliable barrier against remote fund theft currently available.
6. Verify everything independently. Never trust contact details provided in unsolicited messages, and always verify wallet addresses character by character.
7. Regulation is improving, but gaps remain. MiCA, FATF travel rules, and tightened AML requirements are raising the bar, but decentralised and peer-to-peer channels remain weakly regulated.
8. Report scams promptly. Reporting to platforms, national cybercrime units, and bodies like the FBI, IC3, or Action Fraud in the UK helps law enforcement track and disrupt criminal networks.
Take some time to invest in your financial future.
To deepen your understanding of today’s evolving financial landscape, we recommend exploring the following articles:
Negotiate Credit Card Debt: Lower Rates and Balances
Why Did My Credit Score Drop? 11 Common Reasons & How to Fix
War Economy Chapter 14: Volatility Explained – Why Prices Swing Wildly in Wartime
Bitcoin Climate Impact 2026: Carbon Emissions, Water Use & Mining Footprint Explained
Explore these articles to get a firmer grasp of the changes reshaping the financial world.
Disclaimer
This article is for informational and educational purposes only. It does not constitute financial, legal, or investment advice. Cryptocurrency investments carry significant risk. Always consult a qualified financial adviser before making investment decisions. The statistics cited reflect publicly available research and may be subject to revision.
References
[1] Chainalysis, ‘2026 Crypto Crime Report: Scams,’ Chainalysis Blog, January 2026.
[4] Ledger, ‘The State of Crypto Scams in 2026,’ Ledger Academy, updated February 5, 2026.
[5] FBI Internet Crime Complaint Center (IC3), ‘2024 Internet Crime Report,’ FBI, 2025.
[9] US Department of Justice, ‘Cryptocurrency Fraud,’ DOJ Criminal Division, 2025.
[10] Europol, ‘Cybercrime,’ Europol Crime Areas, 2025.
[11] FBI, ‘FBI Warns of Fraudulent Schemes Leveraging Cryptocurrency,’ FBI Press Release, 2024.
[12] FCA, ‘ScamSmart: Avoid Investment and Pension Scams,’ Financial Conduct Authority, 2026.
[13] CISA, ‘More Than a Password: Implement Multi-Factor Authentication,’ CISA, 2025.
[14] IBM Security, ‘Fraud Protection Solutions,’ IBM, 2025.
[15] FTC, ‘Data Brokers: A Call for Transparency and Accountability,’ Federal Trade Commission, 2024.


