AI-powered "undressing" deepfake tools are increasingly being outlawed. Here's how blockchain analysis can help track and prosecute the people behind them.
In the digital age, anonymity has become a sought-after feature in various online transactions, particularly in the realm of cryptocurrencies. However, this veil of secrecy has also provided a breeding ground for nefarious activities, including the proliferation of so-called "undresser bots": AI tools that generate non-consensual deepfake imagery.
Although many of these tools promote cryptocurrency as the most anonymous way to pay, the payment addresses they use have been identified, making it possible to trace both users and operators. This opens avenues for investigating the financial infrastructure supporting AI-based non-consensual intimate imagery (NCII) and deepfake creation platforms within the crypto space.
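Because those payment addresses are recorded on a public blockchain, anyone can enumerate the transactions paying into them. Below is a minimal Python sketch using Blockstream's public Esplora API; the payment address shown is a hypothetical placeholder, not one tied to any real service.

```python
# Minimal sketch: list confirmed payments into a known on-chain address
# via Blockstream's public Esplora API. SERVICE_ADDR is a hypothetical
# placeholder, not a real service's payment address.
import requests

ESPLORA = "https://blockstream.info/api"
SERVICE_ADDR = "bc1qexampleplaceholderaddress"  # hypothetical

def incoming_payments(address: str) -> list[dict]:
    """Return transactions (first page of results) that credit the address."""
    txs = requests.get(f"{ESPLORA}/address/{address}/txs", timeout=30).json()
    payments = []
    for tx in txs:
        # Sum only the outputs that pay this address.
        credited = sum(
            out["value"] for out in tx["vout"]
            if out.get("scriptpubkey_address") == address
        )
        if credited:
            payments.append({"txid": tx["txid"], "sats": credited})
    return payments

for p in incoming_payments(SERVICE_ADDR):
    print(p["txid"], p["sats"])
```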
Blockchain analytics platforms, such as Investigator, play a crucial role in this pursuit. By examining transaction patterns and the addresses involved in payments or fund transfers to these illegal AI services, investigators can trace wallet addresses, analyze transaction flows, and identify clusters of addresses linked to known malicious actors or services.
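One foundational technique behind such clustering is the common-input-ownership heuristic: addresses spent together as inputs to the same transaction are assumed to share a controller. The sketch below illustrates that single heuristic; real analytics platforms layer many heuristics and curated entity labels on top of it, so treat this as an illustration only.

```python
# Address clustering via the common-input-ownership heuristic,
# implemented with a disjoint-set (union-find) structure.
from collections import defaultdict

class UnionFind:
    """Disjoint-set structure for merging address clusters."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def cluster_addresses(transactions):
    """transactions: iterable of {'inputs': [address, ...]} dicts."""
    uf = UnionFind()
    for tx in transactions:
        inputs = tx["inputs"]
        for addr in inputs:
            uf.find(addr)              # register every input address
        for addr in inputs[1:]:
            uf.union(inputs[0], addr)  # co-spent inputs -> same controller
    clusters = defaultdict(set)
    for addr in list(uf.parent):
        clusters[uf.find(addr)].add(addr)
    return list(clusters.values())

# Toy example: address "B" is co-spent in two transactions, so A, B and C
# merge into one cluster while D stays separate.
txs = [{"inputs": ["A", "B"]}, {"inputs": ["B", "C"]}, {"inputs": ["D"]}]
print(cluster_addresses(txs))  # e.g. [{'A', 'B', 'C'}, {'D'}]
```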
Legal measures have also been enacted to combat this issue. For instance, the San Francisco city attorney's office has filed lawsuits against multiple websites and applications producing AI-generated NCII, aiming to compel domain registrars, web hosts, and payment processors to cooperate. Meta (parent of Facebook and Instagram) has sued the maker of the CrushAI app, which creates sexually explicit deepfakes, targeting the promotion and advertising of AI tools that generate non-consensual sexual imagery used in sextortion and blackmail.
Internationally, legal action has extended to juvenile courts: in Almendralejo, Spain, minors were sentenced for using AI tools to create and distribute non-consensual deepfake images of their peers. These examples illustrate a global recognition of, and response to, the problem.
The cost of generating deepfakes varies, with $1 typically buying enough credits to "undress" two to four images; discounts are often offered for bulk purchases. The tools use deepfake technology to "undress" the person in an uploaded image or depict them in sexually explicit poses or scenarios. They have been used in crimes including revenge porn, sextortion, the generation of child sexual abuse material (CSAM), and potentially "pig butchering" scams.
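For a sense of scale, the per-image economics implied by that pricing are straightforward to work out; the bulk-discount rate below is an assumed figure for illustration only.

```python
# Back-of-envelope per-image cost implied by the cited pricing:
# $1 buys credits for 2-4 images, with bulk discounts sometimes offered.
def cost_per_image(dollars: float, images: int, bulk_discount: float = 0.0) -> float:
    """Effective USD per generated image after any fractional discount."""
    return dollars * (1 - bulk_discount) / images

print(cost_per_image(1, 2))        # 0.50 USD at the low end
print(cost_per_image(1, 4))        # 0.25 USD at the high end
print(cost_per_image(1, 4, 0.20))  # 0.20 USD with an assumed 20% bulk discount
```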
Despite these laws and regulations, many undresser bots have resurfaced on bespoke websites or under fresh usernames, or have launched localized versions to bypass blocking attempts. They commonly accept card payments, e-wallets, mobile payment solutions, and Telegram Stars. Just over half of the tools also accepted crypto, with some even offering discounts for crypto payments.
Almost 80% of these crypto payments are eventually sent to accounts at centralized exchanges with know-your-customer (KYC) requirements, underscoring the scope for disrupting these operations.
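A figure like that can be produced by tracing each payment to its terminal entity and aggregating by label. The sketch below shows the aggregation step only; the labels and amounts are entirely illustrative.

```python
# Aggregate traced payment flows by the label of their terminal entity
# to estimate the share reaching KYC exchanges. Data is illustrative.
from collections import Counter

# Each traced flow: (terminal-entity label, amount in USD). Hypothetical.
traced_flows = [
    ("kyc_exchange", 120.0),
    ("kyc_exchange", 310.0),
    ("unhosted_wallet", 45.0),
    ("mixer", 60.0),
    ("kyc_exchange", 200.0),
]

totals = Counter()
for label, usd in traced_flows:
    totals[label] += usd

total = sum(totals.values())
for label, usd in totals.items():
    print(f"{label}: {usd / total:.0%}")
# kyc_exchange: 86%, unhosted_wallet: 6%, mixer: 8%  (illustrative only)
```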
Legal responses continue to build: the TAKE IT DOWN Act, a bipartisan US federal law prohibiting the non-consensual sharing of explicit images on social media, was signed into law by President Trump in May 2025.

Even so, challenges remain. Europol's 2024 Internet Organised Crime Threat Assessment (IOCTA) warned that "AI-assisted CSAM" will pose challenges to police investigations by increasing volumes of illegal content and making it more difficult to identify victims.
In conclusion, the combination of blockchain analytics and legal frameworks is proving to be an effective strategy in the ongoing battle against AI-generated deepfakes in the crypto ecosystem. As the landscape continues to evolve, it is crucial for law enforcement, regulators, and technology firms to collaborate and adapt their strategies to combat these harmful AI services across jurisdictions.
- Security concerns in the crypto space have grown with the proliferation of "undresser bots", AI tools that generate non-consensual deepfake imagery.
- Blockchain analytics platforms like Investigator are essential for tracing wallet addresses and analyzing transaction flows linked to illegal AI services in the crypto world.
- Legal action against AI-generated non-consensual intimate imagery (NCII) and deepfake creation platforms has been taken globally, including by the San Francisco city attorney's office and by Meta (parent of Facebook and Instagram) against the makers of deepfake apps.
- Despite laws and regulations, undresser bots have found ways to bypass blocking attempts; just over half also accept cryptocurrency, and almost 80% of those crypto payments are eventually sent to accounts at centralized KYC exchanges.
- The evolving landscape of AI-generated deepfakes necessitates ongoing collaboration between law enforcement, regulators, and technology firms to adapt strategies and combat these harmful AI services across jurisdictions.