AI Music Fraud Explained: The $9M Streaming Scam and How to Protect Your Revenue
What Is AI Music Fraud? Inside the $9 Million Streaming Scam
If you're searching for "AI music fraud," you probably want to know three things: how the scam actually worked, how it affects you and the industry, and what you can do to protect yourself as a legitimate artist. This article answers all three.
In 2024, the U.S. Department of Justice arrested North Carolina-based music producer Michael Smith on charges of wire fraud, wire fraud conspiracy, and money laundering conspiracy. The scheme netted approximately $9 million, making it the first criminal case in the United States involving AI-powered streaming fraud, and it made headlines worldwide.
How the Scam Worked: Mass AI Track Generation and Bot Manipulation
The fraud was remarkably systematic. Here's how the scheme generated its illicit revenue, step by step:
- Mass track generation using AI tools: Using AI music generation software, Smith created tens of thousands of tracks at a pace no human could match.
- Fake artist accounts for distribution: He created multiple fictitious artist identities and uploaded the tracks en masse to major streaming platforms including Spotify, Apple Music, and Amazon Music.
- Bot-driven stream inflation: Automated software bots simulated human listening activity on these tracks. Spotify alone reportedly recorded hundreds of millions of fraudulent streams.
- Collecting royalties as seemingly legitimate income: Streaming platforms' algorithms couldn't distinguish bot plays from real listeners, so royalty payments were issued based on the inflated stream counts.
The scheme allegedly involved AI tool developers and distribution service insiders, making this far more than a one-man operation — it was an organized criminal enterprise. According to the DOJ, Smith ran the fraud continuously from 2017 to 2024, a span of seven years.
Why Independent Artists Take the Hardest Hit
You might think this doesn't affect you personally — but this fraud causes real, direct financial harm to independent artists. To understand why, you need to know how streaming royalties are actually calculated.
The Pro-Rata Royalty Pool: How Streaming Pays Out
Major platforms like Spotify and Apple Music pool all subscriber revenue and distribute it based on each artist's share of total streams on the platform. This is called the pro-rata model.
When fraudulent streams flood the pool, they eat into the share that legitimate artists would otherwise receive. Even if your music performs well, bot-inflated streams from fake accounts effectively dilute your cut. It's a structural problem baked into how royalties are calculated.
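To make the dilution concrete, here's a minimal sketch of the pro-rata math in Python. Every number in it (the pool size, the stream counts, the volume of bot traffic) is an illustrative assumption, not a real platform figure:

```python
# Illustrative pro-rata royalty math. All figures are made-up
# assumptions for demonstration, not real platform data.

royalty_pool = 1_000_000.00   # total subscriber revenue to distribute ($)
total_streams = 250_000_000   # all streams on the platform this period
your_streams = 50_000         # your (legitimate) streams

def pro_rata_payout(pool: float, your_plays: int, all_plays: int) -> float:
    """Your payout is the pool times your share of total streams."""
    return pool * your_plays / all_plays

clean = pro_rata_payout(royalty_pool, your_streams, total_streams)

# Inject 10 million bot streams into the same pool: your own play
# count is unchanged, but your share of the pool shrinks.
bot_streams = 10_000_000
diluted = pro_rata_payout(royalty_pool, your_streams, total_streams + bot_streams)

print(f"Payout without fraud:    ${clean:.2f}")            # $200.00
print(f"Payout with bot streams: ${diluted:.2f}")          # $192.31
print(f"Lost to dilution:        ${clean - diluted:.2f}")  # $7.69
```

The exact dollar amounts don't matter; what matters is that the payout shrinks even though nothing about the legitimate artist's streams changed.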
"Bot streaming isn't just fraud — it's theft from every honest artist on the platform." — Statement from the RIAA (Recording Industry Association of America)
The Collateral Damage Risk: Legitimate Artists Getting Flagged
Platforms like Spotify have strengthened their AI-detection algorithms in response to fraud, but those systems aren't perfect: legitimate independent artists have reported having their tracks removed or accounts frozen due to false positives. When Spotify conducted a large-scale removal of AI-generated content in 2024, a number of genuine creators were caught in the dragnet.
How Streaming Platforms Are Responding
So what are the major platforms actually doing about this?
Spotify's Approach
According to the Spotify Newsroom, the company is continuously improving its machine-learning-based bot detection systems and has cut off royalty payments to accounts found to have fraudulent streams. In 2024, Spotify also introduced a new policy eliminating royalty payments for tracks with fewer than 1,000 streams in the previous 12 months, a move partly aimed at discouraging AI track farming.
Apple Music and Amazon Music
Both Apple and Amazon operate their own fraud detection systems and have ramped up monitoring of suspicious accounts. They've also tightened vetting requirements for distribution partners, applying stricter scrutiny to accounts that upload large volumes of tracks in short timeframes.
Action at the Distributor Level
Independent distribution services like DistroKid, TuneCore, and CD Baby have all strengthened their fraud detection, account suspension, and payment hold protocols. The Smith case implicated distribution insiders as co-conspirators, putting the entire industry on notice and accelerating enforcement efforts across the board.
Practical Steps Independent Artists Can Take Right Now
Here's what you can do today to protect your music and your income as a legitimate independent artist.
① Choose a Reputable Distributor
Start by distributing your music through an established, trusted service: DistroKid, TuneCore, CD Baby, and LANDR all have direct relationships with major platforms and transparent revenue reporting. Once your music is live, check your dashboard regularly and watch for any unusual spikes or drops in streams or earnings.
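If you'd rather automate that check, here's a rough sketch that scans a daily-streams export for sharp swings. The CSV layout (a `date` column and a `streams` column) and the filename are hypothetical stand-ins, since every distributor formats its exports differently:

```python
# Rough sketch: flag sharp day-over-day swings in a daily-streams
# export. The CSV columns (date, streams) are a hypothetical layout;
# adapt them to whatever your distributor actually exports.
import csv

def flag_swings(path: str, factor: float = 3.0) -> None:
    """Print any day whose stream count jumped or dropped by more
    than `factor` relative to the previous day."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    for prev, curr in zip(rows, rows[1:]):
        before, after = int(prev["streams"]), int(curr["streams"])
        if before > 0 and not (1 / factor <= after / before <= factor):
            print(f"{curr['date']}: {before} -> {after} streams")

flag_swings("daily_streams.csv")  # placeholder filename
```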
② Register Your Copyright
In the US, registering with the U.S. Copyright Office gives you legal standing to pursue infringement claims and is essential if you ever need to take action in court. Joining a performing rights organization such as ASCAP, BMI, or SESAC in the US (or your country's equivalent, like PRS or GEMA) helps collect performance royalties and monitor usage. Make sure your metadata (artist name, ISRC codes, UPC codes) is accurate and complete across every release.
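If you want a quick way to sanity-check those codes before a release goes out, here's a small sketch based on the public ISRC layout (two-letter country code, three-character registrant code, two-digit year, five-digit designation) and the 12-digit UPC-A check digit. Treat it as a convenience, not a substitute for your distributor's own validation:

```python
# Sanity checks for release metadata codes, following the published
# ISRC layout (CC-XXX-YY-NNNNN) and the UPC-A mod-10 check digit.
import re

ISRC_RE = re.compile(r"^[A-Z]{2}[A-Z0-9]{3}\d{7}$")  # e.g. USRC17607839

def valid_isrc(code: str) -> bool:
    """Check the 12-character ISRC shape (dashes stripped first)."""
    return bool(ISRC_RE.match(code.replace("-", "").upper()))

def valid_upc(code: str) -> bool:
    """Check a 12-digit UPC-A, including its mod-10 check digit."""
    if not (code.isdigit() and len(code) == 12):
        return False
    digits = [int(c) for c in code]
    total = 3 * sum(digits[0:11:2]) + sum(digits[1:11:2]) + digits[11]
    return total % 10 == 0

print(valid_isrc("US-RC1-76-07839"))  # True
print(valid_upc("036000291452"))      # True
```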
③ Document Your Work Before You Release It
Before releasing any track, save the completed audio file with a timestamp — via cloud storage version history, email to yourself, or a dedicated backup service. This establishes a clear record that you are the original creator, which can be critical if your work is ever disputed or misappropriated.
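One lightweight way to do this is to log a cryptographic fingerprint of the finished file alongside a UTC timestamp, then keep that log (and the file itself) in dated cloud storage or an email to yourself. Here's a minimal sketch; the filenames are placeholders, and note that this is supporting evidence of authorship, not a formal registration:

```python
# Minimal sketch: record a SHA-256 fingerprint of the finished audio
# file with a UTC timestamp, appended to a running log file.
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def fingerprint(path: str) -> str:
    """Return a timestamped SHA-256 line for the given file."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    stamp = datetime.now(timezone.utc).isoformat()
    return f"{stamp}  sha256={digest}  file={path}"

with open("release_log.txt", "a") as log:               # placeholder log name
    log.write(fingerprint("my_track_final.wav") + "\n")  # placeholder track
```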
④ Watch for Suspicious Stream Activity
If you suddenly notice a flood of plays coming from countries or regions that have no connection to your audience, pay attention. This could indicate that your tracks are being misused, or that your account has been compromised in some way. Contact your distributor and the platform's artist support team immediately if something looks off.
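As a rough illustration of that kind of review, here's a sketch that flags countries outside your known audience that suddenly account for a large share of plays. The CSV layout (`country` and `streams` columns), the filename, and the 20% threshold are all assumptions you'd adapt to your own analytics export:

```python
# Rough sketch: flag countries outside your expected audience that
# account for an outsized share of streams. Column names, filename,
# and threshold are assumptions; adapt them to your real export.
import csv
from collections import Counter

def flag_unexpected_countries(path: str, expected: set[str],
                              share_threshold: float = 0.20) -> None:
    """Warn when a country not in `expected` exceeds the threshold
    share of total streams."""
    counts: Counter[str] = Counter()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["country"]] += int(row["streams"])
    total = sum(counts.values())
    for country, n in counts.most_common():
        if total and country not in expected and n / total >= share_threshold:
            print(f"Check {country}: {n:,} streams ({n / total:.0%} of total)")

flag_unexpected_countries("streams_by_country.csv", expected={"US", "GB", "DE"})
```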
⑤ Disclose AI Use Transparently
Spotify and Apple Music are increasingly requiring disclosure when AI-generated content is involved. If you use AI tools in your creative process, being upfront about it while ensuring your work reflects genuine human creativity is the best long-term strategy for keeping your account in good standing.
AI and Music Production: Using It the Right Way
The Michael Smith case isn't an indictment of AI itself — the problem was weaponizing AI for fraud, not using it as a creative tool. Used ethically, AI can be an incredibly powerful partner for independent artists, boosting both the quality and efficiency of your production workflow.
For example, LA Studio is a browser-based DAW that offers AI-powered features like vocal removal, stem separation, noise reduction, and Auto-Tune — all completely free with no installation required. Using tools like these to elevate your original music and release it through legitimate channels is exactly how healthy participation in the music ecosystem looks.
If you want to explore what AI can do for your production, try the stem splitter or vocal remover to analyze reference tracks or create practice materials.
What This Case Means for the Future of the Music Industry
This landmark criminal case marks a genuine turning point for the industry. Here's what we can expect to accelerate in the coming years:
- More sophisticated fraud detection: Platforms will keep advancing their machine learning models, improving their ability to identify bot activity
- Stricter distributor gatekeeping: Expect tighter controls on bulk catalog uploads and new account verification requirements
- Mandatory AI disclosure: Regulatory momentum in the US and Europe is building toward legally required disclosure of AI involvement in released music
- Blockchain-based rights management: Immutable, decentralized ownership records for music are gaining traction as a way to verify authenticity
- A shift toward user-centric royalties: The debate over replacing pro-rata with a model that pays artists based on what specific listeners actually choose to hear is picking up steam
Staying informed about these shifts is essential if you're building a long-term career as an independent artist.
Summary
The $9 million AI music fraud case is a landmark example of just how organized and large-scale AI-powered crime can become. For independent artists, the threat is twofold: direct loss of royalty income and the risk of being caught up in platform crackdowns. But the path forward is clear. Work with trusted distributors, register your copyrights, keep your metadata accurate, and stay alert to unusual activity on your releases — these steps go a long way toward protecting both your music and your livelihood.
AI is a powerful ally when used with integrity. Take advantage of legitimate tools like LA Studio to strengthen your original work, and share it with the world through the right channels.
Related: Suno v5.5 Guide 2026 — make free AI songs with your own voice
Frequently Asked Questions
Q. Could this type of AI music fraud happen outside the US?
A. Absolutely. Spotify, Apple Music, and other major platforms operate globally, and the same bot-streaming techniques are technically feasible anywhere. That said, platform monitoring has intensified significantly following this case, and performing rights organizations around the world are keeping a close eye on developments. If you're operating legitimately, there's no need to panic — but it's smart to regularly review your streaming analytics for anything unusual.
Q. Is it illegal to release music made with AI?
A. No. Using AI in your creative process is not illegal. What crossed the line in this case was: ① using bots to artificially inflate streams and collect fraudulent royalties, ② allegedly training AI on copyrighted material without authorization, and ③ operating fictitious artist identities as part of an organized fraud scheme. Transparently using AI as a tool and distributing your music through legitimate channels is perfectly fine.
Q. What should I do if I notice suspicious stream activity on my account?
A. First, contact your distributor's support team (DistroKid, TuneCore, etc.) and report what you're seeing. Then file a report through the relevant platform's artist support channel (on Spotify, this is through the Spotify for Artists help center). If bot streams are happening without your knowledge or involvement, reporting early gives you the best chance of avoiding account penalties.
Q. Is Spotify's policy of cutting royalties for tracks under 1,000 streams good or bad for indie artists?
A. It's genuinely mixed. On one hand, it helps filter out AI-farmed content and redirects the royalty pool toward artists with real audiences — a net positive for established independents. On the other hand, it removes the ability for early-career artists to earn anything while building their listener base. Most industry analysts see the fraud-reduction benefits as outweighing the downsides long-term, but there's no question it makes the early stages of an artist's career even tougher.
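For a rough sense of scale: Spotify doesn't publish a fixed per-stream rate, but at the commonly cited ballpark of $0.003 to $0.005 per stream (an assumption, not an official figure), a track sitting just under the threshold forgoes only a few dollars:

```python
# Back-of-the-envelope math on the 1,000-stream threshold. The
# per-stream rates are commonly cited ballpark assumptions; Spotify
# does not publish a fixed rate.
for rate in (0.003, 0.005):
    forgone = 999 * rate  # the most a sub-threshold track could have earned
    print(f"At ${rate}/stream, a 999-stream track forgoes about ${forgone:.2f}")
# Prints roughly $3.00 and $5.00: small per track, but meaningful in
# aggregate across catalogs of many early-stage releases.
```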
Q. How can I tell if my music is being used without my permission in a fraud scheme?
A. Regularly search your artist and track names across platforms, and review the reports in your distributor dashboard. You can also set up Google Alerts for your artist name, ask your distributor about enabling YouTube Content ID to monitor video usage, and consider a dedicated monitoring service like Soundcharts to track your music across platforms in one place. If you're registered with a performing rights organization, they'll also monitor public usage on your behalf.