AI music fraud scheme nets millions
A man has admitted orchestrating a large-scale fraud that used artificial intelligence to generate songs and automated bots to stream them on digital platforms, earning more than $8 million in illicit royalties, according to prosecutors.
Michael Smith, 52, entered a guilty plea in a United States federal court to charges linked to wire fraud and money laundering after authorities uncovered what they described as one of the most sophisticated streaming manipulation schemes to date. Investigators said the operation involved producing vast libraries of AI-generated tracks and inflating their play counts through coordinated bot activity, exploiting payment systems used by major music streaming services.
Prosecutors alleged that Smith’s scheme relied on creating tens of thousands of songs using generative AI tools capable of producing instrumental tracks at scale. These compositions were then uploaded under fictitious artist identities to streaming platforms, where automated accounts were programmed to repeatedly play the songs around the clock. The activity, they said, was designed to mimic legitimate listening behaviour while maximising royalty payouts tied to stream counts.
Federal officials noted that the operation ran for several years and generated billions of artificial streams. Each stream contributed fractions of a cent in royalties, but the cumulative effect translated into millions of dollars in payments distributed through rights management systems. Authorities stressed that while the content and listeners were fabricated, the financial gains represented a direct loss to the broader music ecosystem, including legitimate artists and rights holders.
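The arithmetic of the scheme can be checked on the back of an envelope. The per-stream rate and stream count below are illustrative assumptions, not figures from the case, but they show how fractions of a cent compound into the sums prosecutors described:

```python
# Back-of-the-envelope check of the scale described by prosecutors.
# Both numbers below are assumptions for illustration only.
per_stream_royalty = 0.004   # assumed average payout per stream, in dollars
streams = 2_000_000_000      # "billions of artificial streams"

payout = streams * per_stream_royalty
print(f"${payout:,.0f}")     # 2 billion streams at $0.004 per stream -> $8,000,000
```

At an assumed average of four-tenths of a cent per play, roughly two billion streams would yield about $8 million, consistent with the total prosecutors cited.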
“Although the songs and listeners were fake, the millions of dollars Smith stole was real,” a prosecutor said during court proceedings, underscoring the scale and impact of the fraud. The case forms part of a wider crackdown on streaming manipulation, an issue that has increasingly drawn scrutiny from regulators and industry groups as digital platforms dominate music consumption.
The investigation revealed that Smith used networks of servers and automated scripts to simulate human listening patterns, including varying play times, geographic locations and listening intervals. By dispersing the activity across multiple accounts and regions, the operation avoided triggering immediate detection by platform safeguards designed to identify unusual streaming behaviour.
Industry analysts say the case highlights a growing vulnerability at the intersection of artificial intelligence and digital distribution. Generative AI has lowered the barrier to producing large volumes of audio content, raising concerns about how easily such tools can be misused to exploit royalty systems that were designed for human-created works. Streaming platforms typically pool subscription and advertising revenues and distribute them based on the proportion of total plays, meaning fraudulent streams can dilute earnings for legitimate artists.
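The dilution effect follows directly from the pro-rata pool model described above. A minimal sketch, with entirely hypothetical artist names and figures, shows how injecting fraudulent plays shrinks every legitimate artist's share of a fixed revenue pool:

```python
# Sketch of a pro-rata royalty pool: revenue is split in proportion
# to each catalogue's share of total plays. All figures are hypothetical.
def pro_rata_payouts(pool, plays):
    """Split a revenue pool across catalogues in proportion to play counts."""
    total = sum(plays.values())
    return {name: pool * n / total for name, n in plays.items()}

pool = 1_000_000  # monthly revenue pool in dollars

honest = {"artist_a": 600_000, "artist_b": 400_000}
with_bots = {**honest, "bot_catalogue": 1_000_000}  # fraudulent streams added

print(pro_rata_payouts(pool, honest))     # artist_a receives $600,000
print(pro_rata_payouts(pool, with_bots))  # artist_a's share falls to $300,000
```

Because the pool is fixed, every fraudulent stream is paid for directly out of legitimate artists' earnings: here the bot catalogue's million plays halve both honest artists' payouts.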
Music industry bodies have long warned about “streaming farms” and bot-driven manipulation, but the integration of AI-generated music marks a new phase in the challenge. Unlike traditional fraud schemes that recycle existing tracks, AI allows perpetrators to create entirely new catalogues that are harder to trace and flag as duplicates. This complicates enforcement efforts and places additional pressure on platforms to refine detection technologies.
Executives at major streaming services have acknowledged the risks and have introduced measures such as stricter account verification, machine learning tools to identify anomalous streaming patterns and penalties for distributors found to be facilitating fraudulent activity. However, experts argue that enforcement remains uneven and that the rapid evolution of AI tools continues to outpace safeguards.
Legal experts note that the case against Smith could set an important precedent as courts grapple with how existing fraud and intellectual property laws apply to AI-generated content. While creating music with AI is not inherently illegal, using it to manipulate financial systems or deceive platforms falls squarely within established fraud statutes. The guilty plea signals that authorities are prepared to pursue such cases aggressively.
The broader implications extend beyond music. Similar techniques could be deployed across other digital ecosystems that rely on engagement metrics, including video streaming, advertising networks and social media platforms. Analysts warn that as generative technologies become more accessible, the potential for large-scale automated fraud will expand unless countermeasures keep pace.
For artists and rights holders, the case underscores longstanding concerns about fairness in royalty distribution. Many have argued that current payout models already favour high-volume streaming and can disadvantage independent creators. The injection of artificial streams exacerbates those imbalances, diverting revenue away from genuine performers.
The article AI music fraud scheme nets millions appeared first on Arabian Post.