An AI-generated country song reached No. 1 on Billboard's Country Digital Song Sales chart last fall. The artist was called Breaking Rust. There was a cowboy persona, an Instagram page, and lyrics about perseverance. There was no human behind it. Most listeners had no idea.
That track, "Walk My Walk," came from a project that accumulated over 2 million monthly listeners on Spotify, where it appeared as a verified artist with no biography. Several of its songs crossed 1 million streams. One exceeded 4.5 million. The creator never publicly identified themselves.
This week, Apple Music announced it is doing something about this. On March 4, the company sent a newsletter to industry partners announcing Transparency Tags — a new metadata framework that covers four content categories: Track, Composition, Artwork, and Music Video. Labels and distributors can apply the tags immediately. Apple says they will eventually become mandatory for new content.
THE SCALE OF THE PROBLEM
The Breaking Rust story is not an outlier. It is a preview.
Deezer, which has invested in its own AI detection infrastructure, reported in January that it receives over 60,000 fully AI-generated tracks every single day — up from 10,000 when it first deployed its detection tool in early 2025. Synthetic content now makes up roughly 39% of all music delivered to the platform daily.
More striking: Deezer found that up to 85% of streams on AI-generated music in 2025 were fraudulent — used to game royalty payouts rather than reflect genuine listener demand.
Suno generates a Spotify catalog's worth of music every two weeks. The technical barriers to creating and distributing AI music have effectively collapsed. Tools like Suno and Udio allow anyone to generate a full track from a text prompt, upload it through standard distribution services, and place it in streaming catalogs alongside human-made work — with identical metadata fields and monetization rights.
WHAT APPLE'S TRANSPARENCY TAGS ACTUALLY DO
Apple's system covers four categories:
- Track — the audio itself was AI-generated
- Composition — the lyrics or melody were AI-generated
- Artwork — the cover art was AI-generated
- Music Video — the video was AI-generated
Labels and distributors can apply these tags now. The catch: disclosure is entirely voluntary. If a tag is omitted, nothing is assumed about AI involvement either way. There is no independent verification, no detection layer, and no enforcement mechanism for labels or distributors that choose not to disclose.
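To make the "omitted means undisclosed" point concrete, here is a minimal sketch of how a platform might read such tags from delivery metadata. The field names and values are illustrative assumptions for this article, not Apple's published schema:

```python
# Hypothetical delivery-metadata payload carrying AI-disclosure tags.
# Field names ("transparency_tags", "ai_generated") are assumptions for
# illustration, not Apple's actual format.
CATEGORIES = ("track", "composition", "artwork", "music_video")

def disclosure_status(metadata: dict) -> dict:
    """Return the AI-disclosure status for each content category.

    A missing tag means "undisclosed", not "human-made" -- because the
    tags are voluntary, absence carries no information.
    """
    tags = metadata.get("transparency_tags", {})
    return {c: tags.get(c, "undisclosed") for c in CATEGORIES}

release = {
    "title": "Walk My Walk",
    "transparency_tags": {"track": "ai_generated", "artwork": "ai_generated"},
}
print(disclosure_status(release))
# {'track': 'ai_generated', 'composition': 'undisclosed',
#  'artwork': 'ai_generated', 'music_video': 'undisclosed'}
```

Note the design choice: the default is "undisclosed" rather than "human", which is exactly the gap a voluntary scheme leaves open.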
Apple framed the initiative as "a concrete first step toward the transparency necessary for the industry to establish best practices and policies that work for everyone." But the parties being asked to volunteer disclosure are the same parties who benefit from non-disclosure.
Spotify is taking a similar path. Co-CEO Gustav Söderström said on the company's February 10 earnings call that Spotify should not police creative tools, but acknowledged listener demand: "What we do think is that consumers would like to know and understand what tools were used in the creation of their music."
Deezer is taking a different approach — building its own AI detection infrastructure rather than relying on self-reporting. The challenge is accuracy. Detection tools are improving but remain imperfect, especially as generation models get better at producing outputs that are indistinguishable from human recordings.
THE BLIND RATING ALTERNATIVE
Here is the part the industry keeps missing.
Breaking Rust reached No. 1 not because listeners sought out AI music. It reached No. 1 because the discovery layer does not distinguish based on origin. Listeners encountered the track, it sounded good enough, and it got played. The AI label was never part of the equation.
That is both the problem and the argument for a different approach.
At VoteMyAI, every track is rated blind. No artist name. No follower count. No tool label. No information about whether the track was made by a bedroom producer in Norway or a text prompt in 30 seconds. The music gets judged on what it actually sounds like.
The result: tracks that would never surface through algorithmic or social discovery get rated on their actual merit. Some score well. Some score poorly. The label does not determine the outcome — the music does.
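The mechanism described above can be sketched in a few lines: strip every identity field before a track reaches a rater, then aggregate scores from the ratings alone. This is an illustration of the principle, not VoteMyAI's actual implementation; the field names are assumptions:

```python
import statistics

# Fields a blind rater must never see (illustrative names, not a real schema).
IDENTITY_FIELDS = {"artist", "followers", "tool", "origin"}

def blind(track: dict) -> dict:
    """Return only what a blind rater is shown: the audio, nothing else."""
    return {k: v for k, v in track.items() if k not in IDENTITY_FIELDS}

def score(ratings: list[float]) -> float:
    """Aggregate blind ratings; origin labels never enter the calculation."""
    return round(statistics.mean(ratings), 2)

track = {"audio_url": "https://example.com/a.mp3",
         "artist": "Breaking Rust", "tool": "ai", "followers": 120000}
print(blind(track))           # {'audio_url': 'https://example.com/a.mp3'}
print(score([4.0, 3.5, 4.5])) # 4.0
```

The point of the sketch is structural: because `score` only ever sees ratings, there is no code path through which the origin label could influence the outcome.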
Apple's Transparency Tags solve a disclosure problem. Blind ratings solve a discovery problem. Both matter. But if the goal is finding good music regardless of origin, disclosure alone does not get you there.
The 4,000+ ratings on VoteMyAI already show something the Breaking Rust story confirms: listeners will engage with AI music when they do not know it is AI music. The question is not whether they should. The question is what happens when the best of it finally gets found.
RATE AI MUSIC BLIND
4,000+ ratings from real listeners. No labels, no bias — just the music. See what scores highest when nobody knows the origin.
Explore VoteMyAI →