A PORTLAND RECORD STORE BOOKED AN AI MUSIC EVENT. THE BACKLASH WAS INSTANT. THE STORE CANCELED TWO DAYS BEFORE.

Music Millennium, one of Portland's most respected independent record stores, canceled a CD release party two days before it was supposed to happen. The reason: the album was made with AI. The musician behind it, Brandon Carmody, has been playing in Portland since 1998. 28 years of music. It did not help.

The album is called "AI". Carmody wrote partial lyrics and melodies, which AI software expanded into full songs. All the vocals are AI-generated. He says a Paul McCartney concert last year inspired him to try making arena-sized music for the first time. He called the results astonishing: the software took small fragments of songs and turned them into tracks that sounded like big rock productions.

THE BACKLASH

Music Millennium posted about the event on social media. The response was immediate and negative. People wrote that they were disappointed. They left thumbs-down reactions. They called it hypocritical for an iconic independent record store to promote robot music.

Owner Terry Currier says his own employees came to him and said they did not support hosting an AI artist. He also heard that customers were planning a protest. Currier canceled on April 11, two days before the event. He said: "This is a little too much. I am not looking to incite any trouble."

Currier remembers a similar reaction when synths and drum machines arrived. History repeats itself. But in 2026, the backlash moves faster. A social media post can turn into a boycott threat in 48 hours. The store did not have time to make a case. It had time to make a decision.

THE MUSICIAN

Carmody is not a random prompt user. He has been a musician in Portland since 1998. He played a CD release at Music Millennium in 2013. He has an actual history in the city and in that store. He wrote partial lyrics and melodies himself. AI expanded them. Tools like Suno and AI Song Maker have made this kind of workflow common among independent musicians who want to hear their ideas at full production scale.

Carmody said he understands that Music Millennium has to stand on the right side. But he is disappointed. He expected to be on his way home from the event riding high, with selfies and sold CDs. Instead he is trying to find other record stores willing to carry the album.

He hesitated when asked whether the controversy would turn into good PR. He said he is still processing it. That is probably the most honest answer anyone has given on this topic in months.

THE PATTERN

This is not an isolated incident. It is a pattern. The data tells the same story from multiple angles.

A Gallup survey conducted in early 2026 found that Gen Z excitement about AI fell from 36 percent to 22 percent year over year. Anger rose from 22 percent to 31 percent. Only 3 percent trust fully AI-generated work. The audience is not warming up. It is cooling down.

The same week Carmody lost his event, The Hollywood Reporter published a piece in which Suno CEO Mikey Shulman said producers and songwriters use AI openly. Songwriter Autumn Rowe, who has credits with Jon Batiste and Dua Lipa, confirmed it. Harvey Mason Jr. of the Recording Academy says AI is in every studio and every session. Soundbreak launched in Nashville with licensed AI models of real artists. The professional side of the industry has moved on.

But the public side has not. Deezer receives 60,000 AI tracks per day. 85 percent of streams on those tracks are fraudulent. Spotify removed 75 million spam tracks in the past 12 months. The flood of low-effort AI content has poisoned the well for everyone, including musicians like Carmody who put real creative work into the process.

Carmody got caught in that gap. Professionals use AI behind closed doors. The public rejects it openly. He was honest about AI usage. The album is literally called "AI". That honesty cost him the event.

The question is whether transparency actually works. Apple Music launched AI transparency tags in March. Gallup suggests Gen Z does not trust hidden AI but might tolerate open AI. Carmody was open. It did not help. Voice tools like ElevenLabs and production platforms like Soundverse give independent artists professional-grade output. But professional-grade output means nothing if the audience refuses to listen the moment they see the letters A and I.

WHAT HAPPENS WHEN THE TOOL IS THE PROBLEM

Carmody did everything "right" by the consent model's logic. He was transparent. He used his own creative work as a starting point. He was willing to stand by it publicly. But the store, the employees, and the audience rejected it anyway.

The tool itself has become stigmatized. It does not matter if you used it as a demo assistant or as a full production pipeline. In the public's eyes, AI is AI. A stem separation tool like LALAL.AI can help producers deconstruct and rebuild AI output until it sounds entirely their own. But the label sticks. The moment you say AI, the conversation shifts from sound to ethics.

There is one place where that label does not exist. In a blind listening test, the listener does not know if it is AI or human. The only thing that matters is the sound. Maybe that is where the answer lies. Not in convincing people that AI music is acceptable, but in letting the music speak for itself without the label.

At VoteMyAI, tracks are rated blind. No artist name, no tool name, no marketing copy. Just the music and the listener. When trust in AI work is sitting at 3 percent, the only number that still means anything is what happens when a song has to earn its listeners on sound alone. For the full set of creator tools, see the resources page.