Picture this: the voice you've spent years honing suddenly appears in a chart-topping hit the world assumes is yours, except it isn't. It's an AI-generated clone, and you're not getting a dime. That's the situation British singer Jorja Smith found herself in, thanks to a viral TikTok hit called 'I Run' by the dance act Haven. And it's just the tip of the iceberg in the wild world of AI in music. Here's where it gets controversial: should creators like Smith have a say in how algorithms reshape their art? Let's unpack the story step by step, so even if you're new to the music industry you can follow along.
In October, 'I Run' exploded on TikTok, climbing to number 11 on Spotify's US chart and number 25 globally, and it seemed poised to enter the official UK and US charts too. Then streaming platforms, including Spotify, pulled it after takedown notices from Smith's label, Famm. Why? The Recording Industry Association of America (RIAA) and the International Federation of the Phonographic Industry (IFPI) argued it infringed copyright by mimicking Smith's voice and misleading fans into believing it was an unreleased track of hers. For beginners: copyright infringement means using someone else's creative work without permission, like borrowing a melody or a voice without asking, and in this case it also involved deceiving the public about who was really singing.
Haven's social media buzzed with posts hashtagged #jorjasmith, and a video purportedly showed rapper Offset from Migos spinning the track at a New York Boiler Room event, fueling rumors that it was Smith's secret new song. Spoiler: that footage was fabricated. The female vocals on the original version weren't credited, and it turns out they came from the band's Harrison Walker, transformed through Suno's generative AI tool. In his own words to Billboard, Walker admitted: 'It shouldn’t be any secret that I used AI-assisted processing to transform solely my voice for I Run. As a songwriter and producer, I enjoy using new tools, techniques, and staying on the cutting edge of what’s happening.' Think of it as taking a plain voice recording and running it through software that reshapes it into a famous-sounding tone, much as a beginner might use a phone app's filters to sound polished.
This isn't the first AI music drama. Big players Sony Music, Warner Music Group, and Universal Music Group have all sued Suno for allegedly training its AI models on copyrighted songs without consent. Suno defended itself by claiming fair use, which in copyright law means using parts of protected works for purposes like commentary or education without needing permission; it's often a gray area, and courts decide case by case. Warner eventually settled with Suno (as detailed in The Guardian's report from November 26, 2025), while Sony and Universal are still battling it out in court. And this is the part most people miss: these lawsuits highlight how AI companies scrape vast amounts of music data to build their tools, potentially devaluing the work of human creators if left unregulated.
Famm, Smith's label, shared a statement on Instagram that revealed even more drama. Once 'I Run' blew up, Haven approached Smith about collaborating on a remix, apparently to 'legitimize' the track since fans believed it featured her vocals. But Famm suspected AI was involved all along, and they turned down what could have been a quick backroom payout. 'We could have cut a cheque and gotten paid, but we ignored the request,' they said. The Guardian reached out to Haven for comment, but as of now, their side of the story remains unheard.
Not one to give up, Haven rerecorded 'I Run' with real vocals by Kaitlin Aragon, and that version climbed to number 37 on the UK charts last week. Yet Famm insists the melody itself was crafted using AI trained on Smith's discography. In another Instagram post, they argued: 'Haven and his team have now replaced the AI vocal with a real human vocal, although we still believe both versions of the track infringe on Jorja’s rights and unfairly take advantage of the work of all the songwriters with whom she collaborates.' If Famm wins a share of the royalties, it plans to split the money proportionally among Smith's co-writers, based on their contributions to her catalog, because if AI helped create 'I Run,' it drew from her entire body of work. Think of it like this: if someone uses your old photos to train an AI app that generates new images, should you get credit and cash for every output?
The song lingered on some platforms thanks to four distributors that reportedly bypassed the standard takedown process. Spotify, for its part, spotted the impersonation, removed the track, and withheld any payments, while Billboard reserved the right to exclude songs in active copyright disputes from its charts. Famm blasted Haven and their allies for using 'public confusion as a key part of the marketing strategy,' arguing they should have clarified that Smith's vocals weren't involved but instead let the mystery simmer. Then, when doubts surfaced about whether the track was AI-made, they fueled more chaos: 'Is this an AI track? Are these AI vocals? Again, rather than clear up the confusion immediately, they allowed the storm to brew.'

There's another layer of intrigue: a longtime female musician also named Haven, who has been making music for years, was dragged into the mess by online trolls accusing her of being fake because of the name overlap. She posted an Instagram video titled 'Story time of how I got caught up in this AI mess' and even released her own 'Human Haven' cover of 'I Run.' Will the 1990s Cornish band Haven chime in? Only time will tell.
Famm emphasized that this isn't just about Smith; it's a broader issue. 'This isn’t about Jorja. It’s bigger than one artist or one song.' They urge clear labeling of AI-generated music so consumers can decide what to listen to, and they call for credit and compensation for artists whose work trains AI models. They describe creators as 'collateral damage in the race by governments and corporations towards AI dominance,' pointing to 'I Run' as proof that regulation is needed now, before it's too late. To put this in perspective, recall mid-November 2025, when three AI-crafted tracks dominated viral charts: 'Walk My Walk' and 'Livin’ on Borrowed Time' by Breaking Rust topped Spotify's US Viral 50, while 'We Say No, No, No to an Asylum Seekers’ Centre,' an anti-migrant song by the Dutch act JW 'Broken Veterans,' hit the global viral chart. Back in July, 'The Velvet Sundown' racked up over a million Spotify plays before being exposed as fully AI-made, as reported in The Guardian.
So, what do you think? Should AI music always be transparently labeled, or is it fair game in the name of innovation? Do artists deserve royalties when their voices are cloned, even indirectly? And here's a controversial twist: some argue that AI democratizes music creation, letting everyday people experiment without the barriers of fame or skill, but at what cost to original artists? Share your opinions in the comments below. Do you side with labels pushing for guardrails, or do you see this as an exciting evolution in music?