Imagine a fiery collab between Drake, Jay-Z, and Eminem, orchestrated not by record labels and agents but by artificial intelligence. We're already there. AI can now mimic an artist's sound and style, raising both amazement and eyebrows. A few AI-generated songs have already gotten attention for their uncanny ability to replicate the sound and style of Jay-Z and Drake (maybe not so much for Eminem).

No surprise, copyright takedown notices have already been delivered. But copyright law—which requires that a human creator contribute a degree of creativity in order to be considered an author and to establish a copyright in a work—makes the infringement claims a bit complicated. The rappers didn't compose, perform, or contribute any creativity to the AI songs, and since they lent no creative input, they can't be considered authors of the works entitled to have them taken down. Under current law, the works aren't even derivative. The only thing the artists could be said to contribute to the songs is their voices (and maybe their styles). But courts have held that the voices within a copyrighted work are not copyrightable material. And outside of a specific work, there is no copyright in one's voice.

(Further complicating things, it's not clear these works have any author at all. AI can't hold authorship; only humans can. So it's not clear that the person who generated the songs could bring an infringement claim against subsequent ripoffs.)

So the record labels' claims of copyright infringement may be a reach here, considering that the rappers themselves, not being recognized as authors, seemingly hold no rights in the works. For now. But courts may need to adapt to address the new technology. Sam Altman believes that when users ask a model for art inspired by a specific artist, it should be viewed as a derivative work, meaning the original artist should get a slice of the pie. The record labels also argue that AI companies violate copyright by using the artists' songs as training data. Several lawsuits have been filed to address these issues, but predicting the verdicts is akin to reading tea leaves at this point.

While copyright law might not help artists (yet), state privacy laws probably will. Specifically, the right to publicity, which protects an individual's identity, including their voice, from unauthorized commercial use. Essentially, the right to publicity prevents others from commercially exploiting your name, likeness, or other recognizable aspects of your identity without your consent.

Consider this: securing a Jay-Z verse could launch a new artist’s career (he is the GOAT, after all). [ed: ehhhh] But what if AI eliminates the need for his involvement? If a young rapper used AI to debut with a Jay-Z guest verse, would that infringe on Jay-Z’s right to publicity?

Thanks to Bette Midler, we may have an answer. The award-winning singer, actress, and comedian has been a prominent figure in the entertainment industry since the 1970s, which is why the Ford Motor Company wanted her recognizable voice in its commercials. But apparently Bette's a Chevy girl, because she turned them down. Not to be deterred, Ford hired a sound-alike singer to imitate her in a commercial without her permission. Midler sued, asserting that the unauthorized commercial use of her voice violated her right to publicity. The court agreed (Midler v. Ford Motor Co., 1988), marking a pivotal point in recognizing a voice as a crucial part of identity. This case may hold significant implications for AI-mimicked voices: there doesn't appear to be a relevant distinction between a sound-alike person and a sound-alike AI.

So, does AI-generated content infringe on the right to publicity? If done without consent for commercial benefit, the answer seems to be yes. Especially if the content attributes false or misleading statements to the artist or places them in an offensive light. While technological advances often outpace legal systems—a theme you'll see often here—the increasing sophistication of AI-generated content is likely to further challenge the boundaries of intellectual property law. The takeaway: don't take someone's voice without permission, or you might end up paying in Dead Presidents.