Taylor Swift has been at the center of conflicts over AI impersonation for years, and now she has become the latest celebrity to try to protect herself from AI copycats. As usual, however, the legal system intersects with technology in complicated ways, and Swift's efforts may only go so far.
In applications filed last week, Swift's team sought trademark protection for two phrases spoken by the singer: "Hey, it's Taylor Swift" and "Hey, it's Taylor." The filings, submitted by TAS Rights Management on Swift's behalf, include audio of Swift saying the phrases as part of the promotion for her latest album. "Hey, it's Taylor Swift, and you can listen to my new album The Life of a Showgirl on demand on Amazon Music Unlimited," Swift says in one of the recordings. TAS Rights Management filed a trademark application for an image of Swift as well, which shows the singer on stage "holding a pink guitar, with a black belt and wearing a multi-colored dress with silver shoes."
While the Swift team has not said that the trademarks are intended to protect against AI abuse, it seems likely given Swift's history with AI. Not only has the star's voice been mimicked in AI-generated music, but Swift has also been the target of sexually explicit AI deepfakes.
Artists have long used copyright law to protect their music, but the rise of AI-generated music has made protecting their work and likeness much harder. That's because copyright only protects an artist's music, not their voice. Legal teams have had to get a little creative: when Universal Music Group (UMG) filed to take down an AI-generated Drake song, it cited producer Metro Boomin's trademarked tag, which plays at the beginning of the track.
IP attorney Josh Gerben says trademarks can help fill the gap created by AI-generated imitations. Rather than only targeting exact copies of her voice, Swift "can challenge not just identical uses, but also 'confusingly similar'" ones, said Gerben. A trademarked image of Swift, similarly, could be used to take action against similar AI-generated images. Earlier this year, Matthew McConaughey filed similar trademarks covering his voice, including the phrase "Alright, alright, alright," to protect against AI abuse.
But Alexandra Roberts, a professor of law and journalism at Northeastern University, says she "suspects" that the audio specimen submitted by the Swift team "indicates its use as a sound mark, rather than audio that's included as part of a longer message":
Often when it comes to audio we can think of something like NBC's chimes or MGM's lion roar that plays at the beginning of every show or movie, respectively…
Swift's trademarks could serve as another legal weapon in her arsenal against AI-generated copycats, even if they might not hold up in court. Xiyin Tang, a law professor at the University of California, Los Angeles, says trademarks can help "deter infringers by pointing them to the registration number and registration certificate in the hope that this will convince them to stop, not necessarily because the registration would hold up in court."
There are other steps the Swift team can already take, including under the right of publicity laws established in several states, which allow individuals to take action against misuse of their name or likeness. Artists can address false advertising and endorsements through federal law, too. "Swift also has a lot of trademarks for her name, so she could sue for trademark infringement if someone uses her name in a way that creates confusion," says Roberts.
So far, only Tennessee has passed a law that specifically covers AI-generated copycats of artists' voices. And though YouTube has a likeness detection tool, which gives celebrities, politicians, journalists, and creators the ability to take down AI-generated imitations of themselves, it only works for copies of their faces right now. In the absence of a broader framework for AI soundalikes, artists like Swift can only hope that trademark law will help protect against emerging AI that imitates not just their faces, but their voices.