AI and its Need for Regulation in the Music Industry

Benjamin Madoff

30 December 2024

Artificial Intelligence (AI) has been associated with the future of technology and, more generally, the future of humanity. This, however, has led to great uncertainty about the impact the influx of AI will have, considering it is a mostly unknown variable that is evolving every day. The technology is infiltrating every facet of future development, from medicine to transportation to cell phones. It was only a matter of time before AI began to assimilate into entertainment, and one of the industries most challenged is music. AI-generated music is rampant on social media platforms such as TikTok, Instagram, and X. The problem arises when AI causes conflict in many aspects of society relating to media, such as copyright law, moral arguments regarding art, and people’s careers in the music industry. Therefore, AI needs to be restricted in the production of music until a legal framework has been established for its regulation, because AI confuses copyright and ownership systems, raises ethical conflicts with musical artists, and has the potential to destroy people’s jobs despite performing poorly at the tasks that would displace human labor.

First, AI in the production of music needs to be restrained until a policy system tailored to AI is established, because AI’s ability to confuse copyright and ownership systems results in theft and loss of income. Traditionally, copyright law grants the lyricists, musicians, and composers the rights to the music. Nevertheless, when AI generates its own music or contributes to the creative process, it is hard to discern who owns the rights to the final product. Music labels have argued that they own the rights because they hold the rights to the use of the AI through contractual agreements or other methods of claiming ownership of the work the AI creates. As things stand, there will always be a gray area around the use of AI, both because of how new the technology is and because so many people lack a comprehensive understanding of it. Even with an understanding of AI, the policy surrounding it is either outdated or does not exist yet.

Because the proper policies have not been put in place, there is ample opportunity for stealing and copying people’s music. The harm of stealing someone else’s music cannot be overstated. Artists put blood, sweat, and tears into bringing new music into the world, not only to express themselves artistically but also to support themselves financially. Stealing an artist’s music robs them of the fulfillment they might otherwise have received while also introducing an unnecessary struggle into their day-to-day lives. Regarding this gray area, there is currently a large-scale lawsuit being brought against certain AI-music companies. The Recording Industry Association of America (RIAA), which includes major music labels like Sony Music Entertainment, Warner Records Inc., and Universal Music Group (UMG Recordings), is suing Suno and Udio, two companies that develop music-making AI. The RIAA alleges that these companies used copyrighted music to train their AI models without authorization. The companies did not deny these allegations; they deflected by asserting “fair use” (the doctrine under U.S. copyright law that permits using limited portions of a work for certain purposes) [1] and claiming that their methods are “trade secrets,” defenses typically raised when copyright infringement has occurred. Plenty of evidence against the companies was presented, such as when the prompt “pop punk American alternative rock California 2004 rob Cavallo” entered into Udio produced a file that heavily resembled Green Day’s “American Idiot” [2]. If the infringement is found to be deliberate, the RIAA is seeking up to $150,000 in statutory damages per song infringed. It is alleged that Suno copied 662 songs and Udio copied 1,670 songs [3], which works out to roughly $99,300,000 in damages for Suno and about $250,500,000 for Udio. It is incredibly easy for companies, let alone individuals, to get away with stealing others’ work to incorporate into their own AI technology. The RIAA suit is a much higher-profile case, but far from the only one, since these infringements occur daily.
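The damages figures above follow directly from multiplying the statutory maximum by the number of songs each complaint lists. Below is a minimal sketch of that arithmetic; the $150,000 per-song cap and the song counts come from the filings cited above, while the code itself is purely illustrative and not part of any court document.

```python
# Illustrative sketch of the statutory-damages arithmetic described above.
# $150,000 is the statutory maximum per work for willful infringement;
# an actual award would be set by a court and could be much lower.

STATUTORY_MAX_PER_SONG = 150_000  # USD

alleged_songs_copied = {
    "Suno": 662,
    "Udio": 1_670,
}

for company, song_count in alleged_songs_copied.items():
    max_exposure = song_count * STATUTORY_MAX_PER_SONG
    print(f"{company}: {song_count:,} songs x ${STATUTORY_MAX_PER_SONG:,} = ${max_exposure:,}")

# Prints:
# Suno: 662 songs x $150,000 = $99,300,000
# Udio: 1,670 songs x $150,000 = $250,500,000
```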

Currently, a bill moving through Congress may help remedy the ownership issue, but it is still in its early stages. At the congressional level, the NO FAKES Act of 2024 (S. 4875) was read twice in the Senate on July 31, 2024 and referred to the Senate Judiciary Committee. The bill would protect the “intellectual property rights in the voice and visual likeness of individuals, and for other purposes” [4]. It covers sound recordings of music, podcasts, images, and any other kind of digital production in which a person’s voice is used. The NO FAKES Act is sponsored by Delaware Senator Christopher Coons and cosponsored by three other senators: Marsha Blackburn of Tennessee, Amy Klobuchar of Minnesota, and Thomas Tillis of North Carolina. Enacting the bill would protect the names and likenesses of individuals from generative AI. There have been numerous instances of people using AI-replicated voices of popular artists such as Drake and The Weeknd on Spotify, YouTube, and other platforms to profit from the streams and views without the artists’ consent [5]. Coons states that this bill will work toward preventing this sort of problem and will hold those who exploit the technology liable for the confusion and damages they cause. Given this egregious level of piracy, music AI needs to be restricted until it becomes easier to detect when music has been stolen to train AI. Such regulation is necessary to make it harder to steal music for AI, both so that artists receive credit and so that the music industry remains a safer and fairer environment.

Next, music AI technology needs to be more heavily monitored until an adequate system has been put in place, because AI conflicts ethically with many musicians. Musicians, as well as other content creators, believe that AI could one day take over the entertainment industry. Not necessarily in the manner of Terminator and Skynet, but in a way that renders human musicians obsolete through the efficiency and accessibility AI possesses. Currently, these songs are not particularly well made relative to human songs, but it is certainly a future to consider given how quickly AI is evolving. This scenario also raises important ethical concerns, such as the fact that AI lacks the emotion and sincerity that humans draw upon in production. Part of understanding and absorbing music is understanding not only the story the song crafts, but also the meaning behind the artist and their response to specific circumstances. Many, if not all, songs cover genuine feelings or struggles humans undergo in regard to themselves, others, or society at large. Anyone with malicious intentions could fake the audio of any popular figure, whether to make them say something incredibly offensive or to create fake songs and profit off them. Some might even object to precisely whose voices are used and by whom. For example, some were upset over Paul McCartney using AI to include John Lennon’s voice in a Beatles track and believed such an action was insensitive [6].

The problem is that even though AI music-production technology cannot fully replicate or express human emotion, it is incredibly difficult for most people to differentiate AI music from real music. A study conducted by AI music generator company Amper found that users “‘could tell no discernible difference to either, signaling a major shift for the future of music-making and content creation….’” [7] While it may be possible for those with astute musical knowledge to differentiate what is real from what is fake, most people hear music as simply music, whether it is well or poorly received. There is a difference between AI music and most human-crafted music, but most people do not know what to look for, which makes the lack of differentiation incredibly problematic. Real musicians’ works are being mixed in with AI music, and in some instances AI music may be preferred to that of real musicians who put extensive time and effort into their pieces. AI is poised to infiltrate the music industry and make it more commercialized, removing the originality and heart creators pump into the veins of the industry. Given that AI does not bring substance to the field while also undermining the integrity of many artists who simply want to make their own music, AI should be watched intensely to ensure musicians can still provide uniqueness and the human element to music as a whole. For example, the U.S. Federal Trade Commission’s (FTC) Office of Technology and the Consumer Financial Protection Bureau have begun work on learning and understanding algorithmic systems in order to better implement regulations, particularly when policymakers pass legislation. Most of these efforts work toward making data and algorithms more transparent for businesses and the public, which would effectively strip away the obscurity and veil of the unknown that AI wields [8]. Making the workings of AI more transparent can promote a separation between real artists and technology, allowing people to value the works of artists more. Also, if AI becomes much more transparent and regulated, AI stealing music will become far more noticeable, preventing a great deal of music theft and returning passion and creativity to the medium.

Lastly, AI needs to be regulated and restricted because it has the potential to take people’s jobs while also not performing them well. AI in general has been thought of as taking over people’s jobs, whether in software engineering, accounting, marketing, or practically any field. While it may be reassuring to think this is not the case, fourteen percent of workers have already experienced job displacement due to AI [9]. AI has taken over the role of people who analyze data to identify target audiences, create marketing campaigns, target ads and promotions, and more. While in an ideal world this would make the process of promoting music more efficient, the reality is that AI has a very real potential to misinterpret and misorganize data that a human would otherwise handle accurately. This may result in inaccurate marketing, which could cost artists thousands of potential listeners. For example, once the AI begins marketing toward the wrong demographic, it would be incredibly difficult to change course without scrapping the advertisements already produced. Also, given the lack of a human element in AI, much of the marketing may come off as generic or, even worse, appear as if little to no effort was put into convincing people to listen to the music (which in this case would be true). The real reason AI has become such a threat is that people view it as a cheaper, more efficient alternative to humans without considering how creatively impotent AI is. As music artists and their marketing teams use AI more and more, musicians and advertisers are being pushed out of their own industry. The displacement of musicians and advertisers has clearly dangerous ramifications for artists and listeners alike. Regular listeners may stray from their favorite artists if those artists transition to using more AI technology. New listeners may also steer clear if the marketing appears AI-generated. AI is not currently evolved enough to fully complete complex technical tasks, but business executives and those in charge of creating media still rely on AI over real people because they do not know any better. Thus, real jobs are taken away and replaced by entities that cannot fulfill those roles.

The impact AI could have on the music market cannot be overstated, and it needs to be controlled before it gets out of hand and causes too many irreversible issues in the economy. Streaming services such as Spotify have released their own AI-generated playlists, and since the computers that generate these playlists do not require royalties, they may be pushed much harder than real artists who are trying to produce music to make ends meet [10]. This can lead to a future in which companies no longer pay real artists for their music and simply rely on AI to generate all of their revenue. While this may seem like a distant possibility, it is a possibility nonetheless and needs to be considered moving forward. There are several more bills regarding AI in Congress, but they, too, are moving slowly.

The most important bills currently in Congress are the National AI Commission Act and the AI Disclosure Act of 2023. The National AI Commission Act would create a commission solely focused on AI, which would conduct its work and craft regulations to mitigate the risks and possible harms of AI. Such oversight is exactly what the future of AI desperately needs to maintain order and safety. Keeping all eyes on the technology and responding appropriately to any changes it may bring is the policy that needs to be enforced moving forward. The bill has ten cosponsors and was referred to the House but has seen no motion since June 2023 [11]. Similarly, the AI Disclosure Act of 2023 would “require generative artificial intelligence to disclose that their output has been generated by artificial intelligence, and for other purposes.” It has one cosponsor and was last moved to the House in early June 2023 [12]. The AI Disclosure Act advocates for and would enable AI transparency, which would work hand in hand with the efforts of the U.S. agencies mentioned previously to better understand and analyze these algorithms and the impact this technology will have on the global landscape. In conclusion, these two bills, the National AI Commission Act and the AI Disclosure Act, would give artists more breathing room to navigate AI as a tool without the risk of being robbed or taken advantage of. They would also allow legislators to become more informed about AI and propose bills accordingly. The problem, though, is that these bills are not seeing any action and have essentially been left to gather dust in the corner.

In conclusion, the introduction of AI into the music industry marks a paradigm shift that society simply is not ready for. AI has been spreading throughout society and has become part of daily life for many students, office workers, and even artists of all types. However, AI needs to be regulated and restricted until Congress can establish a legal framework around it, because of the various legal, moral, and employment problems that many have with AI as a tool. AI is essentially the Wild West in terms of both regulation and potential; right now the future of technology and humanity seems unknown, yet it rests in AI’s hands. Until bills can successfully make their way through Congress and impact the economy, and until society finally accepts AI into more aspects of life, AI cannot be left to its own devices, because it might destroy the future by trying to reach it too quickly. AI is like a toddler running around a playroom full of stacked cards, and it needs to learn how to play nicely before it grows up.



Works Cited

[1] U.S. Copyright Office. “Fair Use (FAQ).” U.S. Copyright Office. Accessed November 18, 2024. https://www.copyright.gov/help/faq/faq-fairuse.html#:~:text=Under%20the%20fair%20use%20doctrine,news%20reporting%2C%20and%20scholarly%20reports

[2] Dhameliya, Savan. “Record Labels Sue AI Platforms Making Music: Udio v. Universal, Sony, Warner.” IPRMENTLAW, July 13, 2024. https://iprmentlaw.com/2024/07/13/record-labels-sue-ai-platforms-making-music-udio-v-universal-sony-warner/

[3] Naik Naik & Co. “AI & the Music Industry: Copyright Infringement & Creative Overlaps.” Naik Naik, July 3, 2024. https://naiknaik.com/2024/07/03/ai-the-music-industry-copyright-infringement-creative-overlaps/.

[4] “S.4875 – 118th Congress (2023–2024): NO FAKES Act of 2024.” Congress.gov, Library of Congress. Accessed November 8, 2024. https://www.congress.gov/bill/118th-congress/senate-bill/4875

[5] “NO FAKES Act One-Pager.” Office of U.S. Senator Christopher Coons of Delaware, July 31, 2024. https://www.coons.senate.gov/download/no-fakes-act-one-pager

[6] Oğul, Sertaç. “In Tune with Ethics: Responsible Artificial Intelligence and the Music Industry.” OECD.AI. Accessed November 7, 2024. https://oecd.ai/en/wonk/ethics-music-industry

[7] Forde, Eamonn. “Can Humans Tell the Difference between AI Music and Music Made by People?” Music Ally, May 11, 2023. https://musically.com/2019/05/30/can-humans-tell-the-difference-between-ai-music-and-music-made-by-people/

[8] Engler, Alex. “The AI Regulatory Toolbox: How Governments Can Discover Algorithmic Harms.” Brookings, October 9, 2023. https://www.brookings.edu/articles/the-ai-regulatory-toolbox-how-governments-can-discover-algorithmic-harms/

[9] Dahlin, Eric. “Are Robots Really Stealing Our Jobs? Perception versus Experience.” Socius: Sociological Research for a Dynamic World 8 (January 2022). https://doi.org/10.1177/23780231221131377

[10] Sefa, Flavio. Creative Destruction and the Music Industry, 2020. https://uia.brage.unit.no/uia-xmlui/bitstream/handle/11250/2682488/Flavio%20Sefa.pdf?sequence=1

[11] “H.R.4223 – National AI Commission Act.” Congress.gov. Accessed November 8, 2024. https://www.congress.gov/bill/118th-congress/house-bill/4223/text?s=3&r=18&q=%7B%22search%22%3A%22Artificial+Intelligence%22%7D

[12] “Text – H.R.3831 – 118th Congress (2023–2024): AI Disclosure Act of 2023.” Congress.gov, Library of Congress. Accessed November 8, 2024. https://www.congress.gov/bill/118th-congress/house-bill/3831/text.
