Here is a prediction for 2025: there will be even more lawsuits over the unlicensed use of content to train AI models.
In the music industry, we have already seen lawsuits filed by the major recorded music companies and publishers against companies including AI music generation startups Suno and Udio, as well as AI giants Anthropic and OpenAI.
In the broader world of media and entertainment, OpenAI has also been pursued by The New York Times and by authors including Sarah Silverman. Getty Images, for its part, is suing Stability AI for allegedly using its photos without permission, while Dow Jones and the New York Post are suing Perplexity for allegedly copying their work without a license.
The type of content at issue in all of these lawsuits may be different, but the allegations are largely similar: AI companies are using copyrighted material to train their systems without authorization.
A key argument made by some AI companies and their backers in response to allegations of infringement is that training AI using copyrighted content available on the Internet is a “fair use” under copyright law.
In a scathing op-ed published by Fortune this week, Craig Peters, CEO of photography agency Getty Images, rebutted that argument, instead advocating a nuanced approach to assessing fair use – one that supports AI’s potential for societal good without harming creative industries, including music.
As Peters notes in the column for Fortune, Getty employs more than 1,700 people and represents the work of more than 600,000 journalists and creators around the world.
The company generated $240.5 million in revenue in the third quarter (the three months through the end of September), up 4.9% year-over-year, and expects its fiscal 2024 revenue to be between $934 million and $943 million.
“Copyright is at the very heart of our business and the livelihood of those we employ and represent.”
Craig Peters, Getty, in an op-ed for Fortune
“Copyright is at the very heart of our business and the livelihood of those we employ and represent,” Peters writes.
He says he “vigorously disagrees with the radical position expounded” by figures like Mustafa Suleyman, CEO of Microsoft AI, who, Peters writes, has made remarks in the past which suggest that there is “no copyright protection for online content”.
Peters adds: “This disagreement highlights why we are taking legal action against Stability AI in the US and UK. We have not granted Stability AI permission to use millions of images owned and/or represented by Getty Images to train its AI model, which was made commercially available starting in August 2022.”
Peters notes that “as litigation slowly progresses, AI companies are making the argument that there will be no AI without the ability to freely scrape content for training, resulting in our inability to leverage the promise of AI to solve cancer, mitigate global climate change, and eradicate world hunger.”
He adds: “Note that companies investing in and building AI spend billions of dollars on talent, GPUs, and the power needed to train and run these models – but remarkably claim that compensation for content owners is an insurmountable challenge.”
Peters also argues that “fair use” “should be applied on a case-by-case basis” and that AI should not be seen as “a monolithic case”, but should be treated as “a broad range of models, capabilities and potential applications.”
He asks in the op-ed: “Does curing cancer impact the value of Kevin Bacon’s performances? Clearly no. Does solving climate change impact the value of Billie Eilish’s music? Clearly no.
“Does solving world hunger impact the value of Stephen King’s writings? Again, clearly no. Not only does this not undermine the value of their work, but they would probably never challenge such use if it could further those purposes, even if that use were commercial in nature. As CEO of Getty Images, I can say that we would never debate or challenge these applications, and would wholeheartedly welcome any support we can offer them.”
Peters further argues that “content generation models” that generate “music, photos, and videos based on text or other input” and that have been “trained on artists’ content without their authorization” do not have “the potential to raise the level of our societal outcomes.”
He says this use of AI is “pure theft from one group for the benefit of another.”
“Let’s stop the rhetoric that all unauthorized AI training is legal and that any demands to respect the rights of creators come at the expense of AI as a technology.”
Craig Peters, Getty
The Getty CEO also draws parallels between the AI industry’s potential evolution toward licensed players and the rise and fall of Napster and other illegal download services, which gave way to licensed streaming platforms like Spotify.
“Just as licensed services from Spotify and Apple Music evolved from the original infringing Napster, there are AI models developed with permission, and with business models that reward creators for their contributions,” writes Peters in his op-ed for Fortune.
He adds: “Like Apple Music and Spotify, they will cost a little more, but they can thrive and be widely adopted if we create a level playing field by holding accountable the companies that choose to ‘move fast and break things’ – in this case, violating established copyright law.”
Peters’ article ends with the argument that “there is a fair path that rewards creativity and delivers on the promise of AI.”
He adds: “Let’s stop the rhetoric that all unauthorized AI training is legal and that any demand to respect the rights of creators comes at the expense of AI as a technology.”
Music Business Worldwide