
Artificial Intelligence (AI) continues to make headlines. Most recently, a Chinese company rocked the investment world when it introduced DeepSeek, triggering a panic (some pundits called it a “Sputnik moment”). Nvidia, a U.S. computing infrastructure company, lost an astounding $593 billion of its market value (a record one-day loss for any company on Wall Street) after the DeepSeek news broke.
Jon Stewart, host of “The Daily Show”, quipped: is anyone excited “that AI had its job replaced by AI?”[1] There is a certain amount of irony here.
There are so many elements to the AI discussion, but we’re going to focus on AI art: software that promises to “transform your artistic concepts into reality”.[2]
So how does that work? AI art software takes original images that have been scraped from the internet and uses them to train AI models that generate an image from a text prompt. The software runs the downloaded images through an image classifier to create a set of labels, then feeds those image-label pairs into a training pipeline to produce a text-to-image model.
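The pipeline described above can be sketched in miniature. This is a toy illustration only, assuming stand-in pieces throughout: the “classifier” is a one-line heuristic and the “training” step just indexes images by label, whereas real systems use large neural networks. All function names here are hypothetical.

```python
# Toy sketch of the scrape -> classify/label -> train pipeline described above.
# Everything here is a stand-in: real AI art systems use neural networks,
# not dictionaries and averages.

def classify(image_pixels):
    """Stand-in image classifier: labels a tiny 'image' by its brightness."""
    avg = sum(image_pixels) / len(image_pixels)
    return "bright" if avg > 127 else "dark"

def build_dataset(scraped_images):
    """Pair each scraped image with the label the classifier assigns it."""
    return [(img, classify(img)) for img in scraped_images]

def train_text_to_image(dataset):
    """Stand-in 'training': remember one example image per label so that a
    text prompt (the label) can later be mapped back to an image."""
    model = {}
    for img, label in dataset:
        model.setdefault(label, img)
    return model

scraped = [[200, 220, 210], [10, 30, 20]]   # two tiny three-pixel 'images'
model = train_text_to_image(build_dataset(scraped))
print(model["bright"])  # -> [200, 220, 210]
```

The key point the sketch captures is that the labels, and therefore the model, are only as good as the scraped images they were built from.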
That all sounds harmless, except some artists began noticing AI knockoffs or copies of their work. Is that stealing, or simply imitation? Either way, the person who originally created the art isn’t getting paid, and that is causing some to raise questions about the legality of all of this.
Let’s pause for a moment and jump into Mr. Peabody’s time machine[3] to travel back to 1999 and fondly look back at Napster, peer-to-peer music-sharing software that used a centralized database listing every song being shared by connected users. While extremely popular (the service boasted around 80 million registered users at its height), it could not function without the Napster central database.
Musicians, as you might imagine, were not happy when people began sharing their songs via the platform while the artists got nothing. Napster ran into legal difficulties over copyright infringement and ceased operations in 2001 after losing multiple lawsuits. It eventually filed for bankruptcy in June 2002.[4]
The same issue is now playing out with AI art, but on a much grander scale. Computers are scouring the internet for images to add to their databases, ready for someone else to use in creating AI art. It is estimated that millions, perhaps billions, of images are being saved.
There are ways for artists to protect their work. Ben Zhao, professor of Computer Science at the University of Chicago, recently explained on the podcast “Freakonomics”[5] how he developed a tool called Nightshade that “poisons” an image with incorrect data. Nightshade sprinkles a few invisible pixels of poison over the original work so that the AI model sees something entirely different, corrupting the training data and rendering the images unusable.
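The idea behind that kind of poisoning can be illustrated with a toy example. This is not the actual Nightshade algorithm (which targets neural networks), just a sketch of the principle: changes too small for a human to notice can push an automated classifier to a completely different answer. The `classify` and `poison` functions are hypothetical stand-ins.

```python
# Illustrative sketch of pixel "poisoning" in the spirit of Nightshade,
# NOT the real algorithm: a tiny, near-invisible nudge to each pixel
# flips what a simplistic brightness classifier sees.

def classify(image_pixels):
    """Stand-in classifier: labels a tiny 'image' by its average brightness."""
    avg = sum(image_pixels) / len(image_pixels)
    return "bright" if avg > 127 else "dark"

def poison(image_pixels, nudge=3):
    """Add a small offset to every pixel, clamped to the valid 0-255 range."""
    return [min(255, p + nudge) for p in image_pixels]

original = [126, 127, 126]        # the classifier calls this "dark"
poisoned = poison(original)       # [129, 130, 129] -- visually near-identical
print(classify(original), classify(poisoned))  # -> dark bright
```

A human looking at the two images would see essentially the same thing; the model sees two different things, which is exactly what makes the scraped copy useless for training.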
A recent ruling by the U.S. Copyright Office determined that most AI art is not protected, because copyright law is primarily intended to protect the work of human creators, not computers.[6] This means that if someone creates an image or artwork using AI, anyone else can copy and paste it with no threat of legal action. That is of little solace to the original artist whose work is taken/scraped/stolen and used for profit by someone else.
Big tech is spending big money to develop AI software, but the law is slowly starting to catch up. OpenAI and Microsoft are being sued by The New York Times, which argues that millions of copyrighted works from news organizations were used without consent or payment. Other publishers, such as the Associated Press, News Corp. and Vox Media, have reached content-sharing deals with OpenAI.
It’s not a new phenomenon for technology to race ahead of regulation, so how do we make sense of it all? Simple: follow the lawyers and the money.
[1] https://www.youtube.com/thedailyshow
[3] https://kids.kiddle.co/Wayback_Machine_(Peabody%27s_Improbable_History)
[4] https://en.wikipedia.org/wiki/Napster
[5] https://freakonomics.com/podcast/how-to-poison-an-a-i-machine/