CHARLOTTE, N.C. — Charlotte artist Elliana Esquivel had suspected for months that her art had been stolen. In October, she learned it had been scraped — and by artificial intelligence.
The illustrator said she had experienced a drop in commissions and online sales, which supported her suspicions.
She used the website haveibeentrained.com to search for her name and realized almost her entire portfolio had been scraped for the dataset LAION-5B.
"I've tried to get it taken down and everything, but it doesn't really matter, because it'll just get scraped again, and it'll end up back on the website," she said. "It's kind of a dystopian thing to be dealing with."
Esquivel has been making art since she was 16, and much of the art that was taken was created when she was experiencing homelessness.
Art was her way out back then; it has been her full-time job for four years.
This isn’t the first time Esquivel has dealt with plagiarism of her artwork. She said she first encountered the problem as a teenager, when her designs were used on T-shirts without her permission and a gallery displayed work it had stolen from her. This experience was different, though.
“To see that it was scraped, and that people are just reproducing it for free, with no context as to how it was made or, like, why it was made,” Esquivel said. “That was a really kind of sobering moment of ... artists need to protect their work a little bit more than we give the world credit for.”
Esquivel started a residency at Goodyear Arts at Camp North End in September and, with the realization fresh in her mind, decided to explore a new medium altogether: 3D work. Because it isn’t as “scrapeable” as the illustrations she had been creating, she said, it feels safer in the sense that it can’t be stolen as easily.
She ended up loving the medium.
“There's something so honest about it,” Esquivel said. “It's so refreshing. It's taken a lot of weight off of me just to get to explore that side of art that I never really cared that much about until this happened.”
Even with her newfound passion for 3D art, Esquivel said she has concerns. With any form of art, she said, it's getting harder to tell what role AI played in its creation, or whether AI rendered the piece entirely.
"Robots in the world of art is not something that I thought would exist in my lifetime," Esquivel said. "It sucks. Art will never be the same. You can't really put it back in the box."
AI tools can currently scrape works, including original artwork, for "training" purposes legally under the fair use doctrine, and AI advocates argue that this use of artwork constitutes research, training and educational use.
Still, multiple lawsuits have been filed by various industries and public figures arguing that the use of these works is not always educational and infringes copyright when commercial in nature, according to the Congressional Research Service.
Helen King, an assistant professor of political science and pre-law advisor at Johnson C. Smith University who holds a doctorate in the field, said society is in a gray area when it comes to setting guardrails for AI.
"We're, in many ways, in the wild, wild west, trying to figure out where the parameters are," King said. "So right now, I think until the law and industry standards and even academic practices begin to standardize and catch up, we're going to see a lot of issues pop up like this, where we don't really know where to go from here quite yet."
King advised artists and creators to stay up to date on what protections are in place, since so much could change, and quickly. When in doubt, she said, seek direct legal advice.
"Hopefully artists who've have had intellectual property stolen and writers and such will be made whole, or there'll be some kind of remedy for them," King said. "But until we have more definitive guidelines, it's going to be difficult."
While the law catches up, there are some efforts in place to help artists. A team of researchers at the University of Chicago created a system known as "Glaze," which helps creators confuse AI by altering art with subtle changes that look the same to human viewers but make the work difficult for AI models to mimic. Even so, the Glaze researchers call the system a "first step" toward protecting artists.
Esquivel said she believes that for change to happen on a larger scale, more people need to speak up for artists through petitions and bills. That starts with more people respecting those who have been negatively affected by AI, she said.
"Whether you're an artist or not, it's coming for you next," Esquivel said. "It's already infiltrating administrative work. I know multiple friends whose family members have already lost their administrative jobs to ChatGPT, AI. It's taking over education. It's taking over everything. And, you know, if you tell the artists to stop whining when they're bringing this up, then when they come for your job, you have to starve in silence."
Contact Emma Korynta at ekorynta@wcnc.com and follow her on Twitter.