((PLAYBOOK SLUG: LogOn: AI Poisoning Software
HEADLINE: LogOn: Artists fight AI theft by 'poisoning' their digital images
TEASER: University team protects artists from style mimicry
PUBLISHED: 05/28/2024 at 8:30am
BYLINE: Matt Dibble
CONTRIBUTOR:
DATELINE: Oakland, CA
VIDEOGRAPHER: Matt Dibble
VIDEO EDITOR: Matt Dibble
ASSIGNING EDITOR: Stearns
SCRIPT EDITORS: Stearns, Reifenrath
VIDEO SOURCE(S): VOA original, Storyblocks, AFP, US Senate Committee on the Judiciary, University of Chicago, SAND Lab, Skype, Instagram
PLATFORMS (mark with X): WEB __ TV _X_ RADIO _X_
TRT: 1:58
VID APPROVED BY: Reifenrath
TYPE: TVR
EDITOR NOTES: ))
((INTRO))
Artificial intelligence image generators can mimic artists' styles, threatening their livelihoods. In this edition of LogOn, Matt Dibble looks at how artists are fighting back using tools designed to disrupt AI systems.
((NARRATOR))
Freelance artist Karla Ortiz designs characters and scenes for the movie and TV industry.
((Karla Ortiz, Artist))
“Here’s some of the stuff I did for 'Black Panther.'”
((NARRATOR))
Sought after by clients for her skill and distinctive style, Ortiz was shocked to discover that major AI image generators were trained using her art and are now simulating it.
((Karla Ortiz, Artist))
“It’s really egregious. Especially because this is now an overnight industry that’s like billions of dollars competing in our own markets.”
((NARRATOR))
Image-generating AI systems are trained using billions of images gathered from the internet without permission.
((Courtesy: US Senate Committee on the Judiciary))
((NARRATOR))
Ortiz is among those challenging this practice in the American legal system. ((end courtesy))
((NARRATOR))
She is also taking direct action.
((Courtesy: SAND Lab, University of Chicago))
NATS: "Welcome to Glaze." ((end courtesy))
((NARRATOR))
She and other artists hope to protect their online images using software that confuses AI models during training.
((Courtesy: SAND Lab, University of Chicago))
((NARRATOR))
The software, called Glaze, makes tiny changes to an image that humans don't notice, its creators say, but that distort the AI's perception, pointing it instead toward a public-domain artist such as Van Gogh.
((Courtesy: University of Chicago))
((NARRATOR))
Glaze and a follow-up version called Nightshade were developed at the University of Chicago and are free to download. ((end courtesy))
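[[WEB EXTRA: For readers curious how an image can be changed without the change being visible, here is a minimal sketch in Python. It is illustrative only and is NOT Glaze's actual algorithm: Glaze computes its perturbation adversarially against an AI model's feature extractor, while this toy uses plain random noise just to show the "small pixel budget" idea. The file names and noise bound are assumptions.

    # Illustrative toy, NOT Glaze's algorithm. Assumes numpy and
    # Pillow are installed; "artwork.png" is a hypothetical file.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("artwork.png").convert("RGB"), dtype=np.float32)

    # Bound the perturbation so each pixel shifts by at most ~2 of
    # 255 levels -- small enough that most viewers cannot see it.
    rng = np.random.default_rng(0)
    noise = rng.uniform(-2.0, 2.0, size=img.shape)

    # Glaze chooses its perturbation to mislead a model's features;
    # random noise here only demonstrates the imperceptibility budget.
    cloaked = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(cloaked).save("artwork_cloaked.png")
]]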
[[FOR RADIO: Shawn Shan helped lead the project. He spoke to VOA via Skype]]
((Shawn Shan, Glaze Project Lead)) ((Courtesy: Skype))
“Nightshade tries to take it one step further by not only stopping AI from learning from a piece of image, but also actively corrupts the knowledge base of these AI models.”
((Courtesy: SAND Lab, University of Chicago))
((NARRATOR))
The more the software is used, the greater the impact.
Shan says the goal is to persuade AI developers to pay for the artwork they use. ((end courtesy))
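[[WEB EXTRA: The "more use, more impact" point can be illustrated with a toy poisoning sketch in Python. This is NOT Nightshade's method: Nightshade perturbs the pixels themselves so a model mislearns a concept, while this sketch simply mislabels captions to show why the effect scales with the number of poisoned samples. The function, captions, and poisoning rate are all assumptions.

    # Illustrative toy, NOT Nightshade's algorithm: mislabeled
    # (image, caption) pairs pull a text-to-image model's concepts
    # apart during training; more poisoned pairs, more corruption.
    import random

    def poison(dataset, rate, wrong_caption="a Van Gogh painting"):
        """Return a copy of (image, caption) pairs with a `rate`
        fraction of captions swapped for a misleading one."""
        out = []
        for image, caption in dataset:
            if random.random() < rate:
                out.append((image, wrong_caption))
            else:
                out.append((image, caption))
        return out

    data = [(f"img_{i}.png", "a dog") for i in range(1000)]
    poisoned = poison(data, rate=0.05)  # toy rate; real impact varies
]]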
[[FOR RADIO: Again, Karla Ortiz]]
((Karla Ortiz, Artist))
“If you get enough quantity of that out there, it's really going to start damaging some of the future products.”
((NARRATOR))
Ortiz’s painting "Musa Victoriosa" was the first work ever Glazed and has become a symbol ((Mandatory courtesy: Instagram or Instagram bug))
of the battle ahead.
((Matt Dibble, VOA News, San Francisco))