At a virtual Federal Trade Commission (FTC) roundtable yesterday, a deep lineup of creative workers and labor leaders representing artists demanded regulation of generative AI models and tools, saying that they need “consent, credit, control and compensation” to protect their artistic output, brands, voices and likenesses from AI model training, copycat output, AI-generated deepfakes and more.
The FTC’s roundtable, called “Creative Economy and Generative AI,” was held as a live webcast to “better understand the impact of generative artificial intelligence on creative fields.” The event came several weeks after a closed-door event with Senate lawmakers that was criticized for featuring Big Tech CEOs including Tesla’s Elon Musk, Meta’s Mark Zuckerberg, OpenAI’s Sam Altman, Google’s Sundar Pichai, Microsoft’s Satya Nadella and Nvidia’s Jensen Huang.
FTC chair Lina Khan, who has made headlines recently after the agency announced a lawsuit accusing Amazon of illegally maintaining monopoly power, started off by pointing out that Congress created the FTC to enforce rules of fair competition.
“Today, as we see growing use of automated systems, including those sometimes marketed as artificial intelligence, we again want to make sure that we’re keeping pace, that we’re fully understanding how these new tools could be affecting people on the ground in positive ways, but also potentially in harmful ways and potentially unlawful ways.”
3D scans of models and AI-generated models of color
While many have heard about the current and potential impact of generative AI on authors, actors and visual artists, the roundtable also included some lesser-known perspectives on its impact on creative workers.
For example, Sara Ziff, founder and executive director of the Model Alliance, a nonprofit research, policy and advocacy organization for people who work in the fashion industry, said that when talking about how GenAI is impacting workers, “we need to consider the context of an industry that is truly like the Wild West where workers have fewer protections at baseline.”
Fashion models are particularly concerned about the use of 3D body scans in connection with generative AI, she said — a recent poll found that nearly 18% of models have already been asked to undergo a scan by a brand or management company. In addition, they are concerned about the creation of AI-generated models, particularly AI models of color.
“Those who had been scanned described not being given information about how their scans would be used, unknowingly handing away rights to their image and not being fairly compensated for people whose livelihoods are their image,” she said.
Musicians are concerned with deepfakes
Jen Jacobsen, executive director at the Artist Rights Alliance (ARA), an artist-run nonprofit that helps musicians navigate the ever-changing creative economy, said that musicians have been using AI-driven tools for years to auto-tune vocals, generate beats and assist with studio production.
But now, she said, musicians are dealing with “expansive AI models that ingest massive amounts of musical works and mimic artists’ voices without obtaining creators’ consent or compensating them.”
Beyond being copyright infringement, she said, this leads to unfair competition in the music marketplace. “Musicians’ work is being stolen from them and then used to create AI-generated tracks that directly compete with them,” she said.
She also said that AI models are now used to create deepfakes that have, among other things, depicted a band canceling a concert that wasn’t actually canceled; shown artists selling products the artists never endorsed; or created false depictions of musicians badmouthing their own fans.
“This isn’t a hypothetical harm,” she said. “This type of consumer deception and fraud are happening right now. It’s not only confusing to fans but humiliating to the artists themselves and undermines their public image.”
‘No AI algorithm can make something out of nothing’
Duncan Crabtree-Ireland, national executive director and chief negotiator for SAG-AFTRA, said the actors’ union is, “to be clear, not opposed to new technologies, and we’re not opposed to the existence or even the use of AI.”
But, he added that it is “important to understand that all AI-generated content originates from a human creative source — no AI algorithm is able to make something out of nothing.”
An actor’s brand is their voice, he pointed out, as is their likeness and their unique persona. “No company should be able to appropriate that and use it however they wish without permission,” he said. “What we’re proposing is about keeping our world and our industry human-centered — AI and its algorithms must be here to serve us, not the other way around.”
‘ChatGPT would be lame and useless without our books’
Two plaintiffs in lawsuits pending against top AI companies were part of the roundtable. Author Douglas Preston is one of more than a dozen authors who are part of a class-action lawsuit filed by The Authors Guild against OpenAI, accusing the company of illegally pirating hundreds of books online and using them to train its AI without consent or compensation. Other plaintiffs in that lawsuit include George R.R. Martin, Jodi Picoult, Michael Connelly and Jonathan Franzen.
“ChatGPT would be lame and useless without our books,” he said. “Just imagine what it would be like if it was only trained on text scraped from web blogs, opinions, screeds, cat stories, pornography and the like.” He added that Sam Altman has “testified that books provide the really high-value literary content that large language models require,” but pointed out that “this is our life’s work, we pour our hearts and our souls into our books.”
Karla Ortiz, a concept artist, illustrator and fine artist known for her work on films like Black Panther and Doctor Strange, is part of a class-action lawsuit against Stability AI and Midjourney that claims the companies have infringed the rights of “millions of artists” by training their AI tools on five billion images scraped from the web “without the consent of the original artists.”
“Making a living as a professional requires a whole life of practice and study,” she said. “The creative economy only works when the basic tenets of consent, credit, compensation and transparency are followed.” AI companies, she added, “took our work and data to train for-profit technologies that then directly compete against us in our own market, using generative media that is meant to mimic us.”