White House pushes tech industry to shut down market for sexually abusive AI deepfakes

The White House is calling on tech giants to patrol their app stores for programs that can make explicit images without a person's consent.


WASHINGTON — President Joe Biden's administration is pushing the tech industry and financial institutions to shut down a growing market of abusive sexual images made with artificial intelligence technology.

New generative AI tools have made it easy to transform someone's likeness into a sexually explicit AI deepfake and share those realistic images across chatrooms or social media. The victims — be they celebrities or children — have little recourse to stop it.

The White House is putting out a call Thursday looking for voluntary cooperation from companies in the absence of federal legislation. By committing to a set of specific measures, officials hope the private sector can curb the creation, spread and monetization of such nonconsensual AI images, including explicit images of children.

“As generative AI broke on the scene, everyone was speculating about where the first real harms would come. And I think we have the answer,” said Biden's chief science adviser Arati Prabhakar, director of the White House's Office of Science and Technology Policy.

She described to The Associated Press a “phenomenal acceleration” of nonconsensual imagery fueled by AI tools and largely targeting women and girls in a way that can upend their lives.

“If you’re a teenage girl, if you’re a gay kid, these are problems that people are experiencing right now,” she said. “We’ve seen an acceleration because of generative AI that’s moving really fast. And the fastest thing that can happen is for companies to step up and take responsibility.”

A document shared with AP ahead of its Thursday release calls for action from not just AI developers but payment processors, financial institutions, cloud computing providers, search engines and the gatekeepers — namely Apple and Google — that control what makes it onto mobile app stores.

The private sector should step up to “disrupt the monetization” of image-based sexual abuse, restricting payment access particularly to sites that advertise explicit images of minors, the administration said.

Prabhakar said many payment platforms and financial institutions already say that they won't support the kinds of businesses promoting abusive imagery.

“But sometimes it’s not enforced; sometimes they don’t have those terms of service,” she said. “And so that’s an example of something that could be done much more rigorously.”

Cloud service providers and mobile app stores could also “curb web services and mobile applications that are marketed for the purpose of creating or altering sexual images without individuals’ consent,” the document says.

And whether an image is AI-generated or a real nude photo put on the internet, survivors should more easily be able to get online platforms to remove it.

The most widely known victim of pornographic deepfake images is Taylor Swift, whose ardent fanbase fought back in January when abusive AI-generated images of the singer-songwriter began circulating on social media. Microsoft promised to strengthen its safeguards after some of the Swift images were traced to its AI visual design tool.

A growing number of schools in the U.S. and elsewhere are also grappling with AI-generated deepfake nudes depicting their students. In some cases, fellow teenagers were found to be creating AI-manipulated images and sharing them with classmates.

Last summer, the Biden administration brokered voluntary commitments by Amazon, Google, Meta, Microsoft and other major technology companies to place a range of safeguards on new AI systems before releasing them publicly.

That was followed by Biden signing an ambitious executive order in October designed to steer how AI is developed so that companies can profit without putting public safety in jeopardy. While focused on broader AI concerns, including national security, it nodded to the emerging problem of AI-generated child abuse imagery and finding better ways to detect it.

But Biden also said the administration's AI safeguards would need to be supported by legislation. A bipartisan group of U.S. senators is now pushing Congress to spend at least $32 billion over the next three years to develop artificial intelligence and fund measures to safely guide it, though it has largely put off calls to enact those safeguards into law.

Encouraging companies to step up and make voluntary commitments “doesn’t change the underlying need for Congress to take action here,” said Jennifer Klein, director of the White House Gender Policy Council.

Longstanding laws already prohibit making and possessing sexual images of children, even if they're fake. Federal prosecutors brought charges earlier this month against a Wisconsin man they said used a popular AI image-generator, Stable Diffusion, to make thousands of AI-generated realistic images of minors engaged in sexual conduct. An attorney for the man declined to comment after his arraignment hearing Wednesday.

But there's almost no oversight over the tech tools and services that make it possible to create such images. Some are on fly-by-night commercial websites that reveal little information about who runs them or the technology they're based on.

The Stanford Internet Observatory in December said it found thousands of images of suspected child sexual abuse in the giant AI database LAION, an index of online images and captions that’s been used to train leading AI image-makers such as Stable Diffusion.

London-based Stability AI, which owns the latest versions of Stable Diffusion, said this week that it “did not approve the release” of the earlier model reportedly used by the Wisconsin man. Such open-source models, because their technical components are released publicly on the internet, are hard to put back in the bottle.

Prabhakar said it's not just open-source AI technology that's causing harm.

“It's a broader problem,” she said. “Unfortunately, this is a category that a lot of people seem to be using image generators for. And it’s a place where we’ve just seen such an explosion. But I think it’s not neatly broken down into open source and proprietary systems.”

——

AP Writer Josh Boak contributed to this report.

