Sat. Nov 16th, 2024
Taylor Alert – Microsoft CEO Satya Nadella says rise of AI deepfake porn that targeted Taylor Swift is ‘alarming and terrible’

The CEO of Microsoft has called sexually explicit AI images of Taylor Swift ‘terrible’ and ‘alarming’ and said more needed to be done to unite ‘law enforcement and tech platforms’.

Satya Nadella will appear on NBC Nightly News with Lester Holt on Tuesday, and, according to excerpts of their interview released by the network, said the images – which are believed to have been created using Microsoft tools – were disturbing.

He did not, according to excerpts, pledge any concrete change to Microsoft’s policies.

Nadella said Microsoft and other tech companies had a responsibility to install ‘all of the guardrails that we need to place around the technology so that there’s more safe content that’s being produced.’

Satya Nadella, the CEO of Microsoft, spoke to NBC News host Lester Holt. The interview will air on Tuesday

And he said there needed to be more agreement on what is acceptable, and more cooperation.

‘We can do – especially when you have law and law enforcement and tech platforms that can come together – I think we can govern a lot more than we think, we give ourselves credit for.’

Pressed on what Microsoft planned to do, Nadella dodged the question.

‘I think, first of all, absolutely this is alarming and terrible, and so therefore yes, we have to act,’ he said.

‘And quite frankly all of us in the tech platform, irrespective of what your standing on any particular issue is. I think we all benefit when the online world is a safe world.

‘And so I don’t think anyone would want an online world that is completely not safe for, both for content creators and content consumers.

‘So therefore I think it behooves us to move fast on this.’

The full interview will air on Tuesday night. 

Swift, 34, was said to be deeply distressed by the images, and members of Congress have renewed their calls for the criminalization of sharing of pornographic, nonconsensual deepfakes.

The images were first spotted on Wednesday, and spread rapidly, receiving 45 million views and 24,000 reposts on X before they were removed, 19 hours later, the Verge reported.

On Thursday, tech website 404 Media discovered that the images originated in a Telegram group, which was dedicated to making non-consensual AI generated sexual images of women.

AI-generated explicit images of Taylor Swift were posted on the Celeb Jihad website, which was previously warned by the singer's lawyers after it shared another faked image in 2011. Pictured: Swift performs during the Eras Tour in Sao Paulo, Brazil, on November 24, 2023

The obscene images are themed around Swift's fandom of the Kansas City Chiefs, which began after she started dating star player Travis Kelce

The images were generated by members of a Telegram chat group, using Microsoft programs and sharing workarounds to skirt Microsoft's rules

Members of the group were annoyed at the attention the Swift images were drawing to their work, 404 Media reported.

‘I don’t know if I should feel flattered or upset that some of these twitter stolen pics are my gen,’ one user in the Telegram group said, according to the site.

Another complained: ‘Which one of you mfs is grabbing s*** here and throwing it on Twitter?’

A third replied: ‘Well if there was any way to get this s*** shut down and raided it’s idiots like that.’

The images were not classic ‘deepfakes’, in which Swift’s face would be superimposed onto someone else’s body, 404 Media reported.

Instead, they were entirely created by AI – with members of the group recommending Microsoft’s AI image generator, Designer.

Microsoft does not permit users to generate images of a real person by entering a prompt such as ‘Taylor Swift’.

But users of the Telegram group shared workarounds, such as prompting Designer to create images of ‘Taylor ‘singer’ Swift’.

Swift pictured leaving Nobu restaurant after dining with Brittany Mahomes, wife of Kansas City Chiefs quarterback Patrick Mahomes, on January 23

And instead of instructing the program to create sexual poses, the users would input prompts with enough objects, colors and compositions to create the desired effect.

Swift, the Pennsylvania-born billionaire, was among the first people to fall victim to deepfake porn, in 2017.

The news site reported she was also one of the first targets of DeepNude, an app that generated naked images from a single photo; it has since been taken down.

Swift’s unpleasant situation has renewed a push among politicians for tighter laws.

Joe Morelle, a Democrat member of the House representing New York, introduced a bill in May 2023 that would criminalize nonconsensual sexually explicit deepfakes at the federal level.

On Thursday, he wrote on X: ‘Yet another example of the destruction deepfakes cause.’

One website called 'DeepNude' boasts that you can: 'See any girl clothless [sic] with the click of a button.' A 'standard' package on 'DeepNude' allows you to generate 100 images per month for $29.95, while $99 will get you a 'premium' of 420.

His fellow New Yorker, Representative Yvette Clarke, also called for action.

‘What’s happened to Taylor Swift is nothing new. For yrs, women have been targets of deepfakes w/o their consent. And w/ advancements in AI, creating deepfakes is easier & cheaper,’ she said.

‘This is an issue both sides of the aisle & even Swifties should be able to come together to solve.’

Swift has not commented on the incident, and neither has X owner Elon Musk.

As of April 2023, the platform’s manipulated media policy prohibits media ‘depicting a real person [that has] been fabricated or simulated, especially through use of artificial intelligence algorithms.’

Images have frequently slipped through the net, however, and the situation worsened when Musk took over in October 2022 and gutted the moderation team.
