Satya Nadella Calls Taylor Swift’s Explicit AI Fakes ‘Alarming and Terrible’


Microsoft CEO Satya Nadella has responded to the controversy over fake, sexually explicit AI-generated images of Taylor Swift. In an interview with NBC Nightly News set to air Tuesday evening, Nadella calls the proliferation of non-consensual simulated nudity “alarming and terrible” and tells interviewer Lester Holt that “I think it’s incumbent on us to act quickly on this.”

In a transcript distributed by NBC ahead of the Jan. 30 broadcast, Holt asks Nadella to react to the internet “exploding with fake images, and I emphasize fake, sexually explicit images of Taylor Swift.” Nadella’s response manages to open several cans of tech policy worms while saying very little about them, which isn’t surprising when no foolproof solution is in sight.

I would say two things: One, I go back again to what I think is our responsibility, which is all the guardrails we need to put around the technology so that more safe content is produced. And there is a lot to do there, and a lot being done. But it is also a global, societal issue, you know, I’ll say, of convergence on certain norms. And we can get there, especially when the law, law enforcement, and tech platforms can come together. I think we can govern much more than we give ourselves credit for.

Microsoft may have a connection to the fake Swift images. A 404 Media report indicates that they came from a Telegram-based non-consensual pornography community that recommends using Microsoft’s Designer image generator. In theory, Designer refuses to produce images of famous people, but AI generators are easy to fool, and 404 found it could break Designer’s rules with small tweaks to prompts. While that doesn’t prove Designer was used for the Swift images, it’s the kind of technical shortcoming Microsoft can address.

But AI tools have greatly simplified the process of creating fake nudes of real people, causing turmoil for women who have far less power and celebrity than Swift. And controlling their production is not as simple as getting big companies to shore up their guardrails. Even if major “Big Tech” platforms like Microsoft’s are locked down, people can retrain open tools like Stable Diffusion to produce NSFW images despite attempts to make that harder. These generators may reach far fewer users, but the Swift incident demonstrates how widely the work of a small community can spread.

There are other interim options, like social networks limiting the reach of non-consensual images or, apparently, vigilante justice from Swifties against the people who spread them. (Does that count as “convergence on certain norms”?) For now, though, Nadella’s only clear plan is to get Microsoft’s own AI house in order.
