Apple's removal of 'nonconsensual nude' apps is latest AI battle for big tech
Apple (NASDAQ:AAPL) has removed a category of artificial intelligence apps that promised to create “nonconsensual nude images,” 404 Media reported.
The apps initially promoted themselves on Meta’s (NASDAQ:META) Instagram with the promise of free sexually explicit images. The ads then directed users to the App Store, the news outlet said.
Apple did not respond to a request for comment from Seeking Alpha.
A Meta spokesperson said the company “does not allow ads that contain adult content and when we identify violating ads we work quickly to remove them.”
The removal of these apps is the latest skirmish in tech companies’ fight against unwanted uses of AI.
A Microsoft (NASDAQ:MSFT) employee working on artificial intelligence said in March that the company’s Copilot Designer image generator was producing violent and sexual images, and that the company was not taking appropriate action.
Microsoft eventually blocked the prompts that were used to create the graphic images.
Google (NASDAQ:GOOG) (NASDAQ:GOOGL) temporarily paused image generation in its AI tool Gemini in February after some of the model’s text and image responses depicted race inaccurately.
Chief Executive Sundar Pichai vowed to fix the problem, calling the results “completely unacceptable” in an internal memo.
The issue of deepfake images gained national attention earlier this year when social media network X blocked searches for Taylor Swift after explicit AI-generated images of her circulated widely on the platform.
The White House said it was alarmed by the spread of the fraudulent photos and called on Congress to take legislative action.
(This story has been updated with a response from Meta.)