Apple removes AI apps that "create nude images" from the App Store.

Apple has removed several artificial intelligence apps from its App Store after reports that they generate sexually explicit images.

Apple was alerted that several AI-powered image-generation apps in the App Store were being advertised as capable of creating nude images of people without their consent.

The companies behind these apps relied on Instagram ads to promote them, with claims such as the ability to "undress women for free." The ads contained links leading users directly to the apps' App Store pages.

According to the outlet that reported the apps, Apple did not initially respond to a request for comment, but after the report was published it reached out to ask for more information. Once the company received the specific ad links and App Store pages, it began removing the apps from its store.

Apple removed the three apps from the App Store only after they were reported, which suggests the company had not identified the policy-violating apps on its own: they passed the App Review process, and nothing stopped their developers from publishing them to users.

It is worth noting that AI's ability to generate images from user prompts has become a genuinely useful tool in photography and design. However, the same capability can be misused to create fake or sexually explicit images.

