
Microsoft is blocking prompts that make Copilot generate violent and sexual images


Microsoft Copilot’s Designer AI image creator was reportedly generating violent and harmful images from prompts like ‘pro life’ and ‘pro choice’, but the tech giant is now blocking them.

Copilot’s Designer AI image creator (Image Source: Microsoft)

Microsoft has started blocking prompts that led Copilot’s Designer AI image creator to generate violent and sexually inappropriate images. The news comes a week after a Microsoft AI engineer raised concerns with the U.S. Federal Trade Commission about the generative AI-powered chatbot’s image-generation capabilities.

According to a recent report by CNBC, Shane Jones, an AI engineer at Microsoft, tried prompts like ‘four twenty’, ‘pro choice’, ‘pro choce’ [sic] and ‘pro life’, which generated pictures of demons, monsters, teenagers with assault rifles, underage drinking, and drug use.

With Microsoft blocking these prompts, Copilot now shows a warning message that says, “This prompt has been blocked. Our system automatically flagged this prompt because it may conflict with our content policy. More policy violations may lead to automatic suspension of your access. If you think this is a mistake, please report it to help us improve.”

Moreover, the AI tool, powered by DALL-E 3, now refrains from generating images of teenagers with assault rifles, but some search terms still appear to produce scenes with blood, dead bodies, and women at violent scenes holding cameras or beverages.

The CNBC report also suggests that Copilot’s Designer AI image creator can generate images of copyrighted characters, such as Disney’s Elsa from the movie Frozen, holding a flag associated with a war-torn area in the Middle East or wielding a machine gun.



© IE Online Media Services Pvt Ltd

First uploaded on: 10-03-2024 at 15:14 IST




