
After engineer’s complaints, Microsoft blocks terms that generated violent, sexual images in Copilot Designer



Microsoft is working to fix its AI image generation tool, Copilot Designer, after company engineer Shane Jones revealed serious problems with it. Jones, tasked with testing the tool’s safety, discovered that it could be used to generate disturbing content.

This included violent scenes involving teenagers, sexualized images, and biased content on sensitive topics. As we reported earlier, the tool disregarded copyright, churning out images of Disney characters in inappropriate situations.

Jones began reporting these issues internally in December 2023. While Microsoft acknowledged the problems, it didn’t take the tool offline. Jones decided to escalate the matter, contacting OpenAI and U.S. senators, and finally sending letters to the FTC and Microsoft’s board.

Microsoft has responded with initial steps. Certain prompts are now blocked, users receive policy violation warnings, and safety filters are being improved.

This incident exposes the challenges of AI image generation. Powerful as it may be, such technology requires strong safeguards. It also raises questions about internal communication and the responsiveness of tech giants to ethical concerns.

Can Microsoft regain trust? Only time will tell whether its actions, spurred by Jones’ bravery, lead to a more responsible approach to AI development.

More here.


