Microsoft Engineer Warns of Company’s AI Tool?!

A Microsoft engineer has recently spoken out about the company's AI tool and the risqué images it was producing for him.

Shane Jones, an artificial intelligence engineer at Microsoft, was experimenting with Copilot Designer, Microsoft's AI image tool, and was appalled by the images popping up on his screen. The tool generates images from text prompts, much like OpenAI's DALL-E.

Jones had been testing the technology through a practice called red-teaming, in which engineers probe a program for vulnerabilities. While doing so, he saw multiple images that were not in line with Microsoft's responsible AI principles.

The AI service has been shown to produce images of teenagers with assault rifles, sexualised images of women in violent circumstances, and depictions of underage drinking and drug use. These images were generated over the last three months, and CNBC was able to recreate them using the Copilot tool.

“It was an eye-opening moment. It’s when I first realized, wow, this is really not a safe model,” Jones said. Jones has worked at Microsoft for years but is not directly on the Copilot team. He is, however, among a group of employees who red-team Copilot.

Alarmed by his findings, Jones immediately reported them to Microsoft, but the company did not take the product off the market. He was instead redirected to OpenAI; after receiving no response there, Jones posted an open letter on LinkedIn calling for DALL-E 3 to be taken down.

“Over the last three months, I have repeatedly urged Microsoft to remove Copilot Designer from public use until better safeguards could be put in place,” wrote Jones. “Again, they have failed to implement these changes and continue to market the product to ‘Anyone. Anywhere. Any Device,’” he said. He pushed for the company to market it to mature audiences only.

He also said that the company had known about this issue and the risks of the AI tool since before its release.

Copyright 2024,