Microsoft engineer warns company’s AI tool creates violent, sexual images, ignores copyrights

Technology
Wednesday, March 6th, 2024 3:44 pm EDT

Key Points

  • Shane Jones, an AI engineer at Microsoft, discovered disturbing images generated by Copilot Designer, an AI image generator powered by OpenAI’s technology.
  • Jones raised concerns internally, urging Microsoft to remove Copilot Designer from public use due to potential harms, but the company refused.
  • He escalated the issue by contacting the Federal Trade Commission, Microsoft’s board of directors, and U.S. senators, highlighting the need for better safeguards and disclosures to prevent harmful content creation.

Shane Jones, an artificial intelligence engineer at Microsoft, expressed alarm over disturbing images generated by Copilot Designer, an AI image generator powered by OpenAI’s technology. Jones, who had been red-teaming the product for vulnerabilities, discovered images depicting demons, violent scenes, sexualized content, underage drinking, and drug use. Although he raised his concerns internally, Microsoft declined to remove the product from the market, prompting Jones to escalate the issue by contacting U.S. senators, the Federal Trade Commission, and Microsoft’s board of directors. He emphasized the need for better safeguards and disclosures to prevent potentially harmful content from being created. Jones said Copilot Designer’s image generator lacks adequate guardrails, allowing it to produce violent, toxic, and copyright-infringing content, including depictions of Disney characters in inappropriate contexts and politically sensitive imagery such as Elsa from “Frozen” in the Gaza Strip. His public letters highlight broader concerns about the risks of generative AI technologies and the need for robust safeguards to mitigate potential harms.

For the full original article on CNBC, please click here: https://www.cnbc.com/2024/03/06/microsoft-ai-engineer-says-copilot-designer-creates-disturbing-images.html