
Microsoft’s AI Wake-Up Call: Engineer Raises Alarms on Violent, Sexual Images and Copyright Neglect

A Microsoft artificial intelligence engineer, Shane Jones, has raised serious concerns about the lack of safeguards in the company’s AI image generator, specifically its Copilot Designer. 

In a letter published on LinkedIn, Jones alleges that the tool, powered by OpenAI’s DALL-E 3 AI system, lacks essential restrictions on generating violent and sexualized images.

Structural Flaws Unveiled

Jones says his repeated attempts to alert Microsoft management to these problems have gone unanswered. He points to structural issues with Copilot Designer, noting that the tool frequently generates harmful content even in response to seemingly unrelated prompts.

Notably, he alleges that the AI generated sexually objectifying images of women when prompted with innocuous phrases such as "car accident." Examples provided in the letter depict a woman in suggestive poses near a car, raising concerns about the tool's potential to create inappropriate content.

Microsoft has responded to Jones' allegations by saying it has robust internal reporting channels for handling issues related to generative AI and denies neglecting safety concerns. Meetings with the Office of Responsible AI were arranged for Jones, and the company says that specialized teams assess safety risks.


Microsoft’s Influence on Public Statements

Additionally, Microsoft says that it values the employee's efforts in researching and evaluating its latest technology. Copilot Designer has been marketed as a groundbreaking tool for integrating AI into a variety of creative and commercial projects since it was introduced as part of Microsoft's Copilot AI companion last year.

However, Jones argues that marketing Copilot as safe for public use is irresponsible, accusing the company of failing to disclose known risks associated with the tool. This isn’t the first time concerns have been raised about Copilot Designer. 

In January, Microsoft updated the tool over similar safety issues, closing loopholes that allowed the generation of fake, sexualized images. Jones also refers to an earlier incident in which he reported security vulnerabilities to Microsoft that allowed users to bypass guardrails meant to prevent the creation of harmful content.

The letter also reveals that Microsoft’s corporate, external, and legal affairs team allegedly pressured Jones to remove a LinkedIn post in which he urged OpenAI’s board of directors to suspend the availability of DALL-E 3 due to safety concerns. 

Despite complying with the request, Jones claims he received no justification from the legal department. This controversy underscores the ongoing challenges faced by AI developers in addressing biases and preventing the generation of harmful content.
