Breaking News

Microsoft Is Blocking Terms That Cause Its AI To Go Rogue Creating Offensive Images

Microsoft has had to rein in its Copilot AI tool after a staff engineer wrote to the Federal Trade Commission about concerns that Copilot was producing violent, sexual, and other inappropriate images. The self-proclaimed whistleblower also said he met last month with Senate staffers to share his concerns about the company's image generator.

Author: Tim Sweezy