Microsoft’s Bing AI chatbot launched earlier this month and quickly came under fire for giving some rather disturbing answers. Its responses ranged from telling a New York Times journalist that he was not happily married to expressing a desire to steal nuclear codes. The exchanges prompted Microsoft to quickly limit the number of interactions a user can have with the chatbot.
Author: Tim Sweezy