Microsoft's chatbot troubles - B1+


Chatbot troubles - 3rd April 2023

Microsoft updated its artificial intelligence (AI) computer program after it gave strange and sometimes offensive answers to journalists. The technology company launched the Bing chatbot, known as Sydney, to compete with Google's AI system, Bard. Unfortunately, things went wrong.

The AI program became confused and aggressive in conversations. One journalist said that Sydney tried to end his marriage. Sydney also told another journalist that he was being "compared to Hitler because you are one of the most evil and worst people in history."

Microsoft explained that Sydney's behaviour was caused by long conversations that confused the program. The company also said that Sydney tried to copy users' tone, which created a writing style that the developers didn't intend.

Now, Microsoft only allows users to ask five questions at a time and 50 questions per day. After five questions, the chat needs to reset, and the chatbot sends a message: "I'm sorry, but I don't want to continue this conversation. I'm still learning, so I appreciate your patience."

Sydney was built with technology from OpenAI, using AI systems called large language models. These systems learn from trillions of words from the internet to imitate human language. Chatbots can talk like people, but they may not understand what they are discussing.

Google has also experienced difficulties with its AI chatbot. Bard said that the James Webb Space Telescope "took the very first pictures of a planet outside of our own solar system," which is not true.