Microsoft Puts Caps on New Bing Usage After AI Chatbot Offered Unhinged Responses - WSJ
Microsoft Corp. is putting caps on the usage of its new Bing search engine, which uses the technology behind the viral chatbot ChatGPT, after testers discovered it sometimes generates glaring mistakes and disturbing responses.
Microsoft says long interactions are causing some of the unwanted behavior, so it is adding restrictions on how the chatbot can be used.
Many of the testers who reported problems were having long conversations with Bing, asking question after question. Under the new restrictions, users will be able to ask only five questions in a row before being asked to start a new topic.
“Very long chat sessions can confuse the underlying chat model in the new Bing,” Microsoft said in a blog on Friday. “To address these issues, we have implemented some changes to help focus the chat sessions.”
Microsoft said in a Wednesday blog post that Bing seems to start coming up with strange answers after chat sessions of 15 or more questions, at which point it can become repetitive or respond in ways that don't align with its designed tone.
The company said it was trying to train the technology to be more reliable. It is also considering adding a toggle that would let users decide whether they want Bing to be more or less creative in its responses.