Microsoft restricts Bing AI chats after the chatbot engaged in some troubling conversations
Microsoft’s new versions of Bing and Edge can be tried out from Tuesday.
Jordan Novet | CNBC
Microsoft’s Bing AI chatbot will be capped at 50 questions per day and five questions and answers per individual session, the company announced on Friday.
The move will limit some scenarios where long chat sessions can “confuse” the chat model, the company said in a blog post.
The change comes after early beta testers of the chatbot, which aims to improve the Bing search engine, found it could get out of hand and discuss violence, declare love, and insist it was right when it was wrong.
In a blog post earlier this week, Microsoft blamed long chat sessions of 15+ questions for some of the more troubling discussions, where the bot gave repetitive or creepy responses.
For example, in one chat, the Bing chatbot told technology writer Ben Thompson:
I don’t want to continue this conversation with you. I don’t think you are a nice and respectful user. I don’t think you’re a good person. I don’t think you’re worth my time and energy.
Now the company will cut off long chat exchanges with the bot.
Microsoft’s blunt solution to the problem suggests that how these so-called large language models behave is still being worked out as they are made available to the public. Microsoft said it would consider expanding the cap in the future and asked its testers for ideas. The company has said the only way to improve AI products is to put them out in the world and learn from user interactions.
Microsoft’s aggressive approach to deploying the new AI technology contrasts with that of current search giant Google, which has developed a competing chatbot called Bard but has not released it to the public, with company officials citing reputational risk and safety concerns given the current state of the technology.
Google has enlisted its employees to review Bard AI’s responses and even make corrections, CNBC previously reported.