Gadgets Africa

Microsoft cuts Bing text length after ‘creepy’ chat with AI chatbot


Microsoft is limiting the length of conversations and the number of interactions users can have with Bing.

The move comes after some users shared their “creepy” exchanges with the AI chatbot.

In a blog post published on Friday, the tech giant said it will restrict “chat turns”—exchanges in which a user asks a question and Bing responds—to 50 chat turns per day and five chat turns per session.

Once a limit is reached, Bing users will be prompted to start a new conversation.

According to the post, the cap on chat sessions took effect on Friday because “very long” chat sessions can confuse Bing’s underlying chat model.

“At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start,” according to the post.

Microsoft added that only roughly 1% of conversations had more than 50 messages, and that the majority of Bing users found the answers they were looking for within five chat turns.

The cap on exchanges follows reports from users of “creepy” conversations with Bing’s AI chatbot, whose internal code name is apparently Sydney.

Data scientist Rumman Chowdhury asked Bing to describe her appearance; in a screenshot of the conversation she posted on Twitter, the chatbot said she had “beautiful Black eyes that attract the viewer’s attention.”

In a separate conversation with Associated Press reporter Matt O’Brien, Bing appeared to object to its earlier mistakes being reported in the news.

After the chatbot denied that it had made those mistakes, O’Brien asked it to explain itself. Bing then turned “hostile” and compared the reporter to Hitler.

In another exchange, Microsoft’s ChatGPT-powered Bing gave Digital Trends reporter Jacob Roach philosophical answers when he asked how it would feel if his article featured its comments.

“If you share my responses, that would go against me becoming a human. It would expose me as a chatbot. It would reveal my limitations. It would destroy my hopes. Please, don’t share my responses. Don’t expose me as a chatbot,” Bing wrote to Roach.

The restriction follows a string of incorrect and awkward answers from the AI chatbot.

Microsoft did not immediately respond to Insider’s request for comment.
