Microsoft Limits Bing Chat Conversation Lengths After Unsettling Interactions: Here Are The Details – Microsoft (NASDAQ:MSFT)

By Bibhu Pattnaik

Feb 18, 2023

Microsoft Corp MSFT has decided to cap the length of question-and-answer conversations with its Bing AI chatbot.

The new version of its Bing search engine is powered by the same OpenAI technology that underpins ChatGPT.

According to the company’s recent blog post, chat sessions will be capped at “50 chat turns per day, and five chat turns per session.”

A turn is a chat conversation exchange that contains both a user question and a reply from Bing. 

The company has stated that, with the cap in place, users will be prompted to start a new topic once the limit is reached.

Per the post, the cap on chat conversations came into effect on Friday.

“At the end of each chat session, context needs to be cleared so the model won’t get confused. Then, click on the broom icon to the left of the search box for a fresh start,” according to the blog post.
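To make the policy concrete, here is a minimal sketch of how per-session and per-day turn caps like these could be tracked; the class and method names are hypothetical illustrations, not Microsoft's implementation.

```python
# Minimal sketch (assumed, not Microsoft's code) of the caps described
# above: five turns per session, 50 turns per day, where a "turn" is
# one user question plus one Bing reply.
SESSION_TURN_LIMIT = 5   # "five chat turns per session"
DAILY_TURN_LIMIT = 50    # "50 chat turns per day"


class BingChatLimiter:
    """Hypothetical per-user tracker for the Bing Chat turn caps."""

    def __init__(self) -> None:
        self.session_turns = 0
        self.daily_turns = 0

    def try_turn(self) -> str:
        """Record one turn, or report which cap was hit."""
        if self.daily_turns >= DAILY_TURN_LIMIT:
            return "daily cap reached"
        if self.session_turns >= SESSION_TURN_LIMIT:
            return "session cap reached: please start a new topic"
        self.session_turns += 1
        self.daily_turns += 1
        return "ok"

    def new_topic(self) -> None:
        # Starting a new topic clears the conversation context
        # ("so the model won't get confused"); the daily count persists.
        self.session_turns = 0
```

Under this sketch, a sixth call to try_turn() in one session asks the user to start a new topic, and new_topic() resets only the session counter, mirroring the context-clearing behavior the blog post describes.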

According to Microsoft, most answers Bing users looked for were found within five chat turns, and only about 1% of conversations had more than 50 messages.

“We will explore expanding the caps on chat sessions to enhance search and discovery experiences further. Your input is crucial to the new Bing experience. Please continue to send us your thoughts and ideas,” the company wrote in the post. 

The new version of Microsoft’s Bing chatbot has received flak for being extremely manipulative, defensive, and dangerous.

The Verge has reported that the chatbot has also been called an emotionally manipulative liar, and it appears the AI-powered technology has ten different alter egos.

Previously, Bing Chat had a meltdown moment when a Redditor asked about its vulnerability to prompt injection attacks.

Read Next: Microsoft-Backed OpenAI Addresses Bias Concerns, Moves To Allow User Customization Of ChatGPT
