Friday, July 26, 2024

Microsoft continues to limit Bing with ChatGPT: it no longer responds when you ask about ‘feelings’



In addition to this update, the company has reduced the number of messages users can send per day.


Since Microsoft added ChatGPT technology to its Bing search engine, many users have joined the waiting list to chat with the artificial intelligence. Those who got early access were able to try a chatbot with far fewer restrictions than it has now, holding long conversations on topics that touched on feelings and existential doubts.

Related: A university student tricks ChatGPT on Bing into showing him internal Microsoft documents

The results were quite curious, and some users shared them on social networks, but Microsoft did not seem to appreciate it. “Very long chat sessions can confuse the AI model,” the company stated in a report. For this reason, it has limited conversations to six turns per session and a total of 60 messages per day.

The new restrictions on Bing’s ‘chat mode’ do not end there: as Bloomberg has verified, questions about feelings now go unanswered. When users ask what it feels like to be a search engine, the chatbot cuts the conversation short: “I’m sorry, but I’d rather not continue this conversation.”

A spokesperson for the company said it is “adjusting the techniques and limits” in order to “offer the best possible user experience.” These updates to Bing with ChatGPT are intended to keep users from receiving strange or repetitive responses.

