Chatting with the new Bing with ChatGPT was great fun. From the moment Microsoft launched it and we were able to test it, we set out to explore its options and to use it as a substitute for conventional searches.
It soon showed that it could go much further, for better… and for worse. The numerous failures and the strange behavior it exhibited ended up making Microsoft “cap” it, and that has had an unfortunate consequence: Bing with ChatGPT has stopped being fun because it has stopped conversing.
The users who gained access to this conversational artificial intelligence engine did not stop experimenting and trying to explore the limits of this technology. It was soon discovered that it was wrong as often as (or more often than) ChatGPT, but also that it could go off the rails.
Microsoft, which had been bold and daring with the launch of the new “Bing with ChatGPT” (in reality the AI model is an evolved version of OpenAI’s engine), has backtracked, and after the criticism and the discoveries made by the media and social media users, it decided to take action.

Bing with ChatGPT explains its new limits to us
A few days ago the Redmond giant explained that “very long chat sessions can confuse” the new Bing with ChatGPT, and commented on how that could make the system lose its composure. To solve it, the company took a controversial decision: to put limits on the number of questions per session.
We asked Bing with ChatGPT itself about these limits, and it explained that they are the following:
- Users can only send five messages per conversation and have 50 conversations per day.
- Microsoft defends this change on the grounds that the search engine starts to get confused in long conversations and its answers stop being adequate.
- The company is analyzing the possibility of adding tools so that users can restart conversations or have more control over the tone.
Not only that: the system now has clear limits not only on the number of messages and conversations, but also on the topics it will address:
- It does not answer personal questions or questions about its life, its existence, or its consciousness.
- It does not discuss itself (Bing Chat), its opinions, or its rules.
- It does not engage in argumentative discussions with the user.
- It does not generate creative content about politicians, activists, or influential heads of state.
Bing with ChatGPT is no longer fun
What we have verified over the last two days is that this conversational engine is anything but conversational. It changes the subject at the first opportunity and avoids any thorny conversation that could lead to conflict.

We have found no way to explore its limits, or at least to dig deeper into its internal behavior. Asking it about these changes and what they meant when it refused to answer, or arguing with the famous “it’s good to make mistakes”, got us nowhere: it kept warning that it preferred “not to continue this conversation”.

It is true that Bing with ChatGPT can still answer many of our questions coherently and can often resolve our doubts, but losing that conversational side makes this artificial intelligence engine lose a good part of its charm.
The question is whether Microsoft will end up re-enabling that conversational facet later if it manages to correct the mistakes that were detected. Redmond may even be considering making it a paid option (there is already talk of integrating advertising into these interactions), but the truth is that this version of Bing with ChatGPT is much more boring than the original.
Not only is that a shame because of how engaging those conversations were, but it is probably also a lost opportunity to improve this conversational AI engine much more quickly.
These systems feed precisely on this type of information and data, and Microsoft would do well to take advantage of all these problems to feed the system and make it ever more powerful. Instead, it has preferred to cap it and bring it much closer to a conventional search engine.