‘Network error’, ‘Too many requests’, ‘Internal server error’. ChatGPT’s servers are struggling to keep up, and the application that has been taking the Internet by storm for days has started to fail. At the time of writing, many users trying to test the text generation model based on GPT-3.5 are running into some of these error messages.
The enormous capabilities of ChatGPT, which can respond to almost anything while remembering and connecting ideas, did not take long to conquer the Internet. Some users driven by curiosity, and others intent on using the tool for creative work, have caused an avalanche of requests to OpenAI’s servers: more than a million users in seven days.
ChatGPT pushed to the limit by the influx of users
The instability of ChatGPT, which began roughly a week ago, has escalated along with the growing number of users. As can be seen on Reddit, one of OpenAI’s first moves to keep the application running has been to limit the rate of requests users can send. In other words, if too many questions are asked, ChatGPT locks up for a few minutes.
“It’s so incredible that I want to use it again and again, but I can’t,” said one user, who lamented not even being able to pay to use the system without limits. But this is not the only restriction. If the answers are too long, ChatGPT simply collapses and throws a ‘Network error’ message, forcing users to resort to certain techniques to keep getting long answers.
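For readers who script against web APIs in general, the symptoms described (‘Too many requests’, ‘Internal server error’) correspond to the classic HTTP 429 and 5xx status codes, which are usually handled on the client side with retries and exponential backoff. Below is a minimal sketch in Python of that pattern, assuming a hypothetical chat endpoint and API key; it is not OpenAI’s official client, just an illustration of how such errors are typically absorbed.

```python
import time
import requests

# Hypothetical endpoint and key, for illustration only.
API_URL = "https://example.com/v1/chat"
API_KEY = "YOUR_API_KEY"

def ask_with_backoff(prompt, max_retries=5):
    """Send a prompt and retry on rate limits (429) or server errors (5xx)."""
    delay = 1.0
    for _ in range(max_retries):
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"prompt": prompt},
            timeout=60,
        )
        if response.status_code == 200:
            return response.json()
        if response.status_code == 429 or response.status_code >= 500:
            # Wait progressively longer before trying again.
            time.sleep(delay)
            delay *= 2
            continue
        response.raise_for_status()  # Other errors: fail immediately.
    raise RuntimeError("Gave up after repeated rate limits or server errors")
```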
The conversation about the problems has also moved to Stack Exchange. On this question-and-answer platform, some programmers who use the tool to generate code suggest asking the model to show only the first 10 or 20 lines of text and then the rest afterwards (a sketch of this chunked approach follows). The workaround, however, does not seem to be entirely effective in all cases.
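The idea behind that workaround is to keep each individual answer short enough that it never triggers the ‘Network error’. A rough sketch of the approach, assuming a hypothetical ask() callable that sends a prompt and returns the model’s reply as text:

```python
def ask_in_chunks(ask, task, chunk_lines=20, max_chunks=10):
    """Request long output in pieces: first N lines, then repeated 'continue' prompts.

    `ask` is a hypothetical callable that sends one prompt and returns the reply text.
    """
    # First request: ask only for the opening chunk of the result.
    parts = [ask(f"{task}\nShow only the first {chunk_lines} lines of the result.")]
    # Follow-up requests: ask the model to continue where it stopped.
    for _ in range(max_chunks - 1):
        more = ask(f"Continue from where you stopped, showing the next {chunk_lines} lines.")
        if not more.strip():
            break
        parts.append(more)
    return "\n".join(parts)
```

As the Stack Exchange users note, this only works as well as the model’s ability to pick up exactly where it left off, which is why the trick is not reliable in every case.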
Let’s remember that ChatGPT, in combination with other artificial intelligence tools, has started to be used for an enormous number of tasks. While some users have fun asking questions and getting answers, others have ventured to “co-write” books with the text generation tool.

Beyond the legal and ethical issues that the use of ChatGPT and artificial intelligence may involve, we are getting used to deploying, in a large number of scenarios, a tool that is still in its testing phase. We are not, for the moment, looking at a definitive version of GPT-3.5. However, the success it is enjoying makes it very likely to become commercially viable.
We’ve already seen this with DALL-E, OpenAI’s popular text-to-image generation model. Users get a certain amount of free credits; once they run out, they must buy more to keep creating. ChatGPT could adopt a similar business model in the future, requiring users to pay to continue using the tool after a certain number of questions.
GPT-3.5, the autoregressive language model behind ChatGPT, could also become part of various paid tools or services. Canva, the popular online design suite, has just incorporated this model’s smaller sibling into Canva Docs. The tool allows some free uses, but beyond that it requires you to pull out your wallet.
Canva has just incorporated GPT-3 into Canva Docs.
Nor should we forget the role Microsoft plays in OpenAI and, in particular, in ChatGPT. The Redmond company and the artificial intelligence firm joined forces in 2019 with a one-billion-dollar investment from the former. As a result of that agreement, as the company led by Sam Altman points out, Microsoft Azure is OpenAI’s exclusive cloud services provider.
With this advantage, Microsoft is also shaping up as a candidate to implement GPT-3.5, or the future GPT-4, in some of its products. And the Redmond company seems quite enthusiastic about embracing OpenAI’s artificial intelligence. Microsoft Designer, for example, adopted DALL-E to allow generating images from text, and that could be just the beginning. Voice assistants or the Bing search engine itself could benefit from the latest autoregressive language models.
For the moment, we do not know how long OpenAI will keep offering free access to the trial version of ChatGPT. In any case, the company has committed to increasing the capacity of its infrastructure (which runs on Microsoft Azure) to improve the tool’s stability as the number of users grows, so some of the problems could be resolved soon.
Images: OpenAI | Mahdis Mousavi
In Xataka: Xiaomi’s new robot knows how to play the drums (because no profession will be left unautomated)