⏳ Slow response generation

“Sometimes, we see the system being slow to generate an answer. Is this because we have added too much data for it to check? Or is it because we need to add our own OpenAI account for the work to be done?”

Bot responses might slow down for a few reasons:

The most common reason is load on OpenAI’s servers, which can lead to a slightly longer queue for API requests. This is usually temporary, and connecting your own OpenAI account won’t change it.

Another factor is the GPT model selected in your widget (Widget Settings – ChatGPT). If you’re using GPT-4, response times can be noticeably longer than with GPT-3.5.

Lastly, complex questions simply take longer to answer. A simple answer might take three to ten seconds, while a complex one can take up to twenty seconds with GPT-3.5 or up to a minute with GPT-4. Those are fairly large timeframes, but that’s how an AI bot with its own knowledge base operates!
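If you want to set timeout expectations in your own integration, the figures above can be sketched as a small helper. This is a hypothetical function, not part of any widget API; the thresholds are taken directly from this article:

```python
def expected_max_seconds(model: str, complex_question: bool) -> int:
    """Rough upper bound on bot response time, per this article's figures.

    Simple answers: ~3-10 seconds regardless of model.
    Complex answers: up to ~20 s with GPT-3.5, up to ~60 s with GPT-4.
    """
    if not complex_question:
        return 10  # simple answers usually arrive within ten seconds
    return 60 if model == "gpt-4" else 20

# Example: budget up to a minute for a complex question on GPT-4.
print(expected_max_seconds("gpt-4", complex_question=True))    # 60
print(expected_max_seconds("gpt-3.5", complex_question=False)) # 10
```

A client could use such a bound to decide when to show a "still thinking" indicator rather than an error.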
