Best Practices
To get the best results from our platform, we recommend the following practices.
Use asynchronous message requests
Asynchronous message requests are a great way to make the conversation feel more human. If you only use realtime message requests, response times will feel unnatural and the experience will suffer for the user. We therefore recommend using asynchronous message requests.
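For illustration, here is a minimal TypeScript sketch of an asynchronous message request. The endpoint path, field names, and callback mechanism are assumptions for the example, not the platform's actual API; check the API reference for the real request shape.

```typescript
// A minimal sketch of an asynchronous message request.
// Endpoint, field names, and callback mechanism are assumptions for illustration.
const API_KEY = "your-api-key";

async function sendAsyncMessage(conversationId: string, text: string): Promise<void> {
  const response = await fetch("https://api.example.com/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      conversation_id: conversationId,
      text,
      // Ask for asynchronous delivery: the agent's reply is sent later to the
      // callback URL instead of blocking this request until it is ready.
      delivery: "async",
      callback_url: "https://your-app.example.com/webhooks/agent-reply",
    }),
  });

  if (!response.ok) {
    throw new Error(`Message request failed: ${response.status}`);
  }
  // The immediate response only acknowledges the request; the actual reply
  // arrives on the webhook once the agent has produced it.
}
```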
Debounce messages on your end
Some users split what could be a single message into several short ones. If you create an asynchronous message request for each of them, you will get a separate response per message. At first this might sound fine, but because response times vary, the responses can arrive in the wrong order.
The chat could then look like this:
| User | Response |
|---|---|
| Hello | |
| How are you? | |
| | I'm fine, thank you |
| | Hi |
This is obviously not what you want, so you should debounce messages on your end: only send a message to the API once the user's last message is older than a certain amount of time. This way you mimic a human-like conversation and ensure the responses arrive in the right order.
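Below is a minimal debounce sketch, reusing the hypothetical sendAsyncMessage helper from the earlier example; the 3-second window is an arbitrary choice, so tune it to your traffic.

```typescript
// Buffer incoming user messages per conversation and only send them once the
// user has been quiet for DEBOUNCE_MS. The window length is an assumption.
const DEBOUNCE_MS = 3000;

const buffers = new Map<
  string,
  { texts: string[]; timer: ReturnType<typeof setTimeout> }
>();

function onUserMessage(conversationId: string, text: string): void {
  const existing = buffers.get(conversationId);
  if (existing) {
    // The user is still typing: extend the window instead of sending now.
    clearTimeout(existing.timer);
    existing.texts.push(text);
    existing.timer = setTimeout(() => flush(conversationId), DEBOUNCE_MS);
  } else {
    buffers.set(conversationId, {
      texts: [text],
      timer: setTimeout(() => flush(conversationId), DEBOUNCE_MS),
    });
  }
}

function flush(conversationId: string): void {
  const buffered = buffers.get(conversationId);
  if (!buffered) return;
  buffers.delete(conversationId);
  // Send the whole burst as one request so the agent answers once, in order.
  void sendAsyncMessage(conversationId, buffered.texts.join("\n"));
}
```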
Use contexts
Contexts are a great way to improve the conversation. They let your agent remember relevant information from past messages or other interactions, which makes conversations feel more natural and human-like.
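As a rough sketch, a context might be attached to a message request like this; the context_id field, endpoint path, and API key constant are assumptions for the example rather than the documented request shape.

```typescript
// A minimal sketch of sending a message with an attached context.
// The context_id field and endpoint path are assumptions for illustration.
const API_KEY = "your-api-key";

async function sendMessageWithContext(
  conversationId: string,
  contextId: string,
  text: string
): Promise<void> {
  const response = await fetch("https://api.example.com/v1/messages", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`,
    },
    body: JSON.stringify({
      conversation_id: conversationId,
      // The context carries relevant information from past messages or other
      // interactions, so the agent can refer back to it naturally.
      context_id: contextId,
      text,
      delivery: "async",
    }),
  });

  if (!response.ok) {
    throw new Error(`Message request failed: ${response.status}`);
  }
}
```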
Common Mistakes
Sending message history in the wrong order
Make sure to send the message history in the order required by the API specification.
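One way to guard against this is to sort your stored history by timestamp before sending it; the ChatMessage shape below is an assumption for illustration.

```typescript
// A minimal sketch: order the stored history by timestamp before sending it,
// so the API receives messages in the order they actually occurred.
interface ChatMessage {
  role: "user" | "agent";
  text: string;
  sentAt: Date;
}

function orderedHistory(history: ChatMessage[]): ChatMessage[] {
  // Copy before sorting so the original array is not mutated.
  return [...history].sort((a, b) => a.sentAt.getTime() - b.sentAt.getTime());
}
```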
Using the wrong context
If you mix up contexts between conversations with the same agent, the agent will refer to things that are not part of the current conversation. This leads to confusion and does not feel natural at all.
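A simple safeguard is to keep a strict one-to-one mapping from conversation to context; the createContext helper below is hypothetical and stands in for however you create contexts on the platform.

```typescript
// A minimal sketch: one context per conversation, never shared across
// conversations with the same agent. createContext is a hypothetical helper.
declare function createContext(conversationId: string): Promise<string>;

const contextByConversation = new Map<string, string>();

async function contextFor(conversationId: string): Promise<string> {
  let contextId = contextByConversation.get(conversationId);
  if (!contextId) {
    contextId = await createContext(conversationId);
    contextByConversation.set(conversationId, contextId);
  }
  return contextId;
}
```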
Using the wrong model
Choose your agent's model carefully. Test your agent in the Dashboard and confirm it responds as expected. If it does not, the model is likely not the right fit for your agent.