Add and configure the Large Language Models that will power your virtual being's response generation and conversation flow logic. Supported engines:
OpenAI
Azure OpenAI
Anthropic
AI models configuration
Where to find credentials?
Log in to the relevant service provider and find the section dedicated to API access. Below you'll find an example of such a section in OpenAI's dashboard:
API keys section in OpenAI's dashboard
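Each provider hands you slightly different credentials. As a rough sketch (the environment variable names below are assumptions, not a requirement of this product), OpenAI and Anthropic issue a single API key, while Azure OpenAI additionally needs your resource endpoint and a deployment name:

```python
# Illustrative only: the exact fields and variable names are assumptions,
# not this platform's schema.
import os

credentials = {
    "openai": {
        "api_key": os.environ.get("OPENAI_API_KEY"),
    },
    "azure_openai": {
        "api_key": os.environ.get("AZURE_OPENAI_API_KEY"),
        "endpoint": os.environ.get("AZURE_OPENAI_ENDPOINT"),      # e.g. https://<resource>.openai.azure.com
        "deployment": os.environ.get("AZURE_OPENAI_DEPLOYMENT"),  # name of your model deployment
    },
    "anthropic": {
        "api_key": os.environ.get("ANTHROPIC_API_KEY"),
    },
}
```

Keep the keys in environment variables or a secrets manager rather than in source control, and paste them into the model configuration when prompted.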
You can configure multiple AI models and set one as the default. Each model requires its own set of credentials and configuration.
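Conceptually, a multi-model setup might look like the sketch below. The structure and field names (name, provider, model, default) are illustrative assumptions, not the platform's actual schema; use it only to reason about what each entry needs.

```python
# Hypothetical representation of multiple configured models, one marked as default.
# Field names and model identifiers are illustrative, not this product's schema.
ai_models = [
    {
        "name": "fast-logic",
        "provider": "openai",
        "model": "gpt-4o-mini",
        "default": False,
    },
    {
        "name": "rich-responses",
        "provider": "anthropic",
        "model": "claude-3-5-sonnet-latest",
        "default": True,  # exactly one model should be marked as the default
    },
]

# The default model is used wherever no explicit model is selected.
default_model = next(m for m in ai_models if m["default"])
```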
In the Conversation Flows editor you'll be able to assign different AI models to the nodes that require LLMs, so you can use faster, cheaper models for logic processing and larger models for high-quality conversational responses.
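To make that split concrete, a flow could route its nodes roughly as in the sketch below. The node names and the mapping are purely illustrative and do not reflect the editor's internals; they only show the pattern of pairing cheap models with logic steps and larger models with user-facing replies.

```python
# Purely illustrative: which configured model (by the names defined above)
# each kind of node might use.
node_model_assignment = {
    "intent_classification": "fast-logic",    # cheap, low-latency model for routing decisions
    "condition_evaluation": "fast-logic",     # simple logic checks
    "response_generation": "rich-responses",  # larger model for high-quality replies
}
```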