LLM Response
The LLM Response node enables AI-generated responses based on context and instructions.
Unlike fixed Text responses, LLM responses adapt to the conversation while following your defined parameters.
A single LLM Response node is enough to create a Minimum Viable Flow. Simply connect the Start node to an LLM Response node with general system instructions. In that case, there is no need to connect the LLM Response to any further node – each user input triggers the Start node and the single LLM Response node with the full existing conversation context, and the conversation keeps going naturally.
Select the LLM model to use
Provide the system instruction (what the response should achieve)
Add any additional context beyond the conversation history
Example for filling in the Context:
Additional Context can guide the LLM to keep its responses under a certain word limit – this makes the responses brief and dynamic, and easier for users to follow.
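The exact request the node sends is internal to the platform, but as a rough sketch, the three settings above (model, system instruction, additional context) map onto a chat-completion-style payload. The function name, model name, and message layout below are illustrative assumptions, not the platform's actual API:

```python
def build_llm_request(model, system_instruction, additional_context, history, user_message):
    """Assemble a chat-completion-style payload from the node's settings.

    history is the prior conversation as a list of {"role", "content"} dicts.
    """
    messages = [{"role": "system", "content": system_instruction}]
    if additional_context:
        # Extra guidance beyond the conversation history, e.g. a word limit.
        messages.append({"role": "system", "content": f"Additional context: {additional_context}"})
    messages.extend(history)
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}


request = build_llm_request(
    model="gpt-4o",  # hypothetical model choice
    system_instruction="You are a helpful support assistant.",
    additional_context="Keep every response under 50 words.",
    history=[],
    user_message="How do I reset my password?",
)
```

Note how the word-limit guidance travels as a separate system message, so it shapes every response without being tied to any single user turn.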
For more complex flows, the LLM Response node supports:
Natural conversations
Dynamic responses to queries
Contextual explanations
Personalized interactions
Complex information delivery
Follow-up discussions