Conversation Flows

Conversation Flows are the heart of your virtual being's intelligence.

They determine how your virtual being understands inputs, processes information, and generates responses. Think of flows as the pathways your virtual being's "thoughts" follow during conversations.

Core Concepts

Flow Triggers

Every conversation flow begins with an input, which can come from multiple sources:

  • User's text or voice input

  • System signals (like conversation start/end)

  • External integrations

  • Interface interactions (buttons, cards)

  • Custom events

Start of Main flow: all relevant events should be connected with follow-up nodes

The entry point of a flow activates each time any of these inputs occurs, making your virtual being responsive to all types of interactions.

All relevant events branching out of the Start node should be connected with follow-up nodes in order to be handled predictably.
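As a rough illustration only, the TypeScript sketch below shows one way this dispatch could be modeled in code. The event names, the FlowEvent type, and the handleEvent function are assumptions made for this example and are not part of the platform's API; in practice you connect these branches visually in the flow editor.

```typescript
// Hypothetical sketch: how a Start node might route incoming events
// to follow-up nodes. Names (FlowEvent, handleEvent, branch labels)
// are illustrative, not the platform's actual schema.

type FlowEvent =
  | { kind: "user_message"; text: string }           // text or transcribed voice input
  | { kind: "conversation_start" }                   // system signal
  | { kind: "conversation_end" }                     // system signal
  | { kind: "ui_interaction"; elementId: string }    // button or card click
  | { kind: "custom"; name: string; payload?: unknown };

// Each relevant event branching out of the Start node is routed
// to a dedicated follow-up node so it is handled predictably.
function handleEvent(event: FlowEvent): string {
  switch (event.kind) {
    case "user_message":
      return `Route "${event.text}" to the main response branch`;
    case "conversation_start":
      return "Route to the greeting branch";
    case "conversation_end":
      return "Route to the farewell / cleanup branch";
    case "ui_interaction":
      return `Route element "${event.elementId}" to its dedicated branch`;
    case "custom":
      return `Route custom event "${event.name}" to its handler branch`;
  }
}

// Example: a conversation-start signal triggers the greeting branch.
console.log(handleEvent({ kind: "conversation_start" }));
```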

Conversation Context

Your virtual being maintains context throughout the conversation automatically. This means it remembers:

  • Previous messages

  • User information

  • Conversation history

  • Current conversation state

This context is automatically included in LLM prompts, enabling coherent, contextual conversations without additional programming.
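To make the idea concrete, here is a minimal TypeScript sketch of how such context could be folded into an LLM prompt. The type names and the buildPrompt helper are hypothetical and exist only for illustration; the platform performs this step for you automatically.

```typescript
// Hypothetical sketch of assembling conversation context into an LLM prompt.
// The shapes (ConversationContext, buildPrompt) are assumptions, not the
// platform's internal implementation.

interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ConversationContext {
  userInfo: Record<string, string>;  // e.g. name, language preference
  history: Message[];                // previous messages in the conversation
  state: string;                     // current conversation state
}

function buildPrompt(context: ConversationContext, userInput: string): Message[] {
  const profile = Object.entries(context.userInfo)
    .map(([key, value]) => `${key}: ${value}`)
    .join(", ");

  return [
    { role: "system", content: `Known user info: ${profile}. State: ${context.state}.` },
    ...context.history,              // prior turns keep responses coherent
    { role: "user", content: userInput },
  ];
}

// Example: the LLM sees the user's profile, the full history, and the new input.
const prompt = buildPrompt(
  { userInfo: { name: "Alex" }, history: [], state: "greeting" },
  "What can you do?"
);
console.log(prompt);
```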

Nodes

Conversation flows are built from blocks called "Nodes". Their functionality ranges from delivering predefined text responses and LLM-generated responses to complex conditional logic, external integrations, and webhooks.

A conversation flow is built with nodes of different types
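As a conceptual sketch (not the platform's actual node schema), the main node categories could be modeled in TypeScript like this; the field names and the example flow are assumptions for illustration.

```typescript
// Hypothetical model of the node types a flow can be built from.
// The union and its fields are illustrative only.

type Node =
  | { type: "text"; message: string }                          // predefined text response
  | { type: "llm"; prompt: string }                            // LLM-generated response
  | { type: "condition"; expression: string;                   // conditional branching
      onTrue: string; onFalse: string }                        // ids of follow-up nodes
  | { type: "webhook"; url: string; method: "GET" | "POST" };  // external integration

// A tiny example flow: greet, answer with the LLM, then notify an external service.
const exampleFlow: Node[] = [
  { type: "text", message: "Hi! How can I help you today?" },
  { type: "llm", prompt: "Answer the user's question using the conversation context." },
  { type: "webhook", url: "https://example.com/notify", method: "POST" },
];
```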


Read more about the basic concepts of building a conversation flow:

Getting started with Conversation Flows

Or learn how and when to use each node type:

Nodes
