Getting started with Conversation Flows

Conversation flows can be as simple or as complex as your use case requires. We recommend starting simple and adding complexity as needed.


Creating conversation flows might seem daunting at first, but the Virbe platform is designed to let them be as simple or as sophisticated as you need. Think of it like building with blocks: you can start with just two pieces to create a functional conversation, then add more blocks to create increasingly complex and engaging interactions.

Whether you're building a simple customer service bot or a sophisticated virtual assistant, the same basic principles apply.

The beauty of the system is that while a basic flow can handle natural conversations right out of the box, you have the power to add logic, branching paths, and integrations as your needs grow.

Minimal Viable Flow

The simplest functional flow consists of just two elements:

  1. Start node

  2. LLM Response node

This basic setup is powerful enough to handle open-ended conversations because:

  • Each user input triggers the flow

  • Conversation context is maintained automatically

  • The LLM generates contextually appropriate responses

Add specific system instructions to the LLM Response node to give the virtual being general guidance on how it should approach the conversation with the end user. The node's detailed form also accepts additional context. Note that the System instruction is the highest-priority part of the LLM prompt; additional context carries less weight and should be reserved for less important elements of the prompt.
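
To make the structure concrete, here is a minimal sketch of such a flow expressed as data. This is purely illustrative: the node types and field names (`systemInstruction`, `additionalContext`) are assumptions made for the example, not Virbe's actual export format or node schema.

```typescript
// Hypothetical sketch only: shows the shape of a minimal flow,
// not Virbe's actual export format or node schema.
type MinimalFlowNode =
  | { id: string; type: "start"; next: string }
  | {
      id: string;
      type: "llmResponse";
      systemInstruction: string;  // highest-priority part of the prompt
      additionalContext?: string; // lower weight, for secondary guidance
    };

const minimalFlow: MinimalFlowNode[] = [
  { id: "start", type: "start", next: "reply" },
  {
    id: "reply",
    type: "llmResponse",
    systemInstruction:
      "You are a friendly virtual concierge. Keep answers short and helpful.",
    additionalContext:
      "If the user asks about pricing, point them to the front desk.",
  },
];
```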

Building More Complex Flows

Flow Control Mechanisms

As you build more sophisticated flows, two key mechanisms help control conversation progression:

  1. Wait for User Input
  This toggle appears in many nodes and determines whether the flow should:

  • Wait for the user's response before proceeding (toggle on)

  • Continue immediately to the next node (toggle off)

  2. Checkpoints
  Checkpoints allow you to:

  • Save the conversation state at specific points

  • Return to these points upon the next incoming event or input

  • Handle complex conversation branches

Both mechanisms are sketched in the example after this list.
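
Below is a rough sketch of how a flow runner could combine these two mechanisms. The `RunnerNode` interface, its field names, and the `advance` function are assumptions made for illustration; they do not describe Virbe's internal engine.

```typescript
// Hypothetical sketch of a flow runner honoring both mechanisms;
// field names are illustrative, not Virbe's actual engine.
interface RunnerNode {
  id: string;
  waitForUserInput: boolean; // the toggle described above
  isCheckpoint: boolean;     // Checkpoint nodes update the resume position
  next?: string;
  run(input?: string): void;
}

interface FlowState {
  resumeAt: string; // where the next incoming event re-enters the flow
}

function advance(nodes: Map<string, RunnerNode>, state: FlowState, userInput?: string): void {
  let current = nodes.get(state.resumeAt);
  while (current) {
    current.run(userInput);
    userInput = undefined; // only the first node consumes the incoming input
    if (current.isCheckpoint) {
      state.resumeAt = current.id; // the next trigger resumes here, not at Start
    }
    if (current.waitForUserInput) {
      break; // pause the flow until the user replies
    }
    current = current.next ? nodes.get(current.next) : undefined;
  }
}
```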

Example: Guided Conversation Flow

You can use Checkpoint nodes to create a new "starting point" for the conversation, so that each incoming trigger doesn't take the user through the same path from the very beginning. For example, you can use them to create logical branching based on the user's needs:

Start 
  → Intent Matcher (understand user need)
    → Checkpoint (save the status)
      → LLM Response (provide personalized response for user's needs)
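
Expressed as data, the same guided flow might look roughly like the sketch below. The intent names, node types, and fields are assumptions for illustration only, not the platform's actual schema.

```typescript
// Hypothetical sketch of the guided flow above; node types, intents,
// and fields are illustrative only.
const guidedFlow = {
  start: { type: "start", next: "intent" },
  intent: {
    type: "intentMatcher",
    branches: {
      makeBooking: "bookingCheckpoint", // e.g. user wants to book something
      askQuestion: "faqCheckpoint",     // e.g. user has a general question
    },
  },
  bookingCheckpoint: { type: "checkpoint", next: "bookingReply" },
  faqCheckpoint: { type: "checkpoint", next: "faqReply" },
  bookingReply: {
    type: "llmResponse",
    systemInstruction: "Guide the user through completing their booking.",
  },
  faqReply: {
    type: "llmResponse",
    systemInstruction: "Answer the user's question using the knowledge base.",
  },
};
```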

For more complex flows, consider branching the Main flow out into separate flows that handle different logic-based scenarios, rather than relying on Checkpoints. To do that, create the additional flows and then use a "Go to Flow" node to connect the relevant final node of your flow to each separate flow.
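
A sketch of that pattern, with hypothetical flow and node names, could look like this:

```typescript
// Hypothetical sketch: separate flows per scenario, linked with "Go to Flow"
// nodes instead of Checkpoints. Flow and node names are illustrative.
const mainFlow = {
  start: { type: "start", next: "intent" },
  intent: {
    type: "intentMatcher",
    branches: { support: "toSupport", sales: "toSales" },
  },
  toSupport: { type: "goToFlow", targetFlow: "Support flow" },
  toSales: { type: "goToFlow", targetFlow: "Sales flow" },
};

// Each target flow starts with its own Start node and handles one scenario
// end to end, keeping the Main flow small and readable.
```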

Screenshots referenced on this page:

  • Minimal Viable flow: Start + LLM Response node

  • LLM Response node details

  • Cards from Table: example of a node with the "Wait for user input" option

  • Checkpoint node

  • Complex conversation branching based on Intent Matching and Checkpoints