Intent Matcher

Leverages LLM understanding to create natural, context-aware routing without needing exact keyword matches, making conversations more flexible and human-like.

The Intent Matcher node uses LLM capabilities to understand the meaning behind user inputs and route conversations accordingly. When a user message arrives, this node analyzes it against defined intents and directs the flow based on the best match.
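
Conceptually, the matching step works like asking the LLM to pick the best-fitting intent from your definitions. The sketch below only illustrates that idea and is not Virbe's implementation: `Intent`, `classify_intent`, and the `complete` helper are hypothetical names, and the LLM call is a placeholder for whichever provider you configure.

```python
from dataclasses import dataclass

@dataclass
class Intent:
    name: str     # e.g. "contact-team"
    trigger: str  # description of when this intent should fire


def complete(prompt: str) -> str:
    """Placeholder for a real LLM call to whatever model the node is configured with."""
    raise NotImplementedError


def classify_intent(message: str, intents: list[Intent], default: str = "default") -> str:
    """Ask the LLM which defined intent best matches the user message.

    Returns the matched intent name, or `default` when nothing fits,
    mirroring the node's fallback path for unmatched intents.
    """
    catalog = "\n".join(f"- {i.name}: {i.trigger}" for i in intents)
    prompt = (
        "You are an intent classifier.\n"
        f"Known intents:\n{catalog}\n\n"
        f"User message: {message!r}\n"
        f"Reply with exactly one intent name from the list, or '{default}' if none apply."
    )
    answer = complete(prompt).strip().lower()
    valid = {i.name for i in intents} | {default}
    return answer if answer in valid else default
```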

Setting up intent matching (a configuration sketch follows these steps):

  1. Select the LLM model to use for intent analysis

  2. Define intent names (e.g., "contact-team", "request-pricing")

  3. Describe the trigger situations that should activate each intent

  4. Create a separate path for each intent

  5. Set up a default path for unmatched intents
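
To make the five steps above more concrete, the sketch below shows one possible shape for such a configuration. The field names (`model`, `intents`, `default_path`) and flow names are illustrative assumptions, not Virbe's actual schema.

```python
# Illustrative shape only; field names and flow names are assumptions,
# not Virbe's actual configuration schema.
intent_matcher_config = {
    "model": "your-llm-of-choice",  # step 1: LLM used for intent analysis
    "intents": [
        {
            "name": "contact-team",                             # step 2: intent name
            "trigger": "When the user wants to reach a human",  # step 3: trigger description
            "path": "handoff-flow",                             # step 4: branch taken on a match
        },
        {
            "name": "request-pricing",
            "trigger": "When the user asks about plans, costs, or quotes",
            "path": "pricing-flow",
        },
    ],
    "default_path": "general-conversation",  # step 5: fallback for unmatched intents
}
```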

Example: A virtual being needs to distinguish between product inquiries and support requests (a routing sketch follows the definitions):

Intent: product-inquiry
Trigger: When user asks about products, features, or pricing

Intent: technical-support
Trigger: When user mentions problems, errors, or needs help

Default: General conversation handling
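
Reusing the hypothetical `Intent` and `classify_intent` names from the earlier sketch, this example could be wired up roughly as follows; the sample message and flow names are made up for illustration.

```python
intents = [
    Intent("product-inquiry", "When user asks about products, features, or pricing"),
    Intent("technical-support", "When user mentions problems, errors, or needs help"),
]

# A message about a malfunction should match "technical-support";
# anything off-topic falls through to the default path.
matched = classify_intent("My kiosk avatar freezes after the greeting", intents)

route_to = {
    "product-inquiry": "product-flow",
    "technical-support": "support-flow",
}.get(matched, "general-conversation")
```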

Common use cases:

  • Route to specific knowledge base sections

  • Direct to appropriate support flows

  • Identify user goals and intentions

  • Categorize queries by type

  • Handle multiple related intents

Important considerations (a small testing sketch follows this list):

  • Keep trigger descriptions clear and specific

  • Test with various phrasings of similar intentions

  • Consider common user language and expressions

  • Strike a balance between overly broad and overly specific triggers

  • Monitor intent matching accuracy

  • Maintain a sensible default path for unmatched intents
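
Because several of these points come down to testing phrasings and watching accuracy, a tiny harness can help. The sketch below again reuses the hypothetical `classify_intent` and `Intent` from above; the test utterances and expected intents are invented for illustration.

```python
# Hypothetical phrasing tests: each utterance maps to the intent we expect.
test_cases = {
    "How much does the metahuman kiosk cost?": "product-inquiry",
    "The widget shows an error when I open it": "technical-support",
    "What's the weather like today?": "default",
}

def measure_accuracy(intents: list[Intent]) -> float:
    """Run every test utterance through the classifier and report the hit rate."""
    hits = 0
    for utterance, expected in test_cases.items():
        got = classify_intent(utterance, intents)
        status = "OK  " if got == expected else "MISS"
        print(f"{status} {utterance!r} -> {got} (expected {expected})")
        hits += got == expected
    return hits / len(test_cases)
```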

Example of the Intent Matcher node in use