Plugins
Extend your agent with LLM providers, analytics, and customer support tools
Plugins add capabilities to your agent that go beyond what workflows alone can do. Connect an LLM provider to power AI nodes, enable session analysis in Observatory, or hand conversations to human agents through Helvia LiveChat or third-party platforms like Zendesk.
Each plugin belongs to a category, connects to a Workspace-level integration, and can be activated or deactivated per agent without affecting other agents.

How Plugins Work
Plugins sit between your agent and external services. The relationship flows like this:
You configure an integration at Workspace > Integrations with the provider's API key or credentials
You activate the plugin on a specific agent and link it to an integration
The agent gains access to the capability (e.g., LLM processing, live chat routing, ticket creation)
Integrations are shared across all agents in the Workspace. Plugins are configured per agent. This means two agents can use the same integration but with different plugin settings.
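The shared-integration, per-agent-settings relationship can be pictured as a small data model. This is a hypothetical sketch for illustration, not the platform's API; the class and field names are invented.

```python
from dataclasses import dataclass, field

@dataclass
class Integration:
    """Workspace-level: one set of credentials, shared by all agents."""
    name: str
    provider: str
    api_key: str

@dataclass
class PluginConfig:
    """Agent-level: links an agent to a shared integration,
    with per-agent settings layered on top."""
    integration: Integration
    settings: dict = field(default_factory=dict)

# Two agents share the same Workspace integration ...
openai_integration = Integration("openai-prod", "OpenAI", "sk-...")
# ... but each agent configures the plugin differently.
agent_a = PluginConfig(openai_integration, {"model": "gpt-4o"})
agent_b = PluginConfig(openai_integration, {"model": "gpt-4o-mini"})
```

Deactivating the plugin on one agent touches only that agent's `PluginConfig`; the Workspace integration, and every other agent using it, is unaffected.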
Helvia LiveChat is the only plugin that does not need an integration. It works out of the box with no setup required.
Available Plugins
Plugins are organized into three groups based on where they apply in the platform.
Designer
These plugins enable nodes and features you use when building workflows on the canvas.
LLM node
OpenAI, Azure, Gemini
Powers the LLM node for natural language processing and complex business logic
Semantic Search node
OpenAI, Azure
Enables the Semantic Search node for Knowledge Base retrieval (RAG)
Language Detection
OpenAI, Azure
Automatically detects the user's input language during conversations
Observatory
These plugins power analytics and testing features in Observatory.
Session Analysis
OpenAI, Azure, Gemini
Generates summaries, sentiment scores, and insights for chat sessions
Topic Modelling
OpenAI
Groups missed questions into topics for pattern discovery
Automated Agent Testing
OpenAI
Runs automated test scenarios against your agent workflows
Customer Support
These plugins connect your agent to external customer support platforms.
LiveChat
Helvia LiveChat, Cisco, Zendesk Livechat, Genesys
Routes conversations to human agents for real-time support
CRM
Dynamics 365
Syncs customer data from Microsoft Dynamics 365 CRM
Ticketing
Zendesk Ticketing
Creates and syncs support tickets with your Zendesk account
Activating a Plugin
Select an Integration
In the settings dialog, choose an integration from the Select Integration dropdown. This links the plugin to the credentials configured in Workspace > Integrations.
If no integrations appear in the dropdown, you must first create one in Workspace > Integrations for the selected provider.
One Provider Per Category
Only one provider can be active per category at a time. When a provider is already active, the Activate buttons for all other providers in that category are greyed out. To switch providers, Deactivate the current one first, then activate the new one.
The LLM node category is the exception. You can activate multiple LLM providers simultaneously (e.g., OpenAI and Gemini). When configuring an LLM node on the canvas, you select which plugin and model to use per node, giving you flexibility to mix providers across different parts of your workflow.
Managing Plugins
Click Settings on any active plugin card to open its configuration dialog. Here you can change the linked integration without deactivating the plugin. Every plugin has a dedicated settings dialog with its own configuration options.
Some plugins like Language Detection and Session Analysis also include an Expert Mode toggle. Enabling Expert Mode expands the dialog with advanced options such as custom prompts and model selection.
Deactivating a Plugin
Click Deactivate on the plugin card. The plugin stops immediately and the agent loses the associated capability.
Deactivating an LLM plugin disables all nodes in your workflows that depend on it. Verify your workflows still function correctly after deactivation.
Designer Plugins
LLM Node
The LLM node plugin connects large language models to your workflows. Unlike other categories, you can activate multiple LLM providers at the same time (e.g., OpenAI and Gemini). Each LLM node on the canvas lets you pick which plugin and model to use, so you can mix providers across different parts of a single workflow.

The available providers are OpenAI, Azure, and Gemini. You can also connect any OpenAI-compatible provider (e.g., Mistral, Groq) through an OpenAI integration. Learn more in Integrations.
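Any provider that exposes the OpenAI-compatible chat completions wire format can be reached this way. As a rough sketch of that request shape (the base URL, key, and model name below are placeholders you would take from your own integration, not platform defaults):

```python
import json

def build_chat_request(base_url: str, api_key: str, model: str, messages: list):
    """Assemble an OpenAI-compatible chat completions request.

    Works for any provider implementing the /chat/completions wire
    format (OpenAI, Mistral, Groq, ...). Illustrative sketch only.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    payload = json.dumps({"model": model, "messages": messages})
    return url, headers, payload

# Pointing the same request shape at a non-OpenAI endpoint
url, headers, body = build_chat_request(
    "https://api.mistral.ai/v1",         # base URL from the integration
    "sk-...",                            # API key from the integration
    "mistral-small-latest",              # model name is provider-specific
    [{"role": "user", "content": "Hello"}],
)
```

Because the wire format is identical, swapping providers is a matter of changing the base URL, key, and model name in the integration.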
Semantic Search Node
Activating this plugin enables the Semantic Search node on the canvas. The node performs retrieval queries over the Knowledge Bases connected to your agent, returning the most relevant chunks to feed into your workflow (RAG).
In the settings you can fine-tune RAG retrieval parameters. For most use cases, the default settings work well. Adjust Visit Neighbors and Exact Match only if you need to tune the trade-off between search accuracy and response speed.
Pipeline ID and Access Token are both read-only and auto-generated by the platform.
Language Detection
The Language Detection plugin automatically identifies the language of every user message and updates the UserInfo.language contact variable with the detected value. Use it to route multilingual conversations or switch the agent's response language dynamically.
In Basic Mode, select a Model from the dropdown. The plugin uses a platform-managed prompt optimized for language detection. No further configuration is needed.
Toggle Expert Mode on to access advanced settings:
Model: Select the LLM model used for detection
Prompt: A rich text editor with the full system prompt. The default prompt instructs the model to return an ISO 639 language code (e.g., el, en, es) and handle edge cases like ambiguous input or initialisms
Include History: When enabled, previous user messages are included in the detection request for improved accuracy. Always counts at least the last message
Use Expert Mode when the default detection prompt does not handle your specific language mix or edge cases well enough.
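The plugin's own prompt and parsing are platform-managed, but the contract it describes, where the model returns a bare ISO 639 code that ends up in UserInfo.language, can be sketched like this (the normalization logic is illustrative, not the platform's actual code):

```python
def normalize_language_code(model_reply: str, fallback: str = "en") -> str:
    """Reduce a raw model reply to a two-letter ISO 639-1 code.

    Illustrative only. Handles replies like "el", " EL\n", or
    "el (Greek)", and falls back when the reply is unusable.
    """
    code = model_reply.strip().lower()
    # Keep only the leading alphabetic run, e.g. "el (Greek)" -> "el"
    letters = ""
    for ch in code:
        if ch.isalpha():
            letters += ch
        else:
            break
    return letters if len(letters) == 2 else fallback

# The detected value is then written to the UserInfo.language variable
```

A custom Expert Mode prompt should keep this contract tight: ask for the bare code only, so downstream parsing never has to guess.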
Observatory Plugins
Session Analysis
This plugin uses an LLM to analyze chat sessions and generate structured insights. To view or trigger analysis, go to Observatory > Sessions > Chat Sessions, open a session, and scroll down to the Session Analysis section. Use Expert Mode for better control over the generated insights.

By default, the plugin generates five insights per session.
Toggle Expert Mode on to customize which insights are generated or how they are produced. The prompt must instruct the LLM to return a JSON object with a separate field for each insight. For example:
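The field names below are illustrative, not the platform's fixed schema; use the names your own prompt defines, one field per insight:

```python
import json

# Illustrative shape of what the prompt asks the LLM to return.
# Field names here are examples, not the platform's fixed schema.
sample_reply = """{
  "summary": "User asked about order tracking and received a resolution.",
  "sentiment": "positive",
  "topics": ["orders", "tracking"],
  "resolved": true,
  "language": "en"
}"""

insights = json.loads(sample_reply)
assert isinstance(insights, dict)                  # a single JSON object ...
assert all(isinstance(k, str) for k in insights)   # ... with a field per insight
```

If the model's reply does not parse as a single JSON object with these fields, the insights cannot be displayed, so keep the output-format instruction explicit in the prompt.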
The advanced LLM configuration includes:
Model: Select the LLM model
Prompt: Rich text editor with the full system prompt, including the JSON output format for all five insights or any additional ones
Temperature: Controls response randomness (0-2, default 0.5)
Max Tokens: Limits the response length (default 128)
You can also add multiple LLM configurations to the same plugin. Click + Add LLM to append another step to the chain.
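Chained configurations run as sequential steps. Conceptually it might look like the sketch below; this is a hypothetical illustration of sequential execution, not the platform's actual engine, and whether later steps see earlier outputs is an assumption here.

```python
def run_chain(session_text: str, steps):
    """Run a chain of LLM configurations sequentially.

    Each step is a callable standing in for one configured LLM call
    (model + prompt + temperature + max tokens). In this sketch each
    step receives the session plus all earlier results (assumption).
    """
    results = []
    for step in steps:
        results.append(step(session_text, results))
    return results

# Stand-ins for two configured LLM steps
summarize = lambda text, prior: f"summary of {len(text)} chars"
classify = lambda text, prior: f"classified using {len(prior)} earlier result(s)"

run_chain("hello world", [summarize, classify])
```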
For a full walkthrough of Session Analysis, see the Chat Sessions page.
Topic Modelling
This plugin automatically groups missed questions into topics in the background. Use it to spot recurring knowledge gaps and prioritize which content to add to your agent.
Missed questions are collected by the Missed Question node in your workflows and recorded in Observatory. Once this plugin is active, the LLM analyzes accumulated missed questions and clusters them into topics. Results appear in Observatory > Sessions > Missed Questions.
Automated Agent Testing
Automated testing is one of the most advanced features of the platform and a necessary part of the agent development lifecycle. This plugin lets you create test scenarios that verify your agent's responses are consistent, accurate, and reliable before going live.

Activate the plugin and select an LLM integration. The platform automatically creates a dedicated API deployment to run your tests. You can then create, run, and review tests in Observatory > Testing, with pass/fail outcomes, detailed logs, and individual session performance insights.
For a full walkthrough of creating and running tests, see the Automated Testing page.
Customer Support Plugins
LiveChat
LiveChat plugins enable human-in-the-loop handoff. Transfers are not automatic; you control when a conversation is handed off by placing a LiveChat node in your workflow. When the node is reached, the conversation is transferred to a human agent for real-time support. Use this for complex or sensitive scenarios that require human judgement.
Four providers are available: Helvia LiveChat (our own built-in solution) and three third-party integrations (Cisco, Zendesk, Genesys). Helvia LiveChat offers the most flexibility and customization options, while the third-party providers let you route conversations to external support platforms your team already uses.
The LiveChat plugin settings include:
LiveChat Availability
Toggle LiveChat on or off for this agent. When disabled, all handoff requests are rejected.
Agent Masking
Control how agent names appear to end-users during LiveChat sessions. Available modes:
Full Name: Shows the agent's full real name (e.g., John Joe Doe)
First Name + Last Initial: Partial privacy (e.g., John D.)
First Name Only: Friendly and approachable (e.g., John)
Constant Name: Full anonymity (e.g., Agent)
Advanced Masking: Define custom rules using RegEx
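As an illustration of the kind of transformation an Advanced Masking rule can express (the exact RegEx flavor and rule format the platform accepts is not specified here; this Python sketch only demonstrates the substitution):

```python
import re

def mask_first_name_last_initial(full_name: str) -> str:
    """Reduce "John Joe Doe" to "John D.", the same result the built-in
    First Name + Last Initial mode produces, written as a RegEx rule."""
    # Capture the first word and the first letter of the last word,
    # discarding any middle names.
    return re.sub(r"^(\S+)(?:\s+\S+)*\s+(\S)\S*$", r"\1 \2.", full_name)

mask_first_name_last_initial("John Joe Doe")   # -> "John D."
```

Custom rules are useful when agent names do not fit the First/Last pattern, for example names with honorifics or multi-part surnames.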
Request Timeout
Set the number of seconds a LiveChat request stays pending before it expires. Available only in Helvia LiveChat.
Business Hours
Configure time slots during which LiveChat is available. The timezone is inherited from the Workspace settings. Available only in Helvia LiveChat.
Queue Configuration
Set two values to estimate waiting time for end-users:
Average LiveChat Agent Response Time (seconds) — How long agents typically take to respond
Average End-User Waiting Time (seconds) — Estimated wait based on queue position
Available only in Helvia LiveChat.
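The documentation does not specify how these two values combine into the estimate shown to end-users; a naive estimate built from them might look like the following, which is purely illustrative and not the platform's actual formula:

```python
def estimated_wait_seconds(queue_position: int,
                           avg_agent_response_time: float,
                           avg_end_user_wait: float) -> float:
    """One plausible way the two configured values could combine into
    a wait estimate. Illustrative only, not the platform's formula."""
    # Everyone ahead of you costs roughly one agent response cycle,
    # on top of the configured baseline wait.
    return avg_end_user_wait + queue_position * avg_agent_response_time

estimated_wait_seconds(3, 45, 60)
```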
System Messages
Customize the messages sent to end-users during LiveChat events. Each message is configurable per language. Available system messages:
Conversation in progress
There is an active live-chat session at the moment.
Conversation terminated
Live-chat ended
Conversation transferred
Live-chat transferred
Generic error
An error occurred. Please, try again later.
LiveChat disabled
Live-chat is not available right now. Please, try again later.
Out of business hours
Live-chat support is currently out of business hours.
Request accepted
Live-chat started
Request already exists
Your live-chat request is currently pending and messages are not sent during this time. A live-chat agent will be with you shortly to respond to your inquiries.
Request missed
There is no agent available right now. Please, try again later.
CRM
Connect your agent and LiveChat to your CRM so conversations have full customer context. When a user interacts with the agent, the plugin pulls existing customer records, giving the agent and LiveChat operators access to relevant data without switching tools.
Ticketing
Create and manage support tickets from within agent conversations and LiveChat sessions. When a conversation requires follow-up beyond the chat session, the agent can create a ticket that syncs with your external ticketing system.
Best Practices
Start with one LLM provider: Activate a single LLM plugin (e.g., OpenAI) across all LLM-dependent categories before experimenting with others
Match the provider to the task: Use the same provider for LLM node and Session Analysis to keep costs predictable and responses consistent
Set business hours early: Configure LiveChat business hours before going live to avoid routing requests when no human agents are online
Use descriptive integration names: Name integrations in Workspace so the dropdown in plugin settings is clear
Test after switching providers: After changing a plugin's integration, run a test conversation to verify the agent responds correctly
You now know how plugins are structured, how to activate and configure them, and how they connect to Workspace integrations. Activate your first plugin and start building.