## How It Works
- A message arrives at a bot
- Rule-based routes are checked first
- If no rule matches, the LLM receives the message text, sender info, and a list of all bots with their descriptions
- The LLM returns a routing decision: `{target_bot_id, target_chat_id, action, reason}`
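The flow above can be sketched as a single routing call. This is an illustrative outline, not the project's actual code: the function names, prompt layout, and bot-list shape are assumptions; only the decision fields (`target_bot_id`, `target_chat_id`, `action`, `reason`) come from the contract above.

```python
import json

def route_message(llm_client, message_text, sender, bots):
    """Ask the LLM to pick a target for a message no rule matched.

    llm_client is any callable that takes a prompt string and returns
    the model's text completion (hypothetical interface).
    """
    bot_list = "\n".join(f"- {b['id']}: {b['description']}" for b in bots)
    prompt = (
        f"Message: {message_text}\n"
        f"Sender: {sender}\n"
        f"Available bots:\n{bot_list}\n"
        "Reply with one JSON object containing: "
        "target_bot_id, target_chat_id, action, reason"
    )
    decision = json.loads(llm_client(prompt))
    # Validate the routing contract before acting on the decision
    required = {"target_bot_id", "target_chat_id", "action", "reason"}
    if not required <= decision.keys():
        raise ValueError(f"missing fields: {required - decision.keys()}")
    return decision
```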
## Supported APIs
Any OpenAI-compatible endpoint works:

- OpenAI
- Ollama
- LM Studio
- Any compatible API
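"OpenAI-compatible" means the server exposes the standard `/chat/completions` route, so the same request works against OpenAI, Ollama, or LM Studio by changing only the base URL. A minimal stdlib sketch of building such a request (the example URL and key are placeholders):

```python
import json
import urllib.request

def build_chat_request(api_url, api_key, model, messages):
    """Build a POST request for an OpenAI-compatible chat endpoint.

    Send it with urllib.request.urlopen(req); shown unsent here.
    """
    return urllib.request.Request(
        f"{api_url.rstrip('/')}/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode(),
        headers={
            "Content-Type": "application/json",
            # Local servers like Ollama typically ignore the key,
            # but the header keeps the request uniform across backends
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
```

The same function covers all backends in the list above; only `api_url` changes (e.g. `https://api.openai.com/v1` vs. a local server's address).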
## Configuration
Configure via the web UI at LLM Config:

| Field | Description |
|---|---|
| API URL | OpenAI-compatible endpoint URL |
| API Key | Authentication key |
| Model | Model name (e.g., gpt-4o, llama3) |
| System Prompt | Custom system prompt for routing decisions |
| Enabled | Toggle on/off |
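As a concrete illustration, the fields above might be filled in for a local Ollama setup as follows. The values and the dict representation are examples only; the web UI's actual storage format is not specified here.

```python
# Example values for the LLM Config fields (local Ollama backend)
llm_config = {
    "api_url": "http://localhost:11434/v1",  # Ollama's OpenAI-compatible base URL
    "api_key": "ollama",                     # placeholder; local servers often ignore it
    "model": "llama3",
    "system_prompt": "You are a message router. Reply only with a JSON routing decision.",
    "enabled": True,
}
```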
## Bot Descriptions
Each bot can have a description explaining what it does. These descriptions are sent to the LLM to help it make better routing decisions. Set descriptions via the web UI or API (`/api/bots/description`).
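A sketch of setting a description programmatically. The endpoint path comes from the text above, but the payload field names (`bot_id`, `description`), the HTTP method, and the base URL are assumptions; check the API for the actual request shape.

```python
import json
import urllib.request

def build_description_request(base_url, bot_id, description):
    """Build a POST to /api/bots/description (payload shape assumed).

    Send with urllib.request.urlopen(req); shown unsent here.
    """
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/api/bots/description",
        data=json.dumps({"bot_id": bot_id, "description": description}).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```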