Overview
AI-powered sub-workflow that answers questions about your infrastructure configuration directly in a Mattermost channel or thread
Requirements
- OpenRouter/OpenAI/Anthropic API key
- Google Gemini API key — for embeddings
- Jira API credentials — Cloud or Server
- Mattermost API credentials — to post the reply back to the channel
- Qdrant instance
- Remote MCP servers (see MCP section)
- A sub-workflow that analyses attachments
- A parent workflow that triggers this one via "Execute Workflow" with a properly shaped payload
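The parent workflow must call this one with a consistently shaped item. A minimal sketch of that payload, assuming the field names `channel_id`, `thread_id`, `message`, and `file_ids` (all illustrative — match them to whatever your parent workflow actually emits):

```python
def build_payload(channel_id, message, file_ids=None, thread_id=None):
    """Hypothetical item the parent workflow passes via "Execute Workflow".
    Field names are assumptions, not a fixed contract."""
    return {
        "channel_id": channel_id,    # Mattermost channel to reply in
        "thread_id": thread_id,      # root post id when replying in a thread
        "message": message,          # the user's question
        "file_ids": file_ids or [],  # attachments handed to attachmentsAnalyzer
    }

payload = build_payload("abc123", "Why does nginx keep restarting?", ["f1"])
```

Keeping this shape in one place makes it easier to evolve the contract between the parent router and this sub-workflow.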
How it works
- The workflow is triggered by another workflow via "Execute Workflow"
- ReadIncidentContext logs the request and forwards the payload
- attachmentsAnalyzer invokes a vision sub-workflow with the file_ids
- SetVars sets workflow-level constants
- The AI agent generates a response based on the system prompt, the knowledge base, and its access to repositories
- The agent's response is posted back to Mattermost via the "Post a message" node
How to use
- Upload your infrastructure documentation (Markdown, YAML, runbooks) into Qdrant
- Import the attachmentsAnalyzer sub-workflow and update the workflow reference inside
- Deploy or point to your MCP servers
- Configure credentials
- Edit the AI Agent system message to describe your infrastructure
- In SetVars, replace the example values with your own
- Wire this workflow up to an upstream router
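For the first step, documents go into Qdrant as points (id, embedding vector, payload). A minimal sketch of the upsert body for Qdrant's REST endpoint `PUT /collections/{collection}/points`, assuming you have already produced an embedding per chunk (e.g. with the Gemini embedding model configured above); collection name and point shape are illustrative:

```python
import json
from urllib import request

def build_upsert_body(points):
    """points: iterable of (id, vector, text) tuples -> Qdrant upsert body."""
    return {
        "points": [
            {"id": pid, "vector": vec, "payload": {"text": text}}
            for pid, vec, text in points
        ]
    }

def upsert(base_url, collection, points, api_key=None):
    """Upsert chunks into Qdrant via PUT /collections/{collection}/points."""
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["api-key"] = api_key
    req = request.Request(
        f"{base_url}/collections/{collection}/points?wait=true",
        data=json.dumps(build_upsert_body(points)).encode(),
        headers=headers,
        method="PUT",
    )
    return request.urlopen(req, timeout=30)
```

The payload's `text` field is what the agent's retrieval step will surface, so store the chunk verbatim there alongside any metadata (source file, heading) you want to cite.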