
Excalibase Workflow Engine

In active development. The core engine is functional, with 83 executors, a full REST API, and Kubernetes deployment. The API surface and YAML schema may still change.

A self-hosted workflow orchestration engine built with Go. Define workflows in YAML, execute tasks via a NATS-backed queue, run scripts in isolated containers, chain AI agents with MCP servers, and trigger runs via webhooks or cron schedules.

How It Works

REST API Layer (/api/v1)
  POST /api/v1/flows/:id/execute
  GET  /api/v1/executions/:id

        ↓
  Execution Engine
  - Creates Execution record
  - Creates TaskRun rows (PENDING / WAITING for DAG)
  - Publishes to NATS JetStream

        ↓  NATS message queue

  Workers (scale independently with --scale worker=N)
  1. Receive task from NATS
  2. Load previous task outputs
  3. Render templates ({{ outputs["taskName"]["field"] }})
  4. Execute via Executor Registry (83 built-in executors)
  5. Save outputs, unlock WAITING dependents
  6. Finalize when all tasks done

        ↓  Executor dispatches to runner

  Runners (auto-detected)
  ├── Docker Runner      → containers (local dev)
  └── Kubernetes Runner  → K8s Jobs (production)

  Executor categories (83 total)
  ├── Transforms (13)  — CSV, XML, Base64, Hash, Regex, Filter, Sort...
  ├── Databases (5)    — Postgres, MySQL, SQLite, Redis, MongoDB
  ├── Protocol (10)    — Slack, Discord, Telegram, SMTP, GraphQL, RSS...
  ├── SaaS (17)        — GitHub, Jira, Notion, Stripe, HubSpot, Linear...
  ├── Flow Control (9) — Map, Reduce, Group, Unique, Flatten, Lookup...
  ├── AI (6)           — OpenAI, Anthropic, Gemini, Ollama, Embeddings, Image
  ├── Utility (9)      — UUID, Random, JSON, URL, Template, Math, String...
  └── Core (14)        — HTTP, Shell, Python, Set, Log, If, Switch, ForEach...

Key properties:

  • Non-blocking — execute returns immediately with an executionId
  • Scalable — NATS JetStream queue, add workers with --scale worker=N
  • DAG-aware — WAITING tasks unlock as dependencies complete
  • K8s-native — auto-detects in-cluster and runs tasks as K8s Jobs (no Docker-in-Docker)
  • Output propagation — every task's output is available downstream via templates
  • Auth + RBAC — JWT authentication with viewer / executor / editor / admin roles
  • Audit logging — tracks all API actions

Quick Start

git clone https://github.com/excalibase/excalibase-workflow.git
cd excalibase-workflow

cp .env.example .env
# Add your API keys (optional — only needed for AI tasks)

docker compose up -d --build

Open the UI at http://localhost:3000. The API is at http://localhost:8080.

Three demo workflows are seeded automatically on first boot.

Task Types

83 Built-in Executors

Category       Count   Examples
Transforms     13      CSV, XML, Base64, Hash, Compress, Regex, Filter, Sort, DateTime
Databases      5       Postgres, MySQL, SQLite, Redis, MongoDB
Protocol       10      Slack, Discord, Telegram, SMTP, GraphQL, RSS, SSH
SaaS           17      GitHub, Jira, Notion, Airtable, Stripe, SendGrid, Twilio, HubSpot, Linear
Flow Control   9       Map, Reduce, Unique, Flatten, Group, Lookup, Noop, Error, Assertion
AI             6       OpenAI Chat, Anthropic Chat, Gemini Chat, Ollama Chat, Embeddings, Image
Utility        9       UUID, Random, Env, JSON, URL, HTMLStrip, Template, Math, String
Core           14      HTTP Request, Shell Script, Python Script, Set, Log, If, Switch, Wait, ForEach

Browse all executors via the API: GET /api/v1/plugins
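As a sketch of working with that endpoint's response, suppose each entry carries "type" and "category" fields (an illustrative shape — check the real response for the actual schema). Grouping executor types by category might look like:

```python
# Illustrative plugin descriptors; the real /api/v1/plugins schema may differ.
plugins = [
    {"type": "http.request", "category": "Core"},
    {"type": "ai.openai.chat", "category": "AI"},
    {"type": "saas.slack", "category": "SaaS"},
    {"type": "ai.anthropic.chat", "category": "AI"},
]

# Group executor type names by category.
by_category = {}
for p in plugins:
    by_category.setdefault(p["category"], []).append(p["type"])

print(by_category["AI"])  # ['ai.openai.chat', 'ai.anthropic.chat']
```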

Triggers

Trigger     How
Webhook     POST /api/v1/webhook/:key — call from any external system
Schedule    Cron expression in workflow YAML
Manual      POST /api/v1/flows/:id/execute with optional inputs
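Schedule triggers use standard five-field cron expressions, as in the workflow examples below. A small sketch of how the fields break down (the describe helper is hypothetical, for illustration only):

```python
# The five cron fields, in order.
FIELDS = ["minute", "hour", "day of month", "month", "day of week"]

def describe(cron: str) -> dict:
    """Map each field of a 5-field cron expression to its name."""
    values = cron.split()
    assert len(values) == 5, "schedule triggers use standard 5-field cron"
    return dict(zip(FIELDS, values))

print(describe("0 8 * * MON"))
# {'minute': '0', 'hour': '8', 'day of month': '*', 'month': '*', 'day of week': 'MON'}
```

So "0 8 * * MON" fires at minute 0, hour 8, any day of month, any month, on Mondays.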

Sample Workflows

1. New Customer → Welcome Email + Slack Alert

Triggered by a webhook when a new customer signs up. Fetches their profile, generates a personalized welcome email with AI, sends it via Resend, and pings Slack in parallel.

Workflow 1: new-customer-onboarding

Webhook (POST /webhook/new-cus…)
  → Fetch Customer (HTTP GET /customer)
      ├── Generate Email (GPT-4o) → Send Email (Resend API)
      └── Slack Alert (#team channel)
name: new-customer-onboarding

triggers:
  - type: webhook
    key: new-customer

tasks:
  - name: fetch_customer
    type: http.request
    properties:
      url: https://api.excalibase.io/api/v1/customer
      method: GET
      headers:
        Content-Type: application/json

  - name: write_email
    type: ai.openai.chat
    depends_on: [fetch_customer]
    properties:
      model: gpt-4o
      messages:
        - role: system
          content: >
            You are a friendly customer success manager at Excalibase.
            Write a short, warm welcome email (3-4 sentences).
        - role: user
          content: 'Customer data: {{ outputs["fetch_customer"]["body"] }}'

  - name: send_email
    type: http.request
    depends_on: [write_email]
    properties:
      url: https://api.resend.com/emails
      method: POST
      headers:
        Authorization: "Bearer ${RESEND_API_KEY}"
        Content-Type: application/json
      body:
        from: welcome@excalibase.io
        to: '{{ outputs["fetch_customer"]["body"]["email"] }}'
        subject: "Welcome to Excalibase!"
        text: '{{ outputs["write_email"]["text"] }}'

  # Fires in parallel with write_email (same dependency)
  - name: slack_alert
    type: saas.slack
    depends_on: [fetch_customer]
    properties:
      webhook_url: "${SLACK_WEBHOOK_URL}"
      message: ':tada: New customer signed up!'

Run it:

curl -X POST http://localhost:8080/api/v1/webhook/new-customer \
  -H "Content-Type: application/json" \
  -d '{"customerId": 42}'

2. Scheduled Weekly Film Report → Email

Runs every Monday at 8am. Fetches top-rented films and payment totals in parallel, generates an AI summary, and emails it to the team.

Workflow 2: weekly-film-report

Cron Schedule (Every Monday 8am)
  ├── Fetch Top Films (HTTP GET /film)
  └── Fetch Revenue (HTTP GET /payment)
        ↓  both complete
  AI Analysis (GPT-4o-mini) → Email Report (team@excalibase.io)
name: weekly-film-report

triggers:
  - type: schedule
    cron: "0 8 * * MON"

tasks:
  - name: fetch_top_films
    type: http.request
    properties:
      url: https://api.excalibase.io/api/v1/film
      method: GET

  - name: fetch_revenue
    type: http.request
    properties:
      url: https://api.excalibase.io/api/v1/payment
      method: GET

  - name: analyze
    type: ai.openai.chat
    depends_on: [fetch_top_films, fetch_revenue]
    properties:
      model: gpt-4o-mini
      messages:
        - role: user
          content: |
            Top films: {{ outputs["fetch_top_films"]["body"] }}
            Payments: {{ outputs["fetch_revenue"]["body"] }}
            Write a weekly report with: top 3 highlights, revenue trend, one recommendation.

  - name: send_report
    type: http.request
    depends_on: [analyze]
    properties:
      url: https://api.resend.com/emails
      method: POST
      headers:
        Authorization: "Bearer ${RESEND_API_KEY}"
        Content-Type: application/json
      body:
        from: reports@excalibase.io
        to: team@excalibase.io
        subject: "Weekly Film Rental Report"
        text: '{{ outputs["analyze"]["text"] }}'

3. Overdue Rental Alerts — AI Triage + Routing

Runs daily. Finds unreturned rentals, uses AI to classify severity, then routes the results: a gentle reminder email for mildly overdue rentals, a Slack escalation for critical ones.

Workflow 3: overdue-rental-alerts

Daily Cron (9am every day)
  → Find Overdue (HTTP GET /rental)
  → AI Classify (GPT-4o-mini)
      ├── Send Emails — mild (7-14 days)
      └── Slack Escalate — critical (14+ days)
name: overdue-rental-alerts

triggers:
  - type: schedule
    cron: "0 9 * * *"

tasks:
  - name: find_overdue
    type: http.request
    properties:
      url: https://api.excalibase.io/api/v1/rental
      method: GET

  - name: classify_and_draft
    type: ai.openai.chat
    depends_on: [find_overdue]
    properties:
      model: gpt-4o-mini
      messages:
        - role: system
          content: >
            Given overdue rentals, return JSON with two arrays:
            "critical" (over 14 days) and "mild" (7-14 days).
            Each entry: customer_email, customer_name, days_overdue, message.
        - role: user
          content: 'Overdue rentals: {{ outputs["find_overdue"]["body"] }}'

  - name: send_mild_emails
    type: script.python
    depends_on: [classify_and_draft]
    properties:
      image: python:3.11-alpine
      script: |
        import json, urllib.request, os
        data = json.loads("""{{ outputs["classify_and_draft"]["text"] }}""")
        for r in data.get("mild", []):
            req = urllib.request.Request("https://api.resend.com/emails",
                data=json.dumps({"from": "rentals@excalibase.io",
                    "to": r["customer_email"],
                    "subject": "Friendly reminder",
                    "text": r["message"]}).encode(),
                headers={"Authorization": f"Bearer {os.environ['RESEND_API_KEY']}",
                         "Content-Type": "application/json"})
            urllib.request.urlopen(req)

  - name: slack_critical
    type: saas.slack
    depends_on: [classify_and_draft]
    properties:
      webhook_url: "${SLACK_WEBHOOK_URL}"
      message: ':rotating_light: Critical overdue (14+ days) — check dashboard'

Output Chaining

Every task's output is accessible downstream via templates:

Expression                              Description
{{ outputs["task_name"]["field"] }}     Specific field from a previous task
{{ outputs["task_name"]["body"] }}      Full HTTP response body
{{ outputs["task_name"]["text"] }}      AI response text
${ENV_VAR}                              Environment variable
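Put together, a minimal two-task fragment chaining an HTTP response into a later task might look like this (the endpoint URL and the core.log type name are illustrative assumptions, not confirmed identifiers):

```yaml
tasks:
  - name: fetch_user
    type: http.request
    properties:
      url: https://api.example.com/user/42   # illustrative endpoint
      method: GET

  - name: greet
    type: core.log          # assumed type name for the Log executor
    depends_on: [fetch_user]
    properties:
      # A field from the previous task's response body, plus an env var.
      message: 'Hello {{ outputs["fetch_user"]["body"]["name"] }} from ${APP_NAME}'
```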

Deployment

Docker Compose (local dev)

docker compose up -d --build
# Scale workers:
docker compose up -d --scale worker=3

Kubernetes (Helm)

helm install excalibase ./helm/excalibase-workflow \
  --namespace excalibase --create-namespace \
  --set backend.env.OPENAI_API_KEY="sk-..."

In Kubernetes, script tasks run as native K8s Jobs — no Docker-in-Docker needed. The runner auto-detects the environment.

See the Quick Start and Deployment guides for full details.

GitHub

github.com/excalibase/excalibase-workflow