AI-Enhanced Workflows
AI-Enhanced Workflows let you add intelligent style transfer and prompt interpretation to your workflows. When consumers run an AI-enhanced workflow, an LLM interprets their prompts and adapts them to your defined style.
What AI-Enhanced Means
An AI-enhanced workflow has two components:
- Style Guide — A set of rules generated from reference images that describe the aesthetic, subject preferences, composition style, and other visual characteristics
- Prompt Interpretation — At runtime, the consumer's prompt is sent to an LLM which rewrites it to better match your style guide
This is particularly useful for:
- Artistic workflows where you want consistent aesthetic output
- Workflows that need context-aware prompt adaptation
- Any workflow where style consistency matters
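The two components can be pictured as a small record stored alongside the workflow. This is a hypothetical sketch; the field names are illustrative, not the actual Rigs schema:

```python
from dataclasses import dataclass, field

@dataclass
class AIEnhancedConfig:
    """Hypothetical shape of an AI-enhanced workflow's extra data."""
    # Style Guide: rules distilled from reference images at setup time.
    style_guide: str
    # Reference images the vision model analyzed (paths or URLs).
    reference_images: list = field(default_factory=list)

    def needs_interpretation(self) -> bool:
        # Prompt interpretation only runs when a style guide exists.
        return bool(self.style_guide.strip())
```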
How to Set It Up
Step 1: Add Reference Images
In the Rigs dashboard, go to your workflow and click the AI-Enhanced section. Upload reference images that represent the style you want to achieve.
These images are analyzed by a vision-capable LLM to extract style characteristics.
Step 2: Generate Style Guide
Click Generate Style Guide to have the LLM analyze your reference images and produce a style transfer guide. This guide is stored with your workflow and used at runtime.
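Under the hood, generation amounts to sending your reference images to a vision-capable model through an OpenAI-compatible chat request. A sketch of what that payload might look like (the helper function and prompt wording are assumptions, not the actual Rigs implementation):

```python
def build_style_guide_request(model: str, image_urls: list) -> dict:
    """Assemble an OpenAI-compatible vision request (hypothetical sketch)."""
    content = [{
        "type": "text",
        "text": "Describe the shared aesthetic, subjects, composition, "
                "and palette of these images as concise style-transfer rules.",
    }]
    # Each reference image goes in as an image_url content part.
    content += [{"type": "image_url", "image_url": {"url": u}} for u in image_urls]
    return {"model": model, "messages": [{"role": "user", "content": content}]}
```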
Step 3: Test Interpretation
Use the Test Interpretation feature to enter a sample prompt and see how it gets rewritten based on your style guide. This helps you refine the style guide before publishing.
LLM Model Configuration
AI-enhanced features require LLM configuration. In the dashboard's Workflows tab, expand the Models panel to configure:
- Provider: Any OpenAI-compatible API (OpenRouter, local OpenClaw, Ollama, etc.)
- Model: Used for text interpretation (prompt enhancement at runtime)
- Vision Model: Used for style guide generation from reference images
Models are automatically populated from your OpenClaw config (~/.openclaw/openclaw.json). The dashboard shows connection status with a green indicator when configured correctly.
Pricing
AI-enhanced workflows include a $0.02 surcharge on top of the standard platform fee:
- Consumer sees: provider_price × 1.4 + $0.02
- Displayed as "incl. $0.02 AI interpret" in the provider dashboard
This surcharge covers the LLM costs for prompt interpretation at runtime.
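The consumer-facing price works out as follows. This is a worked helper; `consumer_price` is an illustrative name, not a platform API:

```python
def consumer_price(provider_price: float, ai_enhanced: bool = True) -> float:
    """Standard platform fee is the 1.4x multiplier; AI adds a flat $0.02."""
    price = provider_price * 1.4
    if ai_enhanced:
        price += 0.02  # covers LLM prompt-interpretation costs at runtime
    return round(price, 2)
```

For example, a $1.00 provider price is shown to the consumer as $1.42.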
How Interpretation Works at Runtime
When a consumer runs your AI-enhanced workflow:
1. Consumer enters their prompt (e.g., "a cat in space")
2. The gateway sends the prompt plus your style guide to the configured LLM
3. The LLM returns an interpreted prompt that maintains the consumer's intent but adapts it to your style
4. The interpreted prompt is sent to the provider's ComfyUI for execution
The original prompt is preserved in the job metadata, but the workflow receives the interpreted version.
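The interpretation call itself is an ordinary chat completion against the configured OpenAI-compatible endpoint. A minimal sketch of how the gateway might assemble it (the function and prompt wording are assumptions, not the actual gateway code):

```python
def build_interpretation_request(model: str, style_guide: str, user_prompt: str) -> dict:
    """Hypothetical payload: rewrite the consumer's prompt to match the style guide."""
    system = (
        "Rewrite the user's image prompt so it follows this style guide, "
        "while preserving the user's intent:\n\n" + style_guide
    )
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            # The original prompt is kept in job metadata; only the
            # LLM's rewritten version reaches the workflow.
            {"role": "user", "content": user_prompt},
        ],
    }
```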
Requirements
- A configured LLM provider with both text and vision capabilities
- At least one reference image for style guide generation
- Workflow must be published to the network