Dify Workflows

AI-in-a-Box can expose Dify workflows as agent tools. When DIFY_URL and a valid workflow API key are configured, the agent runtime registers run_workflow. Without those settings, the tool is absent and agents continue without Dify.

Dify is optional. It is useful for deterministic, reusable business pipelines; subagents are a better fit for open-ended reasoning tasks.
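The registration rule above can be sketched as a small predicate. This is a hedged illustration, not the actual registry code: the function name `dify_tool_enabled` and the parameters are assumptions, but the rule matches the one stated (a Dify URL plus an API key from either the environment or the shared key file).

```python
import os

def dify_tool_enabled(env=None, key_file="/shared/dify-api-key"):
    """Return True when run_workflow should be registered.

    Mirrors the rule above: DIFY_URL must be set, and an API key must come
    from DIFY_API_KEY or the key file written by dify-init.
    """
    env = os.environ if env is None else env
    if not env.get("DIFY_URL"):
        return False
    if env.get("DIFY_API_KEY"):
        return True
    return os.path.isfile(key_file)
```

When the predicate is false, the tool is simply never registered and agents run without Dify, as described above.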

Request Flow

Seeded workflows are configured to use the AI-in-a-Box inference router. Model provider changes should normally happen in deploy/config/inference-router/ rather than by pointing Dify directly at vLLM or OpenRouter.

Seeded Workflows

The seed script creates starter workflows:

| Workflow | Purpose |
| --- | --- |
| Executive Summary | TL;DR, key points, and explicit action items from a long document. |
| Email Reply Drafter | Formal, casual, and one-line replies from an incoming email. |
| Meeting → Action Items | Owner-assigned action items, decisions, and open questions from a transcript. |
| Code Reviewer | Severity-tagged review comments for a code snippet or diff. |
| SQL Query Explainer | Plain-English query explanation plus performance concerns. |
| Changelog Generator | Release notes from commits or issue lists. |
| User Story Expander | User story, acceptance criteria, out-of-scope notes, and open questions. |
| Incident Root Cause (5 Whys) | Blameless incident analysis and action items. |
| Email Triage + Reply | Multi-branch email classification and tailored reply drafting. |
| CSV Analyst | Lightweight statistics plus narrative analysis. |
| Structured Extractor | Typed fields from unstructured text via function calling. |

In Docker, dify-init provisions an app API key into /shared/dify-api-key, and dify-seed creates the demo workflows.

For a local manual seed:

```bash
DIFY_ADMIN_PASSWORD="$(grep '^DIFY_ADMIN_PASSWORD=' deploy/.env | cut -d= -f2-)" \
python3 scripts/seed-dify-workflows.py
```

Model Provider Setup

Dify 1.13 uses a plugin for OpenAI-compatible models. The deployment bundles openai_api_compatible.difypkg, and the seed script installs it when that package is present.

Configure the provider to use:

| Field | Value |
| --- | --- |
| Provider | OpenAI API compatible |
| Endpoint URL | http://inference-router:8004/v1 |
| API key | Any non-empty value, for example inference-router-ignored |
| Model | openai/gpt-5.4-nano by default for seeded workflows |
| Mode | Chat |
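To make the provider settings concrete, here is a sketch of the OpenAI-compatible request a workflow node ends up sending through the router. The helper name `chat_request` is hypothetical; the URL, placeholder key, and default model come from the table above.

```python
ROUTER_URL = "http://inference-router:8004/v1"  # Endpoint URL from the table

def chat_request(prompt, model="openai/gpt-5.4-nano",
                 api_key="inference-router-ignored"):
    """Assemble URL, headers, and body for an OpenAI-compatible
    chat completion routed through the inference router."""
    return {
        "url": f"{ROUTER_URL}/chat/completions",
        "headers": {
            # The router ignores the key; it only needs to be non-empty.
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }
```

Because the shape is plain OpenAI-compatible JSON, the same request works unchanged if the router later swaps backends.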

Do not configure seeded workflows to call http://vllm:8000/v1 or https://openrouter.ai/api/v1 directly unless you intentionally want to bypass AI-in-a-Box routing, usage capture, and observability.

Agent Runtime Integration

The integration is provided by:

| Component | Responsibility |
| --- | --- |
| DifyClient | Lists workflows and runs a workflow by app id. |
| run_workflow tool | Matches workflow names and sends JSON inputs. |
| /shared/dify-workflows.json | Optional workflow index written by the seeding path. |
| /shared/dify-api-key | App API key generated by dify-init. |

The agent sees run_workflow only when Dify is configured. Once registered, the tool is marked always-available in the registry, because the agent may need it without first discovering workflows in chat.
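The name matching that run_workflow performs against the workflow index can be sketched as below. This is an assumed, simplified matcher (exact case-insensitive name first, then substring); the real tool may match more loosely.

```python
def match_workflow(query, index):
    """Pick a workflow app id from a name -> app-id index.

    Tries an exact case-insensitive name match first, then falls back
    to substring matching in either direction.
    """
    q = query.strip().lower()
    by_name = {name.lower(): app_id for name, app_id in index.items()}
    if q in by_name:
        return by_name[q]
    for name, app_id in by_name.items():
        if q in name or name in q:
            return app_id
    return None
```

With the index from /shared/dify-workflows.json loaded into a dict, a request like "run the code reviewer" resolves to the Code Reviewer app id.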

Running a Workflow

From chat, ask for a task that matches a published workflow by name or intent, for example:

Run the Executive Summary workflow on this meeting transcript: ...

The tool input is a JSON object encoded as text. The workflow output is returned to the agent, and the agent decides how to present it.
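As a concrete sketch of that contract: the field names below are assumptions for illustration, since the real input keys depend on each workflow's Start node variables.

```python
import json

# Hypothetical run_workflow call: the workflow is selected by name, and
# the workflow inputs travel as a JSON object encoded as text.
tool_input = json.dumps({
    "workflow": "Executive Summary",
    "inputs": {"document": "...meeting transcript..."},
})

# The runtime decodes the JSON text before dispatching to Dify.
args = json.loads(tool_input)
```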

Configuration

| Variable | Default | Description |
| --- | --- | --- |
| DIFY_URL | http://dify-api:5001 in Docker | Dify API URL used by agent-runtime. |
| DIFY_API_KEY | Empty | Explicit workflow app API key. |
| DIFY_WORKFLOWS_INDEX | /shared/dify-workflows.json | Optional workflow name/id index. |
| DIFY_ADMIN_EMAIL | admin@aibox.local | Dify admin account for init and seed. |
| DIFY_ADMIN_PASSWORD | Generated by make bootstrap | Dify admin password. |
| DIFY_INFERENCE_ROUTER_URL | http://inference-router:8004/v1 | Router endpoint used by seeding. |
| DIFY_DEFAULT_MODEL | openai/gpt-5.4-nano | Seeded workflow model name. |
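A minimal sketch of resolving these settings, assuming the defaults in the table and a hypothetical `dify_setting` helper (the actual code may resolve them differently):

```python
import os

# Defaults taken from the configuration table; unset or empty env vars
# fall back to these values.
DEFAULTS = {
    "DIFY_URL": "http://dify-api:5001",
    "DIFY_WORKFLOWS_INDEX": "/shared/dify-workflows.json",
    "DIFY_ADMIN_EMAIL": "admin@aibox.local",
    "DIFY_INFERENCE_ROUTER_URL": "http://inference-router:8004/v1",
    "DIFY_DEFAULT_MODEL": "openai/gpt-5.4-nano",
}

def dify_setting(name, env=None):
    """Resolve a setting: an explicit env var wins, else the table default."""
    env = os.environ if env is None else env
    value = env.get(name, "").strip()
    return value or DEFAULTS.get(name, "")
```

DIFY_API_KEY and DIFY_ADMIN_PASSWORD are intentionally absent from the defaults: the key defaults to empty, and the password is generated at bootstrap.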

Troubleshooting

| Problem | Check |
| --- | --- |
| run_workflow is not available | Confirm DIFY_URL and a workflow API key are set or present in /shared/dify-api-key. |
| Workflows do not appear | Check /shared/dify-workflows.json and the dify-seed container logs. |
| Dify model errors | Confirm the OpenAI-compatible plugin is installed and the endpoint is http://inference-router:8004/v1. |
| Dify login fails | Use the browser UI at http://localhost:3001; Dify 1.13 encodes the password for console login. |
| Calls bypass observability | Point Dify at the inference router, not direct model providers. |
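The first two checks in the table can be scripted. This is a hedged diagnostic sketch with an assumed function name, covering only the file and environment checks (not the in-container plugin or login checks):

```python
import os

def dify_diagnostics(key_file="/shared/dify-api-key",
                     index_file="/shared/dify-workflows.json"):
    """Report the file/env checks from the troubleshooting table as booleans."""
    return {
        "dify_url_set": bool(os.environ.get("DIFY_URL")),
        "api_key_present": bool(os.environ.get("DIFY_API_KEY"))
                           or os.path.isfile(key_file),
        "index_present": os.path.isfile(index_file),
    }
```

If api_key_present is False, run_workflow will not be registered; if index_present is False, check the dify-seed container logs as noted above.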