Run lightweight local workflows
An experimental, lightweight, easily configurable workflow engine for automating development, operations, data processing, and content management tasks.
What is Dyngle?
Dyngle is a YAML-based workflow automation tool that lets you define and run operations using familiar shell-like syntax. It combines the simplicity of shell scripts with the power of Python expressions and data flow operators.
Technical Foundations
- Configuration, task definition, and flow control in YAML - Define your workflows in declarative YAML files
- Operations as system commands - Use familiar shell-like syntax for executing commands
- Expressions and logic in pure Python - Leverage Python for dynamic values and logic
Key Features
- Simple Operation Definition - Define workflows as arrays of commands (learn more)
- Data Flow Operators - Pipe data between commands with => and -> (learn more)
- Template Substitution - Use {{variable}} syntax to inject data into commands (learn more)
- Python Expressions - Evaluate Python code for dynamic values (learn more)
- Sub-operations - Compose operations from other operations (learn more)
- MCP Server Support - Expose operations as tools for AI assistants (learn more)
Use Cases
Dyngle is designed for:
- Development workflow automation (build, test, deploy)
- Operations tasks (server management, monitoring)
- Data processing pipelines
- Content management workflows
- CI/CD job definitions
- AI assistant tool integration
Next Steps
Installation
Quick installation (macOS)
brew install python@3.11
python3.11 -m pip install pipx
pipx install dyngle
Requirements
Dyngle requires Python 3.11 or later.
Alternative Installation Methods
Using pip
pip install dyngle
From Source
git clone https://gitlab.com/steamwiz/dyngle.git
cd dyngle
poetry install
Verifying Installation
After installation, verify that Dyngle is working:
dyngle --help
You should see the command-line help output.
Next Steps
Getting Started
This guide will walk you through creating your first Dyngle operation.
Your First Operation
Create a file called .dyngle.yml in your current directory:
dyngle:
operations:
hello:
- echo "Hello world"
Run the operation:
dyngle run hello
You should see:
Hello world
Using Templates
Now let's make it more dynamic by using data and templates. Update your .dyngle.yml:
dyngle:
operations:
hello:
- echo "Hello {{name}}!"
Pass data to your operation via stdin:
echo "name: Francis" | dyngle run hello
Output:
Hello Francis!
Common Pitfalls
YAML Syntax with Special Characters
If your commands contain colons (:) or other special YAML characters, you may encounter parsing errors. Use multiline syntax:
dyngle:
operations:
fetch-json:
- >-
echo '{"temperature": 72}' => data
See Operations - YAML Syntax for Special Characters for details.
Shell Features Don't Work
Dyngle doesn't use a shell, so shell-specific syntax won't work:
# ❌ Won't work
- echo "Hello" | grep Hello
- export VAR=value
- ls > files.txt
# ✅ Use Dyngle's features instead
- echo "Hello" => output
- output -> grep Hello
See Operations - Command Syntax for details.
What's Next?
You've learned the basics of:
- Defining operations in YAML
- Running operations with dyngle run
- Using template substitution with {{variable}}
Now explore more:
Configuration
Dyngle reads configuration from YAML files that define operations, expressions, values, and other settings.
Configuration File Location
Dyngle searches for configuration files in the following order (first match wins):
- Command line option: the --config parameter
- Environment variable: DYNGLE_CONFIG
- Current directory: .dyngle.yml
- Home directory: ~/.dyngle.yml
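The search order can be sketched in Python. This is an illustrative model of the documented behavior, not Dyngle's actual implementation; the function name `resolve_config_path` is invented here.

```python
import os
from pathlib import Path

def resolve_config_path(cli_option=None):
    # 1. The --config command-line option always wins when given.
    if cli_option:
        return Path(cli_option)
    # 2. The DYNGLE_CONFIG environment variable comes next.
    env_path = os.environ.get("DYNGLE_CONFIG")
    if env_path:
        return Path(env_path)
    # 3. Current directory, then 4. home directory: first file that exists.
    for candidate in (Path(".dyngle.yml"), Path.home() / ".dyngle.yml"):
        if candidate.exists():
            return candidate
    return None
```

Because the checks short-circuit, an explicit `--config` hides both the environment variable and any `.dyngle.yml` files on disk.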
Examples
Using a specific config file:
dyngle --config /path/to/config.yml run my-operation
Using an environment variable:
export DYNGLE_CONFIG=/path/to/config.yml
dyngle run my-operation
Configuration Structure
A basic configuration file has this structure:
dyngle:
operations:
# Your operations here
expressions:
# Global expressions (optional)
values:
# Global values (optional)
Imports
Configuration files can import other configuration files, allowing you to organize your operations across multiple files and share common configurations.
Basic Import
Import other configuration files using the imports: key:
dyngle:
imports:
- ~/.dyngle.yml
- ./common-operations.yml
operations:
# Operations defined here
Import Behavior
- Imports are loaded in the order specified
- Later imports override earlier ones in case of name conflicts
- Operations and expressions in the main file override all imports
- Imports are recursive - imported files can import other files
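The override rules above amount to a simple ordered merge. Here is a minimal sketch (the function name `merge_configs` is invented; Dyngle's real loader also resolves paths and recurses into imports):

```python
def merge_configs(import_maps, main_map):
    """Combine operation maps from imported files with the main file.

    Imports are applied in the order listed, so later imports override
    earlier ones; the main file's own entries override every import.
    """
    merged = {}
    for imported in import_maps:
        merged.update(imported)  # later import wins on a name conflict
    merged.update(main_map)      # the main file wins over all imports
    return merged
```

A name defined in both an import and the main file always resolves to the main file's definition.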
Use Cases
User-level configuration:
Create a ~/.dyngle.yml with common operations, then import it in project-specific configs:
# ~/.dyngle.yml
dyngle:
operations:
cleanup:
- rm -rf .venv
- rm -rf node_modules
# project/.dyngle.yml
dyngle:
imports:
- ~/.dyngle.yml
operations:
build:
- npm install
- npm run build
Shared team operations:
# team-shared.yml
dyngle:
operations:
deploy-staging:
- sub: build
- aws s3 sync ./dist s3://staging-bucket/
# developer's local config
dyngle:
imports:
- ./team-shared.yml
operations:
dev:
- npm run dev
Next Steps
Commands
Dyngle provides several commands for working with operations.
run
Execute a named operation from your configuration.
Syntax
dyngle run <operation-name> [arguments...] [options]
Examples
Basic execution:
dyngle run hello
With arguments (available in the operation as the args array in expressions):
dyngle run greet Alice Bob
With data from stdin:
echo "name: Alice" | dyngle run hello
With a specific config file:
dyngle --config /path/to/.dyngle.yml run deploy
Options
--display <mode>
Control step display behavior:
- steps (default) - Show each step before executing
- none - Suppress step display for cleaner output
dyngle run build --display none
See Display Options for more details.
list-operations
List all available public operations with their descriptions.
Syntax
dyngle list-operations
Output Format
The command outputs a YAML-formatted list of operations:
operations:
build: Build the project for production
test: Run the test suite
deploy: Deploy to production
Behavior
- Shows only public operations (not those with access: private)
- Includes the description: attribute if present
- Operations without descriptions show empty descriptions
See Access Control for information about public vs private operations.
mcp
Start Dyngle as an MCP (Model Context Protocol) server, exposing operations as tools for AI assistants.
Syntax
dyngle mcp [options]
Transport Options
stdio (default)
Standard input/output transport, suitable for Claude Desktop:
dyngle mcp
http
HTTP transport:
dyngle mcp --transport http --host 127.0.0.1 --port 8000
sse
Server-Sent Events transport:
dyngle mcp --transport sse --host 127.0.0.1 --port 8000
Configuration File
Specify a configuration file for the MCP server:
dyngle --config /path/to/.dyngle.yml mcp
See MCP Server for complete setup and usage information.
Global Options
These options work with any command:
--config <path>
Specify a configuration file:
dyngle --config ./custom.yml run hello
Next Steps
Operations
Operations are the fundamental building blocks in Dyngle. An operation is a named sequence of steps that execute commands.
Basic Structure
Operations are defined under dyngle: in the configuration. The simplest form is a YAML array of command steps:
dyngle:
operations:
hello:
- echo "Hello world"
build:
- npm install
- npm run build
Run an operation:
dyngle run hello
Operation Anatomy
Simple Array Form
For operations with just steps, use the array syntax:
dyngle:
operations:
init:
- rm -rf .venv
- python3.11 -m venv .venv
- .venv/bin/pip install --upgrade pip poetry
Extended Form
When you need additional attributes, use the extended form with a steps: key:
dyngle:
operations:
build:
description: Build the project for production
access: public
return: build-info
expressions:
build-time: "datetime.now()"
steps:
- npm install
- npm run build
- echo "{{build-time}}" => build-info
Operation Attributes
description
An optional description that appears in dyngle list-operations output:
dyngle:
operations:
deploy:
description: Deploy to production
steps:
- sub: build
- aws s3 sync ./dist s3://my-bucket/
access
Controls visibility and usage. See Access Control for details.
- public (default) - Can be run directly, exposed via MCP, listed in operations
- private - Can only be called as a sub-operation
return
Specifies what value to return. See Return Values for details.
dyngle:
operations:
get-temperature:
return: temp
steps:
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -r '.temperature' => temp
expressions
Local expressions available only within this operation. See Expressions for details.
dyngle:
operations:
greet:
expressions:
greeting: "'Hello ' + name + '!'"
steps:
- echo "{{greeting}}"
values
Local values available only within this operation. See Data and Templates for details.
dyngle:
operations:
deploy:
values:
environment: production
region: us-west-2
steps:
- echo "Deploying to {{environment}} in {{region}}"
Command Syntax
Operation steps use shell-like syntax but are not executed in a shell:
- Commands are parsed and executed directly by Python's subprocess.run()
- Shell-specific features like |, >, &&, and $VARIABLE won't work
- Use Dyngle's data flow operators instead
Works:
- echo "Hello world"
- npm install
- python script.py --arg value
Doesn't work (shell syntax):
- echo "Hello" | grep Hello # Use data flow operators instead
- export VAR=value # Use expressions or values instead
- ls > files.txt # Use data flow operators instead
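To see why the shell syntax above has no effect, here is a minimal sketch of no-shell execution. The use of shlex for tokenization is an assumption for illustration; the docs only guarantee that subprocess.run() executes the command without a shell.

```python
import shlex
import subprocess

def run_step(command):
    # Tokenize the step and execute it directly: no shell ever sees the
    # command line, so pipes and redirects are never interpreted.
    argv = shlex.split(command)
    return subprocess.run(argv, capture_output=True, text=True)

hello = run_step('echo "Hello world"')
# A pipe character is passed through as a literal argument to echo:
literal = run_step('echo "Hello" | grep Hello')
```

The second call prints `Hello | grep Hello` rather than filtering anything, which is exactly why Dyngle provides its own data flow operators.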
YAML Syntax for Special Characters
When command arguments contain special characters like colons (:) that YAML interprets as syntax, use YAML's multiline string syntax:
Problem - colons in JSON:
- echo '{"temperature": 72}' => data # YAML parsing error!
Solution - use multiline syntax:
- >-
echo '{"temperature": 72}' => data
The >- indicator tells YAML to treat the following indented text as a single-line string, preventing colon interpretation issues.
Examples
Development workflow
dyngle:
operations:
test:
description: Run the test suite
steps:
- pytest --cov=src tests/
- coverage report
lint:
description: Check code style
steps:
- black --check src/
- flake8 src/
ci:
description: Run all checks
steps:
- sub: test
- sub: lint
Data processing
dyngle:
operations:
process-data:
return: result
steps:
- curl -s "https://api.example.com/data" => raw-data
- raw-data -> jq '.items' => filtered
- filtered -> python process.py => result
Next Steps
- Understand data and templates
- Learn about expressions
- Master data flow between steps
- Compose with sub-operations
Data and Templates
Dyngle maintains a block of "Live Data" throughout an operation - a set of named values that can be injected into commands using template syntax.
Template Syntax
Use double-curly-bracket syntax ({{ and }}) to inject values into commands:
dyngle:
operations:
hello:
- echo "Hello {{name}}!"
Input Data
Pass data to operations via stdin as YAML:
echo "name: Francis" | dyngle run hello
Output:
Hello Francis!
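Template substitution can be pictured as a simple placeholder replacement over the data context. The regex-based `render` function below is a toy sketch, not Dyngle's actual template engine:

```python
import re

def render(template, data):
    # Replace each {{name}} placeholder with the matching value from the
    # data context; str() covers non-string values such as numbers.
    return re.sub(
        r"\{\{\s*([^}]+?)\s*\}\}",
        lambda match: str(data[match.group(1)]),
        template,
    )

print(render("Hello {{name}}!", {"name": "Francis"}))
```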
Data Sources
Data can come from several sources:
- Stdin - YAML data piped to the operation
- Values - Declared in the configuration
- Expressions - Computed Python values
- Data flow operators - Values captured during execution using =>
Global Values
Define values under dyngle: that are available to all operations:
dyngle:
values:
environment: production
region: us-west-2
operations:
deploy:
- echo "Deploying to {{environment}} in {{region}}"
Local Values
Define values within a specific operation:
dyngle:
operations:
greet:
values:
greeting: Hello
name: World
steps:
- echo "{{greeting}}, {{name}}!"
Nested Object Properties
Access nested properties in dictionaries using dot notation:
dyngle:
operations:
weather-report:
steps:
- curl -s "https://api.example.com/weather" => weather
- echo "Temperature: {{weather.temperature}}"
- echo "Location: {{weather.location.city}}, {{weather.location.country}}"
Important: Dot notation works only for named dictionary properties, not for array indices. You cannot use .0 or other integers in dot notation.
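A dotted lookup like this can be modeled as a walk through nested dictionaries. The `lookup` helper below is an illustrative sketch, not Dyngle's resolver:

```python
def lookup(data, dotted_path):
    # Walk nested dictionaries one segment at a time. Anything that is
    # not a dict (such as a list) stops the walk, matching the documented
    # restriction that dot notation never indexes arrays.
    value = data
    for segment in dotted_path.split("."):
        if not isinstance(value, dict):
            raise KeyError(f"cannot resolve {segment!r}: not a dictionary")
        value = value[segment]
    return value

weather = {"temperature": 72, "location": {"city": "Oslo", "country": "NO"}}
```

Here `lookup(weather, "location.city")` succeeds, while `lookup({"items": [1, 2]}, "items.0")` raises, because `items` resolves to a list.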
Working with Arrays
For arrays, use Python expressions to extract values:
dyngle:
values:
users:
- name: Alice
email: alice@example.com
- name: Bob
email: bob@example.com
operations:
show-users:
expressions:
first-user: get('users')[0]
first-name: get('users')[0]['name']
all-names: "[u['name'] for u in get('users')]"
steps:
- echo "First user: {{first-name}}"
Data Precedence
When names overlap, Dyngle uses this precedence (highest to lowest):
- Live data (populated via the => operator)
- Local expressions (defined in the operation)
- Global expressions (defined under dyngle:)
- Local values (defined in the operation)
- Global values (defined under dyngle:)
- Input data (from stdin)
Example:
dyngle:
values:
name: Global
expressions:
name: "'Expression'"
operations:
test:
values:
name: Local
steps:
- echo "Start: {{name}}" # "Local" (local value wins)
- echo "Override" => name
- echo "After: {{name}}" # "Override" (live data wins)
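The shadowing in this example can be modeled with a layered lookup such as Python's ChainMap, where earlier maps hide later ones. This is a sketch of the documented precedence, not Dyngle's implementation:

```python
from collections import ChainMap

live_data = {}                   # populated by the => operator
declared = {"name": "Local"}     # values declared in the operation
stdin_data = {"name": "Input"}   # YAML piped to the run

# Earlier maps shadow later ones, so live data sits on top.
context = ChainMap(live_data, declared, stdin_data)

print(context["name"])           # the declared value wins at first
live_data["name"] = "Override"   # a step runs: echo "Override" => name
print(context["name"])           # now live data wins
```

Because the layers are consulted in order, capturing a value with `=>` immediately shadows any declared value of the same name for the rest of the operation.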
Next Steps
Expressions
Expressions are Python code snippets that compute dynamic values, which can be referenced in operation steps using template syntax.
Basic Usage
Define expressions that evaluate to values:
dyngle:
operations:
greet:
expressions:
greeting: "'Hello ' + name + '!'"
steps:
- echo "{{greeting}}"
Run it:
echo "name: Alice" | dyngle run greet
Output:
Hello Alice!
Expression Scopes
Global Expressions
Defined under dyngle: and available to all operations:
dyngle:
expressions:
timestamp: "datetime.now()"
author: "'Francis Potter'"
operations:
log:
- echo "[{{timestamp}}] Log by {{author}}"
Local Expressions
Defined within a specific operation:
dyngle:
operations:
say-hello:
expressions:
count: len(name)
steps:
- echo "Hello {{name}}! Your name has {{count}} characters."
Local expressions override global expressions with the same name.
Expression Context
Expressions evaluate in a context that includes:
- Data values - Referenced directly as Python variables
- Built-in functions - len(), str(), etc.
- Standard library modules - datetime, math, etc.
- Special functions - get(), format(), dtformat(), Path()
- Command arguments - Available as the args array
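A toy evaluator conveys how such a context can work: data values become variables, and a few helpers are layered on top. This is a deliberately simplified model; Dyngle's actual evaluator is sandboxed differently, and every name here is assumed for illustration.

```python
import math
from datetime import datetime

def evaluate(expression, data, args=()):
    # Data keys become variables (hyphens mapped to underscores), plus a
    # few helpers resembling the documented context.
    names = {key.replace("-", "_"): value for key, value in data.items()}
    names.update({
        "datetime": datetime,
        "math": math,
        "len": len,
        "str": str,
        "args": list(args),
        "get": lambda key: data[key],  # reaches hyphenated names too
    })
    return eval(expression, {"__builtins__": {}}, names)
```

For example, `evaluate("'Hello ' + name", {"name": "Alice"})` yields `Hello Alice`, and `get('first-name')` reaches a key that is not a valid Python identifier.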
Referencing Data
Data values can be referenced directly:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + name"
steps:
- echo "{{message}}"
Hyphenated Names
YAML keys can contain hyphens. To reference them in expressions:
Option 1: Replace hyphens with underscores:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + first_name" # References 'first-name'
steps:
- echo "{{message}}"
Option 2: Use the get() function:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + get('first-name')"
steps:
- echo "{{message}}"
Special Functions
get()
Retrieve values from the data context:
dyngle:
expressions:
full-greeting: "'Hello ' + get('first-name') + ' ' + get('last-name')"
The get() function can also reference other expressions:
dyngle:
expressions:
greeting: "'Hello'"
full-greeting: "get('greeting') + ' ' + name"
format()
Render a template string using the current data context:
dyngle:
values:
first-name: Alice
last-name: Smith
operations:
greet:
expressions:
full-greeting: format('Hello, {{first-name}} {{last-name}}!')
steps:
- echo "{{full-greeting}}"
The format() function supports all template syntax, including nested properties:
dyngle:
operations:
weather-report:
expressions:
report: format('Temperature in {{location.city}} is {{weather.temperature}} degrees')
steps:
- echo "{{report}}"
See Data and Templates for more about template syntax.
dtformat()
Format datetime objects as strings:
dyngle:
expressions:
now: "datetime.now()"
timestamp: "dtformat(get('now'), '%Y-%m-%d %H:%M:%S')"
operations:
log:
- echo "[{{timestamp}}] Event occurred"
Path()
Create path objects (restricted to current working directory):
dyngle:
expressions:
config-file: "Path('.dyngle.yml')"
exists: "get('config-file').exists()"
args
Access command-line arguments passed to the operation:
dyngle:
operations:
greet-arg:
expressions:
name: "args[0] if args else 'World'"
steps:
- echo "Hello {{name}}!"
Run it:
dyngle run greet-arg Alice
YAML Structure Syntax
Instead of string-based Python expressions, you can use native YAML structures:
Dictionaries
dyngle:
operations:
api-call:
expressions:
request-body:
user: get('username')
email: get('email')
timestamp: "datetime.now()"
steps:
- echo "Request: {{request-body}}"
Arrays
dyngle:
expressions:
coordinates:
- get('latitude')
- get('longitude')
- get('altitude')
Nested Structures
dyngle:
expressions:
config:
server:
host: format("{{server-host}}")
port: "int(get('server-port'))"
database:
name: format("{{db-name}}")
connection:
- format("{{db-host}}")
- "int(get('db-port'))"
Important Notes:
- Each string in a YAML structure is evaluated as a Python expression
- Numbers, booleans, and None pass through unchanged
- String literals require Python string syntax: "'literal string'"
- Access nested properties in templates using dot notation: {{config.server.host}}
- Access array elements in expressions using Python brackets: get('coordinates')[0]
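The evaluation of YAML structures described above amounts to a recursive walk. This sketch assumes a pluggable `evaluate` callback for the per-string expression evaluation; it is not Dyngle's actual code:

```python
def evaluate_structure(node, evaluate):
    # Recursively walk a YAML-derived structure: every string is treated
    # as a Python expression, while numbers, booleans, and None pass
    # through unchanged.
    if isinstance(node, dict):
        return {key: evaluate_structure(value, evaluate) for key, value in node.items()}
    if isinstance(node, list):
        return [evaluate_structure(item, evaluate) for item in node]
    if isinstance(node, str):
        return evaluate(node)
    return node

structure = {"port": "int('8000')", "debug": False, "tags": ["'web'", "'api'"]}
result = evaluate_structure(structure, eval)
```

Note how `False` survives untouched while the strings are evaluated, which is why literal strings inside structures need the extra quoting shown above.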
Available Python Features
Expressions support:
Built-in types and functions:
str(), int(), float(), bool(), len(), etc.
Standard library modules:
- datetime - Date and time operations
- math - Mathematical functions
- Path() - File path operations (restricted to current directory)
Data structures:
- Lists, dictionaries, tuples
- List comprehensions
- Dictionary comprehensions
Operators:
- Arithmetic: +, -, *, /, //, %, **
- Comparison: ==, !=, <, >, <=, >=
- Logical: and, or, not
- String: concatenation, formatting
Control flow (in comprehensions):
- if/else in expressions
- for loops in comprehensions
Expression Examples
String manipulation
dyngle:
expressions:
uppercase-name: "name.upper()"
initials: "'.'.join([word[0] for word in name.split()])"
Mathematical operations
dyngle:
expressions:
circle-area: "math.pi * radius ** 2"
rounded: "round(get('circle-area'), 2)"
Date and time
dyngle:
expressions:
now: "datetime.now()"
today: "get('now').date()"
formatted-date: "dtformat(get('now'), '%B %d, %Y')"
List operations
dyngle:
values:
numbers: [1, 2, 3, 4, 5]
expressions:
doubled: "[n * 2 for n in get('numbers')]"
sum-numbers: "sum(get('numbers'))"
max-number: "max(get('numbers'))"
Conditional logic
dyngle:
expressions:
environment: "get('env') if get('env') else 'development'"
log-level: "'DEBUG' if get('environment') == 'development' else 'INFO'"
Next Steps
Data Flow
Dyngle provides special operators to pass data between steps in an operation, enabling powerful data processing workflows.
Data Flow Operators
Data Assignment Operator (=>)
Captures stdout from a command and assigns it to a named value:
dyngle:
operations:
fetch-data:
- curl -s "https://api.example.com/users" => users
- echo "Fetched: {{users}}"
The value becomes available in the data context for subsequent steps.
Data Input Operator (->)
Passes a data value as stdin to a command:
dyngle:
operations:
process-data:
- curl -s "https://api.example.com/data" => raw-data
- raw-data -> jq '.items' => filtered
- filtered -> python process.py
Combining Operators
You can use both operators in a single step:
<input-variable> -> <command> => <output-variable>
Example:
dyngle:
operations:
weather:
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -j '.temperature' => temp
- echo "Temperature: {{temp}} degrees"
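A step combining both operators can be parsed by scanning for the whitespace-isolated operator tokens. The `parse_step` function below is a sketch, not Dyngle's actual parser (quoting is handled naively here):

```python
def parse_step(step):
    # Split a step into (input_name, command, output_name). The operators
    # must be isolated by whitespace, and -> precedes =>.
    tokens = step.split()
    input_name = output_name = None
    if "->" in tokens:
        index = tokens.index("->")
        input_name, tokens = tokens[0], tokens[index + 1:]
    if "=>" in tokens:
        index = tokens.index("=>")
        output_name, tokens = tokens[index + 1], tokens[:index]
    return input_name, " ".join(tokens), output_name
```

Steps with only one operator, or none, fall out naturally: the missing positions stay `None`.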
Practical Examples
API Data Processing
dyngle:
operations:
get-user-emails:
return: emails
steps:
- curl -s "https://api.example.com/users" => users
- users -> jq -r '.[].email' => emails
Multi-step Pipeline
dyngle:
operations:
analyze-logs:
return: summary
steps:
- curl -s "https://logs.example.com/today" => logs
- logs -> grep "ERROR" => errors
- errors -> wc -l => error-count
- echo "Found {{error-count}} errors" => summary
Data Transformation
dyngle:
operations:
transform-json:
return: result
steps:
- cat input.json => raw
- raw -> jq '.data | map({id, name})' => transformed
- transformed -> python format.py => result
Important Notes
Operator Spacing
Operators must be isolated with whitespace:
Correct:
- command => output
- input -> command
- input -> command => output
Incorrect:
- command=>output # Missing spaces
- input->command # Missing spaces
Operator Order
When using both operators, they must appear in this order:
- Input operator (->) first
- Command in the middle
- Output operator (=>) last
Data Precedence
Values populated with => have the highest precedence in the data context.
See Data and Templates for complete precedence rules.
Using Expressions with Data Flow
You can reference captured data in expressions:
dyngle:
operations:
process:
expressions:
message: "format('Processed {{count}} items')"
steps:
- curl -s "https://api.example.com/items" => items
- items -> jq 'length' => count
- echo "{{message}}"
Accessing Nested Properties
When data flow captures structured data (JSON, YAML), use dot notation to access nested properties:
dyngle:
operations:
weather:
steps:
- curl -s "https://api.example.com/weather" => weather-data
- echo "Temperature: {{weather-data.temperature}}"
- echo "City: {{weather-data.location.city}}"
Remember: Dot notation works only for dictionary properties, not for array indices. For arrays, extract values using expressions or tools like jq.
Next Steps
Sub-operations
Operations can call other operations as steps, enabling composability and code reuse.
Basic Usage
Use the sub: key to call another operation:
dyngle:
operations:
greet:
- echo "Hello!"
greet-twice:
steps:
- sub: greet
- sub: greet
Passing Arguments
Sub-operations can accept arguments using the args: key:
dyngle:
operations:
greet-person:
expressions:
person: "args[0]"
steps:
- echo "Hello, {{person}}!"
greet-team:
steps:
- sub: greet-person
args: ['Alice']
- sub: greet-person
args: ['Bob']
Scoping Rules
Sub-operations follow clear scoping rules that separate declared values from live data:
Declared Values (Locally Scoped)
Values and expressions declared via values: or expressions: keys are local to each operation:
- A parent operation's declared values are NOT visible to child sub-operations
- A child sub-operation's declared values do NOT leak to the parent operation
- Each operation only sees its own declared values plus global declared values
Live Data (Globally Shared)
Data assigned via the => operator persists across all operations:
- Live data populated by a sub-operation IS available to the parent after the sub-operation completes
- This allows operations to communicate results through shared mutable state
Example
dyngle:
values:
declared-val: global
operations:
child:
values:
declared-val: child-local
steps:
- echo {{declared-val}} # Outputs "child-local"
- echo "result" => live-data
parent:
steps:
- echo {{declared-val}} # Outputs "global"
- sub: child
- echo {{declared-val}} # Still outputs "global"
- echo {{live-data}} # Outputs "result" (persisted from child)
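The two scoping behaviors can be modeled in plain Python: declared values act like function locals, while live data acts like one shared module-level dictionary. A sketch of the rules, not Dyngle's machinery:

```python
live_data = {}  # one shared dictionary for the whole run

def run_child():
    declared = {"declared-val": "child-local"}  # scoped to this call only
    live_data["live-data"] = "result"           # survives after the call
    return declared["declared-val"]

def run_parent():
    declared = {"declared-val": "global"}
    run_child()
    # The child's declared values never reached this scope, but the
    # live data it wrote is visible here:
    return declared["declared-val"], live_data["live-data"]
```

`run_parent()` returns `("global", "result")`: the child's local declaration vanished with its scope, yet the shared live data persisted.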
Return Values from Sub-operations
When a sub-operation specifies a return: key, the parent operation can capture the return value:
dyngle:
operations:
get-temperature:
return: temp
steps:
- curl -s "https://api.example.com/weather" => data
- data -> jq -r '.temperature' => temp
weather-report:
steps:
- sub: get-temperature
=> temperature
- echo "Current temperature: {{temperature}} degrees"
Composition Patterns
Build Pipeline
dyngle:
operations:
install-deps:
- npm install
compile:
- npm run build
test:
- npm test
build:
description: Full build pipeline
steps:
- sub: install-deps
- sub: compile
- sub: test
Reusable Components
dyngle:
operations:
setup-env:
access: private
steps:
- echo "Setting up environment..."
- cp .env.production .env
deploy-frontend:
steps:
- sub: setup-env
- npm run deploy:frontend
deploy-backend:
steps:
- sub: setup-env
- npm run deploy:backend
Data Processing Chain
dyngle:
operations:
fetch-data:
return: raw-data
steps:
- curl -s "https://api.example.com/data" => raw-data
transform-data:
expressions:
input: "args[0]"
return: transformed
steps:
- input -> jq '.items' => transformed
process-all:
return: result
steps:
- sub: fetch-data
=> data
- sub: transform-data
args: [data]
=> result
Best Practices
Use Private Operations for Helpers
Mark helper operations as private to prevent direct execution:
dyngle:
operations:
deploy:
steps:
- sub: validate
- sub: build
- sub: upload
validate:
access: private
steps:
- echo "Validating configuration..."
build:
access: private
steps:
- npm run build
upload:
access: private
steps:
- aws s3 sync ./dist s3://my-bucket/
Use Private Operations for Secrets
Private operations are particularly useful for operations that generate or fetch secrets:
dyngle:
operations:
get-api-token:
access: private
return: token
steps:
- aws secretsmanager get-secret-value --secret-id api-token => secret
- secret -> jq -r '.SecretString' => token
call-api:
description: Make authenticated API call
steps:
- sub: get-api-token
=> token
- curl -H "Authorization: Bearer {{token}}" https://api.example.com/data
This prevents accidental exposure of the token operation via dyngle run or the MCP server.
Share Data via Return Values
Instead of relying on implicit live data sharing, prefer explicit return values:
Good:
dyngle:
operations:
get-version:
return: version
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
=> ver
- git tag "v{{ver}}"
Less clear:
dyngle:
operations:
get-version:
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
- git tag "v{{version}}" # Implicit dependency
Next Steps
Return Values
Operations can specify a return value that can be used by calling code or displayed when running the operation.
Basic Usage
Use the return: key to specify what value to return:
dyngle:
operations:
get-temperature:
return: temp
steps:
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -r '.temperature' => temp
When you run this operation, the value of temp will be displayed:
dyngle run get-temperature
Output:
72
Return Value Sources
The return: key can reference:
- Data values set via the => operator
- Values from the values: section
- Expressions from the expressions: section
dyngle:
operations:
get-info:
values:
static-value: Hello
expressions:
computed: "'World'"
return: result
steps:
- echo "Dynamic" => result
Output Formatting
Return values are formatted based on their type:
- Strings and simple types - Printed as-is
- Dictionaries and lists - Formatted as YAML
dyngle:
operations:
get-user:
return: user
steps:
- curl -s "https://api.example.com/user/123" => user
Output:
name: Alice Smith
email: alice@example.com
role: admin
Script Mode vs Function Mode
Operations behave differently depending on whether they have a return: key:
Script Mode (without return:)
Operations without return: behave like shell scripts - all command stdout is displayed:
dyngle:
operations:
build:
- echo "Starting build..."
- npm install
- npm run build
- echo "Build complete!"
All output is visible, making these ideal for build, deploy, and other workflow tasks.
Function Mode (with return:)
Operations with return: behave like functions - stdout is suppressed except for the return value:
dyngle:
operations:
get-temperature:
return: temp
steps:
- echo "Fetching weather..." # stdout suppressed
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -r '.temperature' => temp
Only the return value is displayed, making these ideal for data queries and transformations.
Important:
- stderr is always displayed in both modes
- The => operator works in both modes (capturing stdout to a variable)
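The stdout handling of the two modes can be sketched with subprocess: function mode pipes stdout away from the terminal, while stderr is never captured in either mode. An illustrative model, not Dyngle's code:

```python
import subprocess

def run_step(argv, function_mode):
    # Function mode holds stdout back (it may only surface as the return
    # value); script mode streams it straight to the terminal. stderr is
    # never redirected, so it stays visible in both modes.
    if function_mode:
        completed = subprocess.run(argv, stdout=subprocess.PIPE, text=True)
        return completed.stdout
    subprocess.run(argv)  # stdout goes directly to the terminal
    return None
```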
Using Return Values in Sub-operations
Parent operations can capture return values from sub-operations:
dyngle:
operations:
get-version:
return: version
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
=> ver
- git tag "v{{ver}}"
- git push origin "v{{ver}}"
MCP Server Integration
When operations are exposed via the MCP server:
- Operations with return: return {"result": <value>} with the computed value
- Operations without return: return {"result": null}
- Errors return {"error": "<message>"}
This makes operations with return values particularly useful as AI assistant tools:
dyngle:
operations:
get-weather:
description: Get current weather for a city
return: weather-info
expressions:
city: "args[0]"
steps:
- curl -s "https://api.example.com/weather?city={{city}}" => weather-info
An AI assistant could call this and receive structured data to incorporate into responses.
Examples
Simple string return
dyngle:
operations:
get-timestamp:
return: timestamp
expressions:
timestamp: "dtformat(datetime.now(), '%Y-%m-%d %H:%M:%S')"
Structured data return
dyngle:
operations:
system-info:
return: info
expressions:
info:
hostname: "args[0] if args else 'localhost'"
timestamp: "datetime.now()"
user: "'admin'"
Computed return value
dyngle:
operations:
calculate-total:
return: total
steps:
- curl -s "https://api.example.com/items" => items
- items -> jq '[.[] | .price] | add' => total
Next Steps
Display Options
Control how Dyngle displays operation steps during execution.
The --display Option
The run command supports a --display option to control step visibility:
dyngle run <operation> --display <mode>
Available Modes
steps (default)
Show each step before executing it:
dyngle run build --display steps
Output:
$ npm install
[npm install output here...]
$ npm run build
[build output here...]
none
Suppress step display for cleaner output:
dyngle run build --display none
Output:
[npm install output here...]
[build output here...]
When to Use Each Mode
Use steps mode when:
- Debugging operations - See exactly what commands are being executed
- Learning - Understand what's happening during execution
- Development - Verify that template substitution is working correctly
- Interactive use - Get visual confirmation of progress
Use none mode when:
- Scripting - Cleaner output for parsing or processing
- Production workflows - Reduce noise in logs
- Return value focused - When you only care about the final result
- Automated systems - CI/CD environments where step display is unnecessary
Examples
Development workflow with step display
dyngle run test --display steps
Useful for seeing exactly what test commands are being run.
Production deployment with clean output
dyngle run deploy --display none
Keeps deployment logs focused on command output without displaying each step.
Combining with return values
When an operation has a return value, none mode is particularly useful:
dyngle:
operations:
get-version:
return: version
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
With steps mode:
$ dyngle run get-version --display steps
Output:
$ cat package.json => pkg
$ pkg -> jq -r '.version' => version
1.2.3
With none mode:
$ dyngle run get-version --display none
Output:
1.2.3
Interaction with Return Values
Display options work with the script mode vs function mode behavior:
- Script mode (no return:): Step display controlled by the --display option
- Function mode (with return:): Stdout already suppressed; --display controls step visibility
See Return Values for more details on script vs function mode.
stderr is Always Displayed
Regardless of the --display setting, stderr is always shown. This ensures error messages and warnings remain visible:
dyngle run failing-operation --display none
Will still show error output from commands, even though steps are hidden.
Default Behavior
If you don't specify --display, it defaults to steps mode for maximum visibility during development and debugging.
Next Steps
Access Control
Operations can specify an access: attribute to control their visibility and usage.
Access Levels
public (default)
Public operations can be:
- Run directly via dyngle run
- Exposed as tools through the MCP server
- Listed in dyngle list-operations output
dyngle:
operations:
deploy:
access: public # Explicitly public (default if omitted)
description: Deploy to production
steps:
- sub: build
- aws s3 sync ./dist s3://my-bucket/
If access: is not specified, operations default to public.
private
Private operations can only be called as sub-operations by other operations. They cannot be:
- Run directly via `dyngle run` (this fails with an error)
- Exposed through the MCP server
- Listed in `dyngle list-operations` output
dyngle:
operations:
build:
access: private
steps:
- npm install
- npm run build
deploy:
steps:
- sub: build # OK - called as sub-operation
- aws s3 sync ./dist s3://my-bucket/
Attempting to run a private operation directly:
dyngle run build
Results in an error:
Error: Operation 'build' is private and cannot be run directly
Use Cases for Private Operations
Helper Operations
Extract common functionality into private helpers:
dyngle:
operations:
setup-environment:
access: private
steps:
- echo "Setting up environment..."
- export NODE_ENV=production
deploy-frontend:
steps:
- sub: setup-environment
- npm run deploy:frontend
deploy-backend:
steps:
- sub: setup-environment
- npm run deploy:backend
Secret Management
Prevent accidental exposure of operations that handle secrets:
dyngle:
operations:
get-api-token:
access: private
return: token
steps:
- aws secretsmanager get-secret-value --secret-id api-token => secret
- secret -> jq -r '.SecretString' => token
call-api:
description: Make authenticated API call
steps:
- sub: get-api-token => token
- curl -H "Authorization: Bearer {{token}}" https://api.example.com/data
This prevents:
- Running `dyngle run get-api-token` accidentally
- Exposing the token operation through the MCP server
- Listing the sensitive operation in `dyngle list-operations`
Internal Implementation Details
Hide implementation details from users:
dyngle:
operations:
validate-config:
access: private
steps:
- cat config.yml => config
- config -> python validate.py
transform-data:
access: private
steps:
- cat data.json => raw
- raw -> jq '.items' => items
process:
description: Process data with validation
steps:
- sub: validate-config
- sub: transform-data
- echo "Processing complete"
Users only see and can run process, not the internal helpers.
Multi-Step Workflows
Build complex workflows from smaller private operations:
dyngle:
operations:
install-dependencies:
access: private
steps:
- npm install
run-tests:
access: private
steps:
- npm test
build-artifacts:
access: private
steps:
- npm run build
upload-artifacts:
access: private
steps:
- aws s3 sync ./dist s3://my-bucket/
ci-pipeline:
description: Run full CI/CD pipeline
steps:
- sub: install-dependencies
- sub: run-tests
- sub: build-artifacts
- sub: upload-artifacts
Interaction with MCP Server
When running as an MCP server:
- Public operations are exposed as tools that AI assistants can discover and use
- Private operations are not exposed and cannot be called by AI assistants
This is important for security - private operations containing sensitive logic or credentials won't be accidentally triggered by AI assistants.
Best Practices
Make Public Operations User-Facing
Public operations should represent complete, user-facing actions:
dyngle:
operations:
deploy:
description: Deploy application to production
test:
description: Run the test suite
build:
description: Build the application
Make Private Operations Modular
Private operations should be focused, reusable components:
dyngle:
operations:
setup-env:
access: private
# Focused: just environment setup
validate:
access: private
# Focused: just validation
upload:
access: private
# Focused: just uploading
Use Descriptive Names
Since private operations won't be listed, use clear names that indicate their purpose:
dyngle:
operations:
_internal_cleanup: # prefix convention for internal ops
access: private
_fetch_credentials:
access: private
Next Steps
MCP Server
Dyngle can run as an MCP (Model Context Protocol) server, exposing operations as tools that AI assistants like Claude can execute.
What is MCP?
The Model Context Protocol (MCP) is a standardized protocol that allows AI assistants to discover and use external tools. When Dyngle runs as an MCP server, your configured operations become tools that AI assistants can call to perform tasks.
Starting the Server
Use the mcp command to start the server:
dyngle mcp
By default, this starts a server using the stdio transport, suitable for Claude Desktop integration.
Transport Options
stdio (default)
Standard input/output transport, ideal for Claude Desktop:
dyngle mcp
http
HTTP transport for remote connections:
dyngle mcp --transport http --host 127.0.0.1 --port 8000
sse
Server-Sent Events transport:
dyngle mcp --transport sse --host 127.0.0.1 --port 8000
How Operations Become Tools
When the MCP server starts:
- Each public operation becomes an MCP tool
- Private operations are not exposed
- Tools accept two parameters:
  - `data` - dictionary/JSON object (equivalent to stdin data)
  - `args` - list of arguments (equivalent to CLI arguments)
Tool Response Format
Tools return JSON responses:
Success:
{"result": <value>}
Where <value> is the operation's return value (if specified), or null if no return value.
Failure:
{"error": "<message>"}
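As a sketch, wrapping an operation's outcome in these two shapes might look like the following (a hypothetical helper for illustration, not Dyngle's actual server code):

```python
import json

def tool_response(operation):
    # Success -> {"result": <value>}; failure -> {"error": "<message>"}.
    try:
        return json.dumps({"result": operation()})
    except Exception as exc:
        return json.dumps({"error": str(exc)})

print(tool_response(lambda: "1.2.3"))  # {"result": "1.2.3"}
```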
Example Operation as Tool
dyngle:
operations:
get-weather:
description: Get current weather for a city
return: weather-info
expressions:
city: "args[0] if args else 'Unknown'"
steps:
- curl -s "https://api.example.com/weather?city={{city}}" => weather-info
An AI assistant can call this tool:
{
"tool": "get-weather",
"args": ["San Francisco"]
}
And receive:
{
"result": {
"temperature": 72,
"conditions": "Sunny",
"humidity": 65
}
}
Configuring Claude Desktop
To use Dyngle operations with Claude Desktop, configure the MCP server in Claude's configuration file.
macOS Configuration
Edit or create ~/Library/Application Support/Claude/claude_desktop_config.json:
{
"mcpServers": {
"dyngle": {
"command": "dyngle",
"args": ["mcp"]
}
}
}
Windows Configuration
Edit or create %APPDATA%/Claude/claude_desktop_config.json:
{
"mcpServers": {
"dyngle": {
"command": "dyngle",
"args": ["mcp"]
}
}
}
Specifying a Configuration File
Use a specific Dyngle configuration:
{
"mcpServers": {
"my-workflows": {
"command": "dyngle",
"args": ["--config", "/absolute/path/to/.dyngle.yml", "mcp"]
}
}
}
Important:
- Use absolute paths
- Restart Claude Desktop completely after editing (not just close the window)
- Tools appear in Claude's "Search and tools" interface
Design Considerations
Use Descriptions
Operations exposed via MCP should have clear descriptions:
dyngle:
operations:
deploy-staging:
description: Deploy the application to staging environment
steps:
- sub: build
- aws s3 sync ./dist s3://staging-bucket/
The description helps AI assistants understand when to use the tool.
Return Values for Tools
Operations used as tools should return meaningful values:
dyngle:
operations:
check-status:
description: Check deployment status
return: status
steps:
- curl -s "https://api.example.com/status" => status
This allows AI assistants to incorporate the result into their responses.
Private Operations for Secrets
Use private operations to protect sensitive operations:
dyngle:
operations:
get-credentials:
access: private
return: creds
steps:
- aws secretsmanager get-secret-value --secret-id api-creds => creds
make-api-call:
description: Call the API with authentication
steps:
- sub: get-credentials => creds
- curl -H "Authorization: {{creds}}" https://api.example.com/data
The get-credentials operation won't be exposed to AI assistants.
Example Use Cases
Development Workflow Tools
dyngle:
operations:
run-tests:
description: Run the test suite and report results
return: test-results
steps:
- pytest --json-report => results
- results -> jq '.summary' => test-results
check-coverage:
description: Check code coverage percentage
return: coverage
steps:
- pytest --cov=src --cov-report=json => cov-report
- cov-report -> jq '.totals.percent_covered' => coverage
Information Queries
dyngle:
operations:
get-latest-version:
description: Get the latest version from package.json
return: version
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
list-deployments:
description: List recent deployments
return: deployments
steps:
- aws s3api list-objects-v2 --bucket deployments --max-items 10 => deployments
System Operations
dyngle:
operations:
restart-service:
description: Restart the application service
return: status
steps:
- systemctl restart myapp => output
- echo "Service restarted successfully" => status
Troubleshooting
Server Not Showing Up
- Check JSON syntax - Validate `claude_desktop_config.json`
- Verify Dyngle is in PATH - Run `which dyngle` (macOS/Linux) or `where dyngle` (Windows)
- Use full path - Try the full path to the dyngle executable in the `command` field
- Restart Claude Desktop - Use Cmd+Q (macOS) or quit from the system tray (Windows)
Checking Logs (macOS)
Claude Desktop writes logs to ~/Library/Logs/Claude/:
tail -n 20 -f ~/Library/Logs/Claude/mcp*.log
Tool Execution Failures
- Test manually - Run the operation with `dyngle run` first
- Check arguments - Verify the AI assistant is passing correct data/args
- Review the error field - Check the `{"error": "..."}` response
- Check MCP logs - Look for stdout/stderr in the MCP server logs
Operations Not Appearing
- Verify access - Only public operations are exposed
- Check description - Operations without descriptions still work but are harder for AI to discover
- Restart Claude - Changes require a full restart of Claude Desktop
Security Considerations
- Only expose safe operations - Be mindful of what operations you make available
- Use private operations - Mark sensitive operations as `access: private`
- Validate inputs - Operations should handle unexpected inputs gracefully
- Limit access - Consider which operations should be accessible to AI assistants
See Security for more details.
Next Steps
Operation Lifecycle
Understanding how Dyngle executes operations helps you design effective workflows.
Execution Steps
When you run an operation, Dyngle follows this lifecycle:
1. Load Configuration
Dyngle locates and loads the configuration file using a specific search order.
See Configuration for the complete search order and details.
2. Process Imports
If the configuration contains imports:, Dyngle loads the imported files recursively.
3. Load Input Data
If stdin is not a tty (e.g., data is piped in), Dyngle parses it as YAML and adds it to the data context.
echo "name: Alice" | dyngle run greet
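A minimal sketch of this check in Python (not Dyngle's actual code; JSON stands in here for the YAML parsing Dyngle performs, to keep the example standard-library only):

```python
import json
import sys

def load_input_data():
    # Only read piped input; an interactive terminal contributes no data.
    # Dyngle parses stdin as YAML; json.loads is a stdlib stand-in here.
    if sys.stdin.isatty():
        return {}
    text = sys.stdin.read()
    return json.loads(text) if text.strip() else {}
```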
4. Find the Operation
Dyngle looks up the named operation in the configuration. If not found, it returns an error.
5. Check Access Control
For direct execution via dyngle run, Dyngle verifies the operation is public. Private operations can only be called as sub-operations.
6. Initialize Data Context
Dyngle builds the initial data context by merging (in order of precedence):
- Global values from `dyngle: values:`
- Global expressions from `dyngle: expressions:`
- Operation-specific values
- Operation-specific expressions
- Input data from stdin
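The merge above can be modeled as successive dictionary updates. This is a sketch with hypothetical values, assuming that later sources in the list override earlier ones on key conflicts:

```python
# Hypothetical data sources, mirroring the order listed above.
global_values = {"base": "Initial", "region": "us-east-1"}
global_expressions = {}                       # evaluated Python expressions
operation_values = {"local": "Added"}
operation_expressions = {"computed": "Calculated"}
stdin_data = {"region": "eu-west-1"}          # piped input overrides (assumed)

context = {}
for source in (global_values, global_expressions,
               operation_values, operation_expressions, stdin_data):
    context.update(source)

print(context["region"])  # eu-west-1
```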
7. Execute Steps
For each step in the operation:
Template Rendering
Dyngle renders templates in the step, replacing {{variable}} syntax with values from the data context.
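A rough sketch of this substitution step (a simplification, not Dyngle's actual implementation); it also raises on an undefined variable, matching the template-error behavior described below:

```python
import re

def render(template, context):
    # Replace each {{variable}} with its value from the data context.
    def lookup(match):
        name = match.group(1)
        if name not in context:
            raise KeyError(f"undefined template variable: {name}")
        return str(context[name])
    return re.sub(r"\{\{\s*([\w-]+)\s*\}\}", lookup, template)

print(render("echo Hello {{name}}", {"name": "Alice"}))  # echo Hello Alice
```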
Parse Step
Dyngle parses the step to identify:
- Data input operator (`->`)
- Command and arguments
- Data assignment operator (`=>`)
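One way to sketch this parse, assuming a single data name before `->` and a single name after `=>` (a simplification for illustration, not Dyngle's parser):

```python
import shlex

def parse_step(step):
    """Split a step into (input_name, command_tokens, output_name)."""
    tokens = shlex.split(step)
    input_name = output_name = None
    if "=>" in tokens:                 # data assignment operator
        i = tokens.index("=>")
        output_name = tokens[i + 1]
        tokens = tokens[:i]
    if "->" in tokens:                 # data input operator
        i = tokens.index("->")
        input_name = tokens[0]
        tokens = tokens[i + 1:]
    return input_name, tokens, output_name

print(parse_step("pkg -> jq -r '.version' => version"))
```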
Execute Command
Dyngle runs the command using Python's subprocess.run():
- If there's a data input operator, the value is passed to stdin
- stdout and stderr are captured
- The exit code is checked
Update Data Context
If there's a data assignment operator, stdout from the command is stored in the data context with the specified name.
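The execute-and-capture behavior above can be sketched with `subprocess.run` directly (the Python interpreter stands in for an arbitrary command so the example is portable):

```python
import subprocess
import sys

# Feed a value on stdin (the -> operator), capture stdout (the => operator),
# and fail fast on a non-zero exit code.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.stdin.read().upper())"],
    input="hello",
    capture_output=True,
    text=True,
)
result.check_returncode()
captured = result.stdout.strip()  # would be stored in the data context
print(captured)  # HELLO
```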
8. Handle Return Value
If the operation has a return: key:
- Dyngle looks up the specified value in the data context
- For `dyngle run`, it formats and displays the value
- For sub-operations, the value is made available to the parent
- For MCP tools, the value is returned as `{"result": <value>}`
Data Context Evolution
The data context changes throughout execution:
dyngle:
values:
base: Initial
operations:
example:
values:
local: Added
expressions:
computed: "'Calculated'"
steps:
- echo "first" => step1 # Adds 'step1' to context
- echo "second" => step2 # Adds 'step2' to context
Data context evolution:
- Start: `{base: "Initial"}`
- After init: `{base: "Initial", local: "Added", computed: "Calculated"}`
- After step 1: `{base: "Initial", local: "Added", computed: "Calculated", step1: "first"}`
- After step 2: `{base: "Initial", local: "Added", computed: "Calculated", step1: "first", step2: "second"}`
Sub-operation Lifecycle
When a step calls a sub-operation, scoping rules determine what data is shared between parent and child operations.
See Sub-operations for detailed explanation of declared values vs live data scoping.
Error Handling
Command Failures
If a command returns a non-zero exit code:
- Dyngle stops execution
- stderr is displayed
- The operation returns an error
- For MCP tools, returns `{"error": "<message>"}`
Template Errors
If a template references an undefined variable:
- Dyngle raises an error
- Execution stops
- The error message identifies the missing variable
Access Control Errors
If you try to run a private operation directly:
- Dyngle checks the `access:` attribute
- If private, returns an error immediately
- Execution never starts
Performance Considerations
Command Execution
Each step spawns a new subprocess:
- Minimal overhead for simple commands
- Consider batching operations when possible
Template Rendering
Templates are rendered for each step:
- Efficient for small datasets
- Large data structures are passed by reference in the context
Data Flow
Data captured with => is held in memory:
- Suitable for typical command output
- Be cautious with very large outputs (e.g., large files)
Next Steps
Security
Dyngle is designed for workflow automation, but understanding its security characteristics is important for safe usage.
Command Execution Model
No Shell Interpretation
Commands are executed directly using Python's subprocess.run() with arguments parsed in a shell-like fashion:
Safe:
- Arguments are passed as separate parameters to the subprocess
- No shell meta-characters are interpreted (`|`, `>`, `&&`, `;`, etc.)
- Environment variables are not automatically expanded
This reduces common shell injection risks:
dyngle:
operations:
safe-example:
steps:
- echo "User input: {{user-input}}"
- curl "{{url}}"
Even if user-input contains shell meta-characters, they won't be interpreted as shell commands.
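This can be demonstrated with `subprocess.run` directly: passing an argument list (no shell) means meta-characters arrive as literal text (the Python interpreter stands in for an arbitrary command here):

```python
import subprocess
import sys

user_input = "hello; rm -rf / && echo pwned"  # dangerous under a shell

# With an argument list and no shell, the whole string arrives as a
# single literal argument; nothing is interpreted or executed.
result = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", user_input],
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # hello; rm -rf / && echo pwned
```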
Template Substitution
Templates are evaluated before commands are executed:
dyngle:
operations:
example:
steps:
- echo "Hello {{name}}"
Security implications:
- Template values are substituted as strings
- Values from untrusted sources could still inject command arguments
- Validate and sanitize external inputs before use
Configuration Security
Not Robust to Malicious Configuration
Important: Dyngle is not designed to be robust against malicious configuration files.
- Configuration files contain Python expressions that are evaluated
- Expressions have access to system capabilities (file operations, networking, etc.)
- Only use configuration files from trusted sources
Never run Dyngle with untrusted configuration files.
Configuration File Permissions
Protect your configuration files:
# Set appropriate permissions
chmod 600 ~/.dyngle.yml
chmod 600 .dyngle.yml
This is especially important if your configuration contains:
- Credentials or API keys
- Sensitive operation logic
- Private operations that manage secrets
Expression Evaluation
Python Execution Context
Expressions are evaluated in a restricted Python context:
Available:
- Read-only operations (mostly)
- Standard library modules (`datetime`, `math`, etc.)
- Limited file system access via `Path()` (current directory only)
Security notes:
- Expressions can still perform many operations
- The restriction is not a security sandbox
- Malicious expressions could cause harm
Path() Restrictions
The Path() function in expressions is restricted to the current working directory:
dyngle:
expressions:
config: "Path('.dyngle.yml')" # OK
# bad: "Path('/etc/passwd')" # Error
This prevents expressions from accessing arbitrary filesystem locations.
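A sketch of how such a restriction could be enforced (an assumed mechanism for illustration, not Dyngle's actual code):

```python
from pathlib import Path

def restricted_path(name):
    # Resolve the path and refuse anything outside the working directory.
    resolved = Path(name).resolve()
    if not resolved.is_relative_to(Path.cwd()):
        raise PermissionError(f"path escapes working directory: {name}")
    return resolved
```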
MCP Server Security
Tool Exposure
When running as an MCP server, operations become accessible to AI assistants:
Public operations:
- Exposed as tools
- Can be called by AI assistants
- May be called with unexpected inputs
Private operations:
- Not exposed via MCP
- Cannot be called by AI assistants
- Useful for sensitive operations
Best Practices for MCP
- Use access control:
dyngle:
operations:
get-secret:
access: private # Not exposed
return: secret
steps:
- aws secretsmanager get-secret-value --secret-id api-key => secret
public-operation:
description: Safe operation
steps:
- sub: get-secret => secret
- curl -H "Authorization: {{secret}}" https://api.example.com
- Validate inputs:
dyngle:
operations:
deploy:
description: Deploy to environment
expressions:
valid-env: "args[0] if args and args[0] in ['staging', 'production'] else None"
steps:
- echo "Deploying to {{valid-env}}"
- Limit operation scope:
  - Don't expose operations that can modify critical systems
  - Consider read-only operations for MCP exposure
  - Use private sub-operations for sensitive steps
Secrets Management
Avoid Hardcoded Secrets
Bad:
dyngle:
values:
api-key: secret-key-123 # Never do this
Better:
dyngle:
operations:
get-api-key:
access: private
return: key
steps:
- aws secretsmanager get-secret-value --secret-id api-key => secret
- secret -> jq -r '.SecretString' => key
Use Private Operations
Operations that handle secrets should be private:
dyngle:
operations:
fetch-credentials:
access: private
return: creds
steps:
- # Fetch from secure source
authenticated-operation:
steps:
- sub: fetch-credentials => creds
- # Use credentials
This prevents:
- Direct execution via `dyngle run`
- Exposure via `dyngle list-operations`
- Access through the MCP server
Input Validation
Data from stdin
Data piped to operations should be validated:
dyngle:
operations:
process-user-data:
expressions:
validated-email: "email if '@' in email else 'invalid@example.com'"
steps:
- echo "Processing: {{validated-email}}"
Command Arguments
Arguments passed to operations should be validated:
dyngle:
operations:
deploy:
expressions:
environment: "args[0] if args and args[0] in ['dev', 'staging', 'prod'] else None"
steps:
- echo "Deploying to {{environment}}"
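The guard above can be exercised in plain Python. Assuming, as elsewhere in these docs, that expressions see an `args` list, the expression yields the argument only when it is present and whitelisted:

```python
# The same guard expression as in the configuration above.
expr = "args[0] if args and args[0] in ['dev', 'staging', 'prod'] else None"

# Evaluate it against different argument lists.
assert eval(expr, {"args": ["staging"]}) == "staging"  # whitelisted
assert eval(expr, {"args": ["typo"]}) is None          # rejected
assert eval(expr, {"args": []}) is None                # missing, no IndexError
```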
General Best Practices
- Principle of Least Privilege:
  - Only expose necessary operations publicly
  - Use private operations for internal logic
  - Limit MCP tool exposure
- Configuration Management:
  - Store configurations in version control (without secrets)
  - Use separate configs for different environments
  - Protect configuration files with appropriate permissions
- Secret Handling:
  - Never hardcode secrets
  - Use secret management services (AWS Secrets Manager, etc.)
  - Keep secret-handling operations private
- Input Validation:
  - Validate all external inputs
  - Use expressions to sanitize data
  - Set reasonable defaults
- Audit and Monitor:
  - Review operations before exposing via MCP
  - Monitor operation execution in production
  - Keep audit logs when appropriate
Reporting Security Issues
If you discover a security vulnerability in Dyngle, please report it through the appropriate channels rather than creating public issues.