Dyngle
⚠️ DISCLAIMER: This is a hobby/personal project. Not a commercial product. Not for production use.
Run lightweight local workflows
An experimental, lightweight, easily configurable workflow engine for automating development, operations, data processing, and content management tasks.
Technical Foundations
- Configuration, task definition, and flow control in YAML - Define your workflows in declarative YAML files
- Operations as system commands - Use familiar shell-like syntax for executing commands
- Expressions and logic in pure Python - Leverage Python for dynamic values and logic
Key Features
- Simple Operation Definition - Define workflows as arrays of system commands (learn more)
- Data Flow Operators - Pipe data between steps with => and -> (learn more)
- Conditional Steps - Execute steps based on boolean conditions with if/then/else (learn more)
- Template Substitution - Use {{variable}} or $variable syntax to inject data into commands (learn more)
- Python Expressions - Evaluate Python code for dynamic values (learn more)
- Sub-operations - Compose operations from other operations (learn more)
- MCP Server Support - Expose operations as tools for AI assistants (learn more)
Use Cases
Dyngle is designed for:
- Development workflow automation (build, test, deploy)
- Operations tasks (server management, monitoring)
- Data processing pipelines
- Content management workflows
- AI assistant tool integration
Read more
Quick Start
Get up and running with Dyngle in minutes.
Installation
Dyngle requires Python 3.13 or later.
On macOS, try:
brew install pipx
Then:
pipx install dyngle
On Debian with Python already installed, try:
python3.13 -m pip install pipx
pipx install dyngle
In containers:
pip install dyngle
After installation, verify that Dyngle is working:
dyngle --help
You should see the command-line help output.
Getting Started
Create a file called .dyngle.yml in your current directory:
dyngle:
  operations:
    hello:
      - echo "Hello world"
Run the operation:
dyngle run hello
You should see:
Hello world
Referencing CLI arguments
Operations can reference arguments passed in to the run command. Update your .dyngle.yml:
dyngle:
  operations:
    hello:
      - echo "Hello {{runtime.args.0}}!"
Run the operation:
dyngle run hello 'Ada'
You should see:
Hello Ada!
Referencing structured data
Command-line arguments are convenient, but many operations require a more complete data structure. The simplest way to provide structured input is to pipe YAML text to the run command.
dyngle:
  operations:
    hello:
      - echo "Hello {{name}}!"
Run the operation:
echo "name: Katherine Johnson" | dyngle run hello
Output:
Hello Katherine Johnson!
Specifying an input data schema
With structured input data, it's possible to specify a schema which is validated when the operation runs. Use accepts: for that, and also move the commands a level down into a steps: entry.
dyngle:
  operations:
    hello:
      accepts:
        full-name:
          required: true
      steps:
        - echo "Hello {{full-name}}!"
This will fail:
echo "name: Jane" | dyngle run hello
But of course this works:
echo "full-name: Jane Goodall" | dyngle run hello
Performing logic with Python
Dyngle supports expressions written in Python, restricted to a limited set of read-only operations.
The basic idea: Python for calculating, steps for doing things.
dyngle:
  operations:
    hello:
      accepts:
        first-name:
        last-name:
          required: true
      expressions:
        full-name: (first_name if first_name else 'Ms.') + ' ' + last_name
      steps:
        - echo "Hello {{full-name}}!"
echo "last-name: Curie" | dyngle run hello
Hello Ms. Curie!
Creating AI tools
Dyngle includes an MCP server to run its operations. To use it, we need to direct the output from the operation to a variable, and return the variable, so the operation becomes like a function.
dyngle:
  operations:
    hello:
      accepts:
        first-name:
        last-name:
      expressions:
        full-name: (first_name if first_name else 'Mx.') + ' ' + last_name
      steps:
        - echo "{{full-name}} says 'nice to meet you'." => greeting
      returns: greeting
To try it in Claude Desktop, edit or create ~/Library/Application Support/Claude/claude_desktop_config.json (or the equivalent on operating systems other than macOS).
{
  "mcpServers": {
    "dyngle": {
      "command": "dyngle",
      "args": ["--config", "/absolute/path/to/your/project/.dyngle.mcp.yml", "mcp"]
    }
  }
}
Then try a prompt:
Say hello to Jennifer Doudna using the Dyngle tool.
Read more
Operations
Operations are the fundamental building blocks in Dyngle. An operation is a named sequence of steps that execute commands, a set of expressions evaluated in real time, or both.
Basic Structure
Operations are defined under dyngle: in the configuration. The simplest form is a YAML array of command steps:
dyngle:
  operations:
    hello:
      - echo "Hello world"
    build:
      - npm install
      - npm run build
Run an operation:
dyngle run hello
Operation Definition
When you need additional attributes beyond just steps, use the extended form with a steps: key:
dyngle:
  operations:
    build:
      description: Build the project for production
      access: public
      returns: build-info
      expressions:
        build-time: "datetime.now()"
      steps:
        - npm install
        - npm run build
        - echo "{{build-time}}" => build-info
Omitting steps:
The steps: attribute is optional if all you need is expressions. This is convenient for simple operations that compute and return values without executing commands:
dyngle:
  operations:
    get-timestamp:
      returns: timestamp
      expressions:
        timestamp: "dtformat(datetime.now(), '%Y-%m-%d %H:%M:%S')"
Note that there are two kinds of steps - details in later sections:
- Command steps (the default) - Execute system commands
- Sub-operation steps - Call other operations
Operation Attributes
description
An optional description that appears in dyngle list-operations output:
dyngle:
  operations:
    deploy:
      description: Deploy to production
      steps:
        - sub: build
        - aws s3 sync ./dist s3://my-bucket/
access
Controls visibility and usage of operations.
public (default)
Public operations can be:
- Run directly via dyngle run
- Exposed as tools through the MCP server
- Listed in dyngle list-operations output
dyngle:
  operations:
    deploy:
      access: public # Explicitly public (default if omitted)
      description: Deploy to production
      steps:
        - sub: build
        - aws s3 sync ./dist s3://my-bucket/
If access: is not specified, operations default to public.
private
Private operations can only be called as sub-operations by other operations. They cannot be:
- Run directly via dyngle run (will fail with an error)
- Exposed through the MCP server
- Listed in dyngle list-operations output
dyngle:
  operations:
    build:
      access: private
      steps:
        - npm install
        - npm run build
    deploy:
      steps:
        - sub: build # OK - called as sub-operation
        - aws s3 sync ./dist s3://my-bucket/
Use private operations for:
- Helper operations that shouldn't be run directly
- Operations that handle secrets
- Internal implementation details
- Components of larger workflows
returns
Specifies what value to return. See Output modes for details.
dyngle:
  operations:
    get-temperature:
      returns: temp
      steps:
        - curl -s "https://api.example.com/weather" => weather-data
        - weather-data -> jq -r '.temperature' => temp
accepts
Defines a schema to validate and default input data. When specified, inputs (from stdin, send:, or JSON via MCP) are validated before execution.
Basic example:
dyngle:
  operations:
    greet-user:
      accepts:
        name: { type: string }
        age: { type: integer }
      steps:
        - echo "Hello {{name}}, age {{age}}"
The accepts: attribute creates a contract for your operation, ensuring it receives the expected data structure. See Inputs and interfaces for complete syntax and validation details.
expressions
Local expressions available only within this operation. See Constants and expressions for details.
dyngle:
  operations:
    greet:
      expressions:
        greeting: "'Hello ' + name + '!'"
      steps:
        - echo "{{greeting}}"
constants
Local constants available only within this operation. See Constants and expressions for details.
dyngle:
  operations:
    deploy:
      constants:
        environment: production
        region: us-west-2
      steps:
        - echo "Deploying to {{environment}} in {{region}}"
Operation Inputs
Inputs are values that enter an operation from outside. There are three ways inputs are provided to an operation:
Via stdin (in the run command)
Pass YAML data through stdin:
printf 'name: Alice\nage: 30\n' | dyngle run greet-user
Via send: (in sub-operations)
Parent operations can pass data to sub-operations:
dyngle:
  operations:
    child:
      accepts:
        name: { type: string }
      steps:
        - echo "Hello {{name}}"
    parent:
      constants:
        user-data:
          name: Bob
      steps:
        - sub: child
          send: user-data
See Sub-operations for details.
Via JSON (through MCP)
When operations are called through the MCP server, inputs are provided as JSON:
{
  "tool": "greet-user",
  "name": "Alice",
  "age": 30
}
See MCP Server for details.
Examples
Development workflow
dyngle:
  operations:
    test:
      description: Run the test suite
      steps:
        - pytest --cov=src tests/
        - coverage report
    lint:
      description: Check code style
      steps:
        - black --check src/
        - flake8 src/
    ci:
      description: Run all checks
      steps:
        - sub: test
        - sub: lint
Data processing
dyngle:
  operations:
    process-data:
      returns: result
      steps:
        - curl -s "https://api.example.com/data" => raw-data
        - raw-data -> jq '.items' => filtered
        - filtered -> python process.py => result
Next Steps
- Learn about command steps
- Understand operation context
- Explore inputs and interfaces
- Review best practices
Command Steps
Command steps are the default type of operation step in Dyngle. They execute system commands directly, without using a shell.
Not a Shell
Important: Operation steps use shell-like syntax but are not executed in a shell. Commands are parsed and executed directly by Python's subprocess.run().
This means shell-specific features won't work:
Doesn't work (shell syntax):
- echo "Hello" | grep Hello # Use data flow operators instead
- export VAR=value # Use expressions or values instead
- ls > files.txt # Use data flow operators instead
- cd /some/path && ls # Each step is independent
Works (Dyngle syntax):
- echo "Hello world"
- npm install
- python script.py --arg value
- curl -s "https://api.example.com/data"
Use Dyngle's features for data flow, variable substitution, and composability instead of shell features.
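The distinction can be sketched in Python. The following is a minimal model of how a command step might be executed without a shell (an illustration using shlex-style tokenization, not Dyngle's actual internals):

```python
import shlex
import subprocess

def run_command_step(command, stdin_text=None):
    """Run one command step directly, with no shell involved.

    The string is tokenized shell-style, then executed as an argv
    list, so pipes, redirection, and `&&` are NOT interpreted.
    """
    argv = shlex.split(command)  # 'echo "Hello world"' -> ['echo', 'Hello world']
    result = subprocess.run(
        argv,
        input=stdin_text,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

print(run_command_step('echo "Hello world"'))
```

Because the command never passes through a shell, a step like `echo "Hello" | grep Hello` would hand the literal arguments `|`, `grep`, and `Hello` to echo rather than creating a pipeline.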
Template Syntax
Use double-curly-bracket syntax ({{ and }}) to inject values from the operation context into commands:
dyngle:
  operations:
    hello:
      - echo "Hello {{name}}!"
Templates are rendered before the command executes, replacing {{variable}} with the value from the operation context.
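The rendering step can be sketched with a regular expression (a simplified illustration; Dyngle's real renderer also resolves dotted context paths and other syntax):

```python
import re

def render_template(template, context):
    """Replace {{name}} placeholders with values from a context dict."""
    def lookup(match):
        key = match.group(1).strip()
        return str(context[key])

    return re.sub(r"\{\{([^{}]+)\}\}", lookup, template)

print(render_template('echo "Hello {{name}}!"', {"name": "Francis"}))
# echo "Hello Francis!"
```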
Using Templates
The initial context used for command steps may come from YAML over stdin to the Dyngle run command.
echo "name: Francis" | dyngle run hello
Output:
Hello Francis!
Nested Properties
The double-brackets may contain a context path, using dot-separated strings and ints to navigate nested dicts and lists.
dyngle:
  operations:
    weather-report:
      steps:
        - curl -s "https://api.example.com/weather" => weather
        - 'echo "Temperature: {{weather.temperature}}"'
        - 'echo "Location: {{weather.location.city}}, {{weather.location.country}}"'
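Resolving such a dotted path can be modeled with a small helper (a hypothetical function illustrating the documented behavior, not Dyngle's code):

```python
def resolve_path(context, path):
    """Walk a dot-separated context path through nested dicts and lists.

    Numeric segments index into lists; other segments are dict keys.
    """
    current = context
    for segment in path.split("."):
        if isinstance(current, list):
            current = current[int(segment)]
        else:
            current = current[segment]
    return current

weather = {"temperature": 21, "location": {"city": "Oslo", "country": "NO"}}
print(resolve_path(weather, "location.city"))  # Oslo
```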
Context paths also work with list values produced by expressions:
dyngle:
  constants:
    python-command: /bin/python3
  expressions:
    unittest-command: format('{{python-command}} -m unittest').split()
  operations:
    test:
      - '{{unittest-command.0}} {{unittest-command.1}} {{unittest-command.2}} test/*'
YAML type inference
YAML type inference impacts how Dyngle configurations are parsed. For example, if a string contains a colon, the YAML parser might interpret the content to the left of the colon as an object key.
- echo 'Temperature: {{temperature}}' => data # YAML parsing error!
One solution is to enclose the entire string entry in single or double quotes.
- "echo 'Temperature: {{temperature}}' => data"
Another is to use a YAML multiline string.
- >-
echo 'Temperature: {{temperature}}' => data
Remember that the YAML parser sends the entire string to Dyngle to parse, so the command and data flow operators must all occupy the same string entry in YAML.
Data Flow Operators
Dyngle provides two operators to pass data between steps.
The Send Operator (->)
The send operator passes a value from the operation context as stdin to a command:
dyngle:
  operations:
    process-data:
      steps:
        - curl -s "https://api.example.com/data" => raw-data
        - raw-data -> jq '.items' => filtered
        - filtered -> python process.py
The value is passed to the command's standard input.
Automatic YAML Conversion
When using the send operator, values are automatically converted to YAML format before being passed to stdin:
- Strings are passed as-is without modification
- Dicts and lists are serialized to YAML format
- Numbers and other types are converted to their string representation
This makes it easy to pass structured data to commands that expect YAML input:
dyngle:
  operations:
    process-config:
      constants:
        config:
          host: localhost
          port: 8080
          enabled: true
      steps:
        - config -> python process_config.py
The command receives:
host: localhost
port: 8080
enabled: true
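The conversion rules can be sketched for the flat-dict case (an illustration only; Dyngle presumably uses a full YAML serializer for nested structures):

```python
def to_stdin_text(value):
    """Convert a context value to the text sent to a command's stdin."""
    if isinstance(value, str):
        return value  # strings pass through unchanged

    if isinstance(value, dict):
        def scalar(v):
            # YAML spells booleans in lowercase
            if isinstance(v, bool):
                return "true" if v else "false"
            return str(v)
        return "".join(f"{k}: {scalar(v)}\n" for k, v in value.items())

    return str(value)  # numbers and other types

config = {"host": "localhost", "port": 8080, "enabled": True}
print(to_stdin_text(config))
```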
The Receive Operator (=>)
The receive operator captures stdout from a command and assigns it to a named variable in the operation context:
dyngle:
  operations:
    fetch-data:
      steps:
        - curl -s "https://api.example.com/users" => users
        - 'echo "Fetched: {{users}}"'
The captured value becomes available for subsequent steps.
Combining Operators
You can use both operators in a single step:
<input-variable> -> <command> => <output-variable>
Example:
dyngle:
  operations:
    weather:
      steps:
        - curl -s "https://api.example.com/weather" => weather-data
        - weather-data -> jq -j '.temperature' => temp
        - 'echo "Temperature: {{temp}} degrees"'
Operator Order
When using both operators, they must appear in this order:
- Send operator (->) first
- Command in the middle
- Receive operator (=>) last
Operator Spacing
Operators must be isolated with whitespace:
Correct:
- command => output
- input -> command
- input -> command => output
Incorrect:
- command=>output # Missing spaces
- input->command # Missing spaces
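The whitespace rule can be modeled by splitting only on space-padded operator tokens (an illustrative sketch, not Dyngle's actual parser):

```python
def parse_step(step):
    """Split a command step into (input_var, command, output_var).

    Operators count only when isolated by whitespace, so a string
    like 'command=>output' is treated as part of the command.
    """
    input_var = output_var = None
    if " => " in step:
        step, output_var = step.rsplit(" => ", 1)
    if " -> " in step:
        input_var, step = step.split(" -> ", 1)
    return input_var, step.strip(), output_var

print(parse_step("raw-data -> jq '.items' => filtered"))
# ('raw-data', "jq '.items'", 'filtered')
```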
Similarity to Sub-operations
Note the similarity between these operators in command steps and the send: and receive: attributes in sub-operation steps:
Command step:
- input-data -> command => output-data
Sub-operation step:
- sub: operation-name
send: input-data
receive: output-data
Both follow the same pattern of sending input and receiving output. See Sub-operations for details.
Practical Examples
API Data Processing
dyngle:
  operations:
    get-user-emails:
      returns: emails
      steps:
        - curl -s "https://api.example.com/users" => users
        - users -> jq -r '.[].email' => emails
Multi-step Pipeline
dyngle:
  operations:
    analyze-logs:
      returns: summary
      steps:
        - curl -s "https://logs.example.com/today" => logs
        - logs -> grep "ERROR" => errors
        - errors -> wc -l => error-count
        - echo "Found {{error-count}} errors" => summary
Data Transformation
dyngle:
  operations:
    transform-json:
      returns: result
      steps:
        - cat input.json => raw
        - raw -> jq '.data | map({id, name})' => transformed
        - transformed -> python format.py => result
Important Notes
Variables Created with =>
Values populated with the receive operator (=>) have the highest precedence in the operation context. They override constants, expressions, and inputs with the same name.
See Operation context for complete precedence rules.
Working with Structured Data
When data flow captures structured data (JSON, YAML), use a context path to access nested properties in templates:
dyngle:
  operations:
    weather:
      steps:
        - curl -s "https://api.example.com/weather" => weather-data
        - 'echo "Temperature: {{weather-data.temperature}}"'
        - 'echo "City: {{weather-data.location.city}}"'
dyngle:
  operations:
    process-items:
      expressions:
        items: from_json(get('items-json'))
      steps:
        - curl -s "https://api.example.com/items" => items-json
        - echo 'The first item is called {{items.0.name}}'
Using Expressions with Data Flow
You can reference captured data in expressions:
dyngle:
  operations:
    process:
      expressions:
        message: "format('Processed {{count}} items')"
      steps:
        - curl -s "https://api.example.com/items" => items
        - items -> jq 'length' => count
        - echo "{{message}}"
Prompt Steps
Prompt steps allow operations to pause and wait for user input, even when the operation receives data via stdin.
Basic Syntax
Use the prompt: key to display a message and wait for user input:
dyngle:
  operations:
    wait-for-user:
      steps:
        - prompt: "Press enter to continue"
        - echo "Continuing..."
Capturing Input
Use the receive: attribute to capture the user's input into a variable:
dyngle:
  operations:
    get-name:
      steps:
        - prompt: "Enter your name: "
          receive: user-name
        - echo "Hello {{user-name}}!"
The captured input becomes available in the operation context for use in subsequent steps.
Template Support
Prompt messages support template substitution:
dyngle:
  constants:
    app-name: "MyApp"
  operations:
    welcome:
      steps:
        - prompt: "Welcome to {{app-name}}! Press enter to start"
          receive: confirm
        - echo "Starting {{app-name}}..."
How It Works
Prompt steps use WizLib's UI handler to access the terminal (TTY) directly. This means prompts work even when stdin is redirected for passing data to the operation:
echo "config: value" | dyngle run my-operation
The operation can receive YAML data via stdin AND prompt the user for input during execution.
Use Cases
Interactive confirmation:
dyngle:
  operations:
    deploy:
      steps:
        - echo "About to deploy to production"
        - prompt: "Type 'yes' to confirm: "
          receive: confirmation
        - echo "Deploying..." # Only if user confirmed
Collecting user input:
dyngle:
  operations:
    setup:
      steps:
        - prompt: "Enter project name: "
          receive: project-name
        - prompt: "Enter author name: "
          receive: author
        - echo "Creating {{project-name}} by {{author}}"
Pausing for review:
dyngle:
  operations:
    analyze:
      steps:
        - curl -s "https://api.example.com/data" => data
        - echo "{{data}}"
        - prompt: "Review the data above. Press enter to continue"
        - data -> python process.py
Important Notes
TTY Required: Prompt steps require a TTY (interactive terminal). They will fail in non-interactive environments like cron jobs or CI/CD pipelines unless those environments provide TTY access.
stdin vs TTY: Operations can receive structured data via stdin (for YAML input) and still prompt users interactively. These are independent mechanisms:
- stdin: For passing data to operations
- TTY: For interactive user prompts
receive: Behavior:
When using receive:, the captured input:
- Can be an empty string if the user just presses enter
- Becomes available immediately in the operation context
- Has the same precedence as other variables created during execution
Conditional Steps
Conditional steps allow operations to execute different steps based on boolean conditions.
Basic Syntax
Use if:, then:, and optionally else: to create conditional blocks:
dyngle:
  operations:
    deploy:
      expressions:
        is-production: "environment == 'production'"
      steps:
        - if: is-production
          then:
            - echo "Deploying to production"
          else:
            - echo "Deploying to staging"
Structure
- if: - References a value from the context to evaluate as a boolean
- then: - List of steps to execute when the condition is true
- else: - (Optional) List of steps to execute when the condition is false
Condition Evaluation
The value referenced by if: is evaluated using Python's truthiness rules:
- True, non-zero numbers, and non-empty strings/lists/dicts evaluate to true
- False, None, 0, and empty strings/lists/dicts evaluate to false
The condition is evaluated in the operation's context, which includes:
- Operation inputs (from stdin, send:, or MCP)
- Global constants and expressions
- Local constants and expressions
- Variables captured from previous steps (via => or receive:)
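The evaluation rule can be sketched in Python (an illustration assuming the condition is a plain context key; not Dyngle's actual code):

```python
def evaluate_condition(context, name):
    """Resolve an if: reference and apply Python truthiness.

    A missing name raises, mirroring the documented error behavior.
    """
    if name not in context:
        raise KeyError(f"Missing reference in operation context: {name}")
    return bool(context[name])

ctx = {"is-prod": True, "errors": [], "count": 0, "name": "Ada"}
print(evaluate_condition(ctx, "is-prod"))  # True
print(evaluate_condition(ctx, "errors"))   # False: empty list
print(evaluate_condition(ctx, "name"))     # True: non-empty string
```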
Examples
Basic conditional:
dyngle:
  operations:
    check-env:
      expressions:
        is-prod: "env == 'production'"
      steps:
        - if: is-prod
          then:
            - echo "Production environment"
          else:
            - echo "Non-production environment"
Without else block:
dyngle:
  operations:
    optional-cleanup:
      expressions:
        should-cleanup: "cleanup_enabled"
      steps:
        - echo "Running main task"
        - if: should-cleanup
          then:
            - echo "Cleaning up"
            - rm -rf temp/
With variables from previous steps:
dyngle:
  operations:
    check-status:
      expressions:
        is-healthy: "status_code == '200'"
      steps:
        - curl -s -o /dev/null -w "%{http_code}" "{{url}}" => status-code
        - if: is-healthy
          then:
            - echo "Service is healthy"
          else:
            - echo "Service is unhealthy"
With sub-operations:
dyngle:
  operations:
    validate:
      steps:
        - echo "data"
    process:
      expressions:
        needs-validation: "validate_required"
      steps:
        - if: needs-validation
          then:
            - sub: validate
              send: data
              receive: validated
            - 'echo "Validated: {{validated}}"'
          else:
            - echo "Skipping validation"
Nested conditionals:
dyngle:
  operations:
    deploy:
      expressions:
        is-production: "environment == 'production'"
        requires-approval: "deployment_type == 'major'"
      steps:
        - if: is-production
          then:
            - if: requires-approval
              then:
                - prompt: "Type 'yes' to approve: "
                  receive: approval
                - echo "Deploying with approval"
              else:
                - echo "Deploying without approval"
          else:
            - echo "Deploying to staging"
Variable Scope
Variables created within then: or else: blocks (via => or receive:) are available to subsequent steps in the operation, regardless of which branch executed:
dyngle:
  operations:
    example:
      expressions:
        use-default: "value == ''"
      steps:
        - if: use-default
          then:
            - echo "default" => result
          else:
            - echo "{{value}}" => result
        - 'echo "Result: {{result}}"' # Available here
All Step Types Supported
Conditional blocks support all step types:
- Command steps (with -> and => operators)
- Sub-operation steps (with sub:, send:, receive:)
- Prompt steps (with prompt:, receive:)
- Nested conditional steps
Error Handling
If the value referenced by if: does not exist in the operation context, the operation will fail with an error indicating the missing reference.
Operation Context
The operation context is all of the information available within an operation for use in templates, expressions, return:, and other places. Understanding the context is key to building effective operations.
What is Context?
Context encompasses all named values accessible within an operation, regardless of how they're defined or populated. The context includes:
- Constants - Defined with constants:
- Expressions - Defined with expressions:
- Inputs - Data entering the operation from outside
- Variables - Values set during execution with receive: or =>
Inputs
Inputs are values that enter the operation from outside sources. There are four ways inputs are provided:
Via stdin (in the run command)
Pass YAML data through stdin:
echo "name: Alice" | dyngle run hello
Via send: (in sub-operations)
Parent operations pass data to sub-operations:
dyngle:
  operations:
    child:
      steps:
        - echo "Hello {{name}}"
    parent:
      constants:
        data:
          name: Bob
      steps:
        - sub: child
          send: data # 'name' becomes an input to child
Via JSON (through MCP)
When called through the MCP server, inputs are provided as JSON:
{
  "tool": "greet-user",
  "name": "Alice"
}
Via command-line arguments
Command-line arguments passed to dyngle run are available as the runtime.args list:
dyngle run greet Alice Bob Charlie
Note that runtime.args is not available directly to sub-operations; values must be passed explicitly.
Variables
Variables are values set during operation execution. They're created in two ways:
Using the receive operator (=>)
Capture command stdout:
dyngle:
  operations:
    fetch-data:
      steps:
        - curl -s "https://api.example.com/users" => users # 'users' is a variable
        - 'echo "Fetched: {{users}}"'
Using receive: in sub-operations
Capture return values from sub-operations:
dyngle:
  operations:
    get-version:
      returns: ver
      steps:
        - cat package.json -> jq -r '.version' => ver
    tag-release:
      steps:
        - sub: get-version
          receive: version # 'version' is a variable
        - git tag "v{{version}}"
Variables have the highest precedence in the context (see below).
Constants and Expressions from Configuration
The context also includes values and expressions defined in your configuration:
Global
dyngle:
  constants:
    environment: production
  expressions:
    timestamp: "datetime.now()"
  operations:
    deploy:
      steps:
        - echo "Deploying to {{environment}} at {{timestamp}}"
Local (operation-specific)
dyngle:
  operations:
    deploy:
      constants:
        region: us-west-2
      expressions:
        build-time: "datetime.now()"
      steps:
        - echo "Deploying to {{region}} at {{build-time}}"
See Constants and expressions for details.
Context Key Precedence
When names overlap, Dyngle resolves them using this precedence (highest to lowest):
- Variables (populated via the => operator or receive:) within the operation
- Local expressions and constants (defined in the operation)
- Global expressions and constants (defined under dyngle:)
- Inputs (from stdin, send:, or MCP JSON)
(Expressions and constants together are also called "declarations".)
Example:
dyngle:
  constants:
    name: Global
  expressions:
    name: "'Expression'"
  operations:
    test:
      constants:
        name: Local
      steps:
        - 'echo "Start: {{name}}"' # "Local" (local constant wins)
        - echo "Override" => name
        - 'echo "After: {{name}}"' # "Override" (variable wins)
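The same resolution order can be modeled with Python's collections.ChainMap (a sketch of the documented precedence, not Dyngle's implementation):

```python
from collections import ChainMap

# Highest-precedence map first: variables, then local declarations,
# then global declarations, then inputs.
variables = {}                       # populated by => or receive:
local_decls = {"name": "Local"}      # operation-level constants/expressions
global_decls = {"name": "Global"}    # top-level constants/expressions
inputs = {"name": "Input"}           # stdin / send: / MCP JSON

context = ChainMap(variables, local_decls, global_decls, inputs)
print(context["name"])               # Local

variables["name"] = "Override"       # e.g. `echo "Override" => name`
print(context["name"])               # Override
```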
Real-time Evaluation
Expressions are evaluated in real-time based on the current state of the context. This means:
- When an expression references another value, it uses the current value at evaluation time
- Variables created during execution are immediately available to subsequent expressions
- Changes to the context during execution affect expression results
Example:
dyngle:
  operations:
    demo:
      expressions:
        message: "format('Count is {{count}}')"
      steps:
        - echo "5" => count
        - echo "{{message}}" # "Count is 5"
        - echo "10" => count
        - echo "{{message}}" # "Count is 10"
Accessing Context Values
Everything in the context can be referenced using context paths in these places:
In Templates
Use {{variable}} syntax in command steps:
- echo "Hello {{name}}!"
- 'echo "Temperature: {{weather.temperature}}"'
In Expressions with get()
Use the get() function to access context constants:
dyngle:
  expressions:
    full-greeting: "'Hello ' + get('name')"
    temp: get('weather.temperature')
In returns:
Specify which context value to return:
dyngle:
  operations:
    get-temperature:
      returns: temp
      steps:
        - curl -s "https://api.example.com/weather" => weather-data
        - weather-data -> jq -r '.temperature' => temp
The returns: key can reference:
- Variables (set via => or receive:)
- Constants (from constants:)
- Expressions (from expressions:)
- Inputs (from stdin, send:, or MCP)
Essentially, returns: can access anything in the operation context using the same key names.
Nested Object Properties
Access nested properties in dictionaries using context paths:
dyngle:
  operations:
    weather-report:
      steps:
        - curl -s "https://api.example.com/weather" => weather
        - 'echo "Temperature: {{weather.temperature}}"'
        - 'echo "Location: {{weather.location.city}}, {{weather.location.country}}"'
Working with Arrays
For arrays, use expressions to extract values:
dyngle:
  constants:
    users:
      - name: Alice
        email: alice@example.com
      - name: Bob
        email: bob@example.com
  operations:
    show-users:
      expressions:
        first-user: get('users')[0]
        first-name: get('users')[0]['name']
        all-names: "[u['name'] for u in get('users')]"
      steps:
        - 'echo "First user: {{first-name}}"'
Context Scope
Each operation maintains its own context. When using sub-operations:
- Parent and child contexts are isolated by default
- Use
send:to explicitly pass data to a child - Use
receive:to explicitly capture data from a child
See Sub-operations for details on context scope with sub-operations.
Example: Context Evolution
Here's how context evolves during execution:
dyngle:
  constants:
    base: Global constant
  expressions:
    computed: "'Global expression'"
  operations:
    demo:
      constants:
        local: Local constant
      expressions:
        derived: "'Derived: ' + get('local')"
      steps:
        - echo "{{base}}" # Global constant
        - echo "{{computed}}" # Global expression
        - echo "{{local}}" # Local constant
        - echo "{{derived}}" # Local expression
        - echo "Runtime" => dynamic # Creates variable
        - echo "{{dynamic}}" # Variable
Context at different points:
- Start: {base, computed, local, derived}
- After step 5: {base, computed, local, derived, dynamic}
Inputs and Interfaces
The accepts: attribute defines a schema for validating and defaulting operation inputs. This creates a clear contract for what data your operation expects.
Interface Definition
Define an interface using the accepts: key in your operation:
dyngle:
  operations:
    greet-user:
      accepts:
        name: { type: string }
        age: { type: integer }
      steps:
        - echo "Hello {{name}}, age {{age}}"
When inputs are provided (via stdin, send:, or MCP JSON), Dyngle validates them against this schema before executing the operation.
Basic Syntax
The interface syntax is loosely based on JSON Schema but with adaptations tailored for Dyngle's use cases. While it shares some conventions (like type names), it has its own behavior for required fields, defaults, and type inference.
Root Object
The interface always defines properties of a root object:
accepts:
  field1: { type: string }
  field2: { type: integer }
This expects input like:
field1: some text
field2: 42
Type System
Supported types:
- string - Text values
- integer - Whole numbers (no decimals)
- number - Any numeric value (integers and floats)
- boolean - True or false values
- array - Lists of items
- object - Nested structures with properties
Type Examples
dyngle:
  operations:
    demo:
      accepts:
        text: { type: string }
        count: { type: integer }
        price: { type: number }
        enabled: { type: boolean }
        tags:
          type: array
          items: { type: string }
        config:
          type: object
          properties:
            host: { type: string }
            port: { type: integer }
Type Inference
Types can be inferred automatically based on other attributes:
accepts:
  name: {} # Defaults to string
  message: # Also defaults to string (YAML null value)
  user: # Inferred as object from properties
    properties:
      email: { type: string }
  tags: # Inferred as array from items
    items: { type: string }
Simplified syntax: You can omit the field value entirely (YAML null) or use an empty dict {} for string fields with blank defaults. These are equivalent:
accepts:
  name: # YAML null - string with blank default
  title: {} # Empty dict - string with blank default
  email: { type: string } # Explicit string type
All three patterns above create optional string fields with blank string defaults.
Precedence for type inference:
- Explicit type: if declared
- object if properties: is present
- array if items: is present
- Otherwise string
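This precedence can be sketched as a small function (illustration only, not Dyngle's validator code):

```python
def infer_type(spec):
    """Infer a field's type using the documented precedence:
    explicit type, then properties -> object, then items -> array,
    otherwise string.
    """
    if not spec:              # YAML null or empty dict {}
        return "string"
    if "type" in spec:
        return spec["type"]
    if "properties" in spec:
        return "object"
    if "items" in spec:
        return "array"
    return "string"

print(infer_type(None))                          # string
print(infer_type({"properties": {"x": None}}))   # object
print(infer_type({"items": {"type": "string"}})) # array
```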
Required Fields and Defaults
Field requirements vary by type:
- String fields without explicit required or default get a blank string "" as the default (making them optional)
- Other types are required by default unless marked required: false or given a default
Examples
accepts:
  name: {} # String type, gets blank default "" if omitted
  nickname: { type: string, required: false } # Also optional, no default
  age: { type: integer } # Required (non-string type)
  email: { type: string, required: true } # Explicitly required
Default Values
Provide explicit default values for optional fields:
accepts:
  name: { type: string } # Gets blank string if omitted
  country:
    type: string
    default: "US" # Gets "US" if omitted
  port:
    type: integer
    default: 8080 # Gets 8080 if omitted
Having an explicit default makes the field optional automatically.
Nested Objects
Define nested structures with properties:. The object type is automatically inferred from the presence of properties:, so you don't need to declare it explicitly:
dyngle:
  operations:
    process-order:
      accepts:
        customer:
          properties: # object type inferred
            name: # string type inferred (YAML null)
            email:
        shipping:
          properties: # object type inferred
            address:
            city:
            zip:
      steps:
        - echo "Processing order for {{customer.name}}"
        - echo "Shipping to {{shipping.city}}"
You can nest objects arbitrarily deep:
accepts:
  order:
    properties:
      shipping:
        properties:
          address:
            properties: # deeply nested object
              street:
              city:
              state:
Arrays
Define array types with items:
accepts:
  tags:
    type: array
    items: { type: string }
  scores:
    items: { type: integer } # type: array inferred from items
Arrays of Objects
Combine arrays with nested objects. The array type is inferred from items:, and the object type for each item is inferred from properties::
accepts:
items:
items: # array type inferred
properties: # object type inferred for each item
product: # string type inferred
quantity: { type: integer }
price: { type: number }
Validation Process
When an operation with accepts: is invoked:
- Input is received (from stdin, send:, or MCP JSON)
- Schema validation - Input structure is checked against the interface
- Type validation - Values are checked for correct types
- Defaults applied - Missing optional fields get their defaults
- Required fields checked - Missing required fields cause an error
- Execution proceeds - Validated input becomes available in the operation context
Validation Success
printf "name: Alice\nage: 30\n" | dyngle run greet-user
# Output: Hello Alice, age 30
Validation Failure
printf "age: 30\n" | dyngle run greet-user
# Error: Input validation failed for operation 'greet-user':
# Field 'name' is required at root
Extra Fields
Extra fields are allowed by default:
printf "name: Bob\nage: 25\ncity: Seattle\n" | dyngle run greet-user
# Output: Hello Bob, age 25 (city is ignored)
Complete Example
This example demonstrates all the key features including type inference, nested objects, arrays, defaults, and required fields:
dyngle:
operations:
create-user:
description: Create a new user account
accepts:
username: # string with blank default (optional)
email: { type: string, required: true } # explicitly required
age: { type: integer } # required (non-string type)
role: { default: "user" } # string with custom default
preferences:
properties: # object type inferred
theme: { default: "light" }
notifications: { type: boolean, default: true }
tags:
items: # array type inferred
type: string
default: []
returns: result
steps:
- echo "Creating user {{username}} ({{email}})"
- echo "Role: {{role}}, Age: {{age}}"
- echo "Theme: {{preferences.theme}}"
- echo "User created successfully" => result
Using Interfaces with Sub-operations
When using send: to pass data to sub-operations, the data is validated against the child operation's accepts: schema. This example shows nested objects being passed via send::
dyngle:
operations:
process-user:
accepts:
user:
properties: # object type inferred
name: # string type inferred
email:
age: { type: integer }
steps:
- echo "Processing {{user.name}}, age {{user.age}}"
main:
constants:
user-data:
user:
name: Alice
email: alice@example.com
age: 30
steps:
- sub: process-user
send: user-data # Nested structure validated against accepts schema
If validation fails, the parent operation stops with an error.
Using Interfaces with MCP
When operations are exposed via the MCP server, the accepts: schema determines the tool's input parameters:
With accepts:
dyngle:
operations:
get-weather:
description: Get current weather for a city
accepts:
city: { type: string }
units:
type: string
default: "metric"
returns: weather-info
steps:
- curl -s "https://api.example.com/weather?city={{city}}&units={{units}}" => weather-info
The MCP tool will have city and units as input parameters, with validation and defaults applied automatically.
Without accepts:
dyngle:
operations:
run-backup:
description: Run the nightly backup process
returns: result
steps:
- /usr/local/bin/backup.sh => result
The MCP tool will have no input parameters.
See MCP Server for more details.
Best Practices
Use accepts for Public Operations
Operations exposed via MCP or called as sub-operations should define their interfaces. Use the simplified syntax for cleaner definitions:
dyngle:
operations:
deploy-service:
description: Deploy a service to an environment
accepts:
service-name: # string type inferred
environment:
version:
steps:
- echo "Deploying {{service-name}} v{{version}} to {{environment}}"
Provide Sensible Defaults
Use defaults for optional parameters to make operations easier to use:
accepts:
environment:
type: string
default: "development"
verbose:
type: boolean
default: false
Document with Field Descriptions
Use description: on individual fields to document their purpose. Field descriptions are passed through to MCP tools and other dynamic contexts:
dyngle:
operations:
process-data:
description: Process data file with optional validation
accepts:
filename:
type: string
description: Path to the data file to process
format:
type: string
default: "json"
description: Data format (json or csv)
validate:
type: boolean
default: true
description: Whether to validate data before processing
Field descriptions are especially important for operations exposed via MCP or used as sub-operations, as they help users understand what each parameter does.
Next Steps
Constants, Expressions, and Templates
Constants, expressions, and templates allow you to define reusable values in your configuration. Constants are static values, expressions are dynamically evaluated using Python, and templates provide convenient string formatting.
Constants
Define constants using the constants: key. Constants are static values that don't change during execution.
Global Constants
Defined under dyngle: and available to all operations:
dyngle:
constants:
environment: production
region: us-west-2
api-url: https://api.example.com
operations:
deploy:
steps:
- echo "Deploying to {{environment}} in {{region}}"
Local Constants
Defined within a specific operation:
dyngle:
operations:
greet:
constants:
greeting: Hello
name: World
steps:
- echo "{{greeting}}, {{name}}!"
Local constants override global constants with the same name.
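A minimal sketch of the override behavior (names are illustrative):

```yaml
dyngle:
  constants:
    name: Global
  operations:
    greet:
      constants:
        name: Local            # shadows the global constant of the same name
      steps:
        - echo "Hello, {{name}}!"   # prints "Hello, Local!"
```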
Expressions
Expressions are Python code snippets that compute dynamic values. They're evaluated at runtime by the Python interpreter, with a controlled set of available functions and variables specific to Dyngle.
Basic Usage
Define expressions that evaluate to values:
dyngle:
operations:
greet:
expressions:
greeting: "'Hello ' + name + '!'"
steps:
- echo "{{greeting}}"
Run it:
echo "name: Alice" | dyngle run greet
Output:
Hello Alice!
Expressions vs Constants
An expression is like a constant that is evaluated dynamically:
dyngle:
constants:
static-time: "2024-01-01" # Always the same
expressions:
current-time: "datetime.now()" # Evaluated each time
Global Expressions
Defined under dyngle: and available to all operations:
dyngle:
expressions:
timestamp: "datetime.now()"
author: "'Francis Potter'"
operations:
log:
steps:
- echo "[{{timestamp}}] Log by {{author}}"
Local Expressions
Defined within a specific operation:
dyngle:
operations:
say-hello:
expressions:
count: "len(name)"
steps:
- echo "Hello {{name}}! Your name has {{count}} characters."
Local expressions override global expressions with the same name.
Templates
Templates provide a convenient way to define string templates without explicitly wrapping them in format() calls. They work like expressions but automatically apply template formatting to string values.
Basic Usage
Define templates that automatically format strings:
dyngle:
constants:
name: Alice
templates:
greeting: "Hello {{name}}!"
operations:
greet:
steps:
- echo "{{greeting}}"
This is equivalent to:
dyngle:
expressions:
greeting: "format('Hello {{name}}!')"
Templates vs Expressions
Templates are ideal when you primarily need string formatting:
dyngle:
constants:
cmd: "ls"
output: "result.txt"
templates:
# Automatically wrapped in format()
full-command: "{{cmd}} --output {{output}}"
expressions:
# Requires explicit format() call
full-command-expr: "format('{{cmd}} --output {{output}}')"
Both produce the same result, but templates are more concise for string formatting.
Global Templates
Defined under dyngle: and available to all operations:
dyngle:
constants:
app-name: MyApp
version: 1.0
templates:
banner: "=== {{app-name}} v{{version}} ==="
operations:
start:
steps:
- echo "{{banner}}"
Local Templates
Defined within a specific operation:
dyngle:
constants:
user: admin
operations:
login:
templates:
message: "Logging in as {{user}}..."
steps:
- echo "{{message}}"
Local templates override global templates with the same name.
Template Structures
Templates support complex data structures with automatic formatting:
Dictionary structures:
dyngle:
constants:
host: example.com
port: 8080
templates:
config:
server: "{{host}}"
url: "http://{{host}}:{{port}}"
operations:
show-config:
expressions:
server-info: "get('config')['server']"
steps:
- echo "{{server-info}}"
List structures:
dyngle:
constants:
env: production
templates:
commands:
- "echo Starting {{env}}"
- "echo Deploying to {{env}}"
operations:
deploy:
expressions:
first-cmd: "get('commands')[0]"
steps:
- "{{first-cmd}}"
Nested structures:
dyngle:
constants:
app: MyApp
version: 2.0
templates:
metadata:
info:
name: "{{app}}"
version: "{{version}}"
tags:
- "{{app}}-{{version}}"
- "latest"
Non-String Values
Non-string primitive values (integers, floats, booleans) in templates are treated as constants:
dyngle:
templates:
config:
name: "MyApp" # String - formatted
port: 8080 # Integer - constant
enabled: true # Boolean - constant
timeout: 30.5 # Float - constant
This allows you to mix formatted strings with static values in the same structure.
Template Syntax
Constants, expressions, and templates can be referenced in your configuration using template syntax. Dyngle supports two template syntaxes:
Double-Curly Syntax
The traditional syntax uses double curly braces:
dyngle:
constants:
name: Alice
server:
host: example.com
operations:
greet:
steps:
- echo "Hello {{name}}"
- echo "Server: {{server.host}}"
This syntax works everywhere and supports inline replacements within strings.
Dollar Syntax
An alternative syntax uses the dollar sign prefix:
dyngle:
constants:
name: Alice
server:
host: example.com
operations:
greet:
steps:
- echo $name
- echo $server.host
Important constraints:
- The $variable syntax only works when it's a complete word (after shell-style word splitting)
- It does not work for inline replacements: echo "Hello$name" will not be replaced
- It does work in quotes when it's a separate word: echo "$name" works
- It supports the same features as {{...}}: nested properties ($server.host), hyphens ($first-name), and numeric indices ($runtime.args.0)
When to use dollar syntax:
The dollar syntax is useful to avoid YAML parsing issues with curly braces in certain contexts. Both syntaxes can be used together in the same configuration:
dyngle:
constants:
greeting: Hello
name: World
operations:
greet:
steps:
- echo "$greeting, {{name}}!" # Both work together
Available Functions and Names
Expressions evaluate in a context that includes a subset of Python's standard features plus some Dyngle-specific functions. This controlled environment ensures expressions are powerful yet predictable.
Referencing Context Values
Values from the operation context can be referenced directly as Python variables:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + name"
steps:
- echo "{{message}}"
Hyphenated Names
YAML keys can contain hyphens. To reference them in expressions:
Option 1: Replace hyphens with underscores:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + first_name" # References 'first-name'
steps:
- echo "{{message}}"
Option 2: Use the get() function:
dyngle:
operations:
greet:
expressions:
message: "'Hello ' + get('first-name')"
steps:
- echo "{{message}}"
Dyngle-Specific Functions
get()
Retrieve values from the operation context by name:
dyngle:
expressions:
full-greeting: "'Hello ' + get('first-name') + ' ' + get('last-name')"
The get() function can also reference other expressions:
dyngle:
expressions:
greeting: "'Hello'"
full-greeting: "get('greeting') + ' ' + name"
format()
Render a template string using the current operation context:
dyngle:
values:
first-name: Alice
last-name: Smith
operations:
greet:
expressions:
full-greeting: "format('Hello, {{first-name}} {{last-name}}!')"
steps:
- echo "{{full-greeting}}"
The format() function supports all template syntax, including nested properties:
dyngle:
operations:
weather-report:
expressions:
report: "format('Temperature in {{location.city}} is {{weather.temperature}} degrees')"
steps:
- echo "{{report}}"
dtformat()
Format datetime objects as strings:
dyngle:
expressions:
now: "datetime.now()"
timestamp: "dtformat(get('now'), '%Y-%m-%d %H:%M:%S')"
operations:
log:
steps:
- echo "[{{timestamp}}] Event occurred"
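The format string uses Python strftime codes. A plain-Python sketch of the same formatting, assuming dtformat() applies strftime-style codes:

```python
from datetime import datetime

# Same format codes as the dtformat() call above
now = datetime.now()
timestamp = now.strftime('%Y-%m-%d %H:%M:%S')
print(timestamp)  # e.g. 2024-06-01 09:30:00
```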
PurePath()
Work with operating system paths:
dyngle:
operations:
git-dir:
expressions:
result: PurePath(cwd) / '.git'
steps:
- pwd => cwd
returns: result
Note that command steps can handle I/O while Python expressions manipulate paths and strings:
dyngle:
operations:
show-tests:
expressions:
test-dir: PurePath(cwd) / 'test'
test-files: '[f.strip() for f in test_files_text.split("\n")]'
result:
test-dir: get('test-dir')
test-files: get('test-files')
steps:
- pwd => cwd
- ls -1 {{test-dir}} => test-files-text
returns: result
Runtime Declarations
Runtime declarations are automatically provided values that reflect runtime execution context. Unlike constants and expressions defined in your configuration, these are set by the Dyngle runtime based on how the operation was invoked.
runtime.args
Command-line arguments passed to dyngle run are available under runtime.args:
dyngle:
operations:
greet:
expressions:
name: "get('runtime.args.0') or 'World'"
steps:
- echo "Hello {{name}}!"
Run with:
dyngle run greet Alice
Output:
Hello Alice!
Important notes:
- runtime.args is only available in the run command, not in MCP operations
- Args are not automatically passed to sub-operations; use send: to pass them explicitly
- Access via get('runtime.args.N') for safe access with defaults
- Access via runtime.args.N in templates when you know the arg exists
The runtime namespace is reserved for future runtime-provided values (environment info, execution metadata, etc.).
Environment and System Functions
getenv()
Access environment variables:
dyngle:
operations:
show-env:
expressions:
home-dir: "getenv('HOME', '/default/path')"
api-key: "getenv('API_KEY') or 'not-set'"
steps:
- 'echo "Home directory: {{home-dir}}"'
- 'echo "API Key: {{api-key}}"'
getcwd()
Get the current working directory:
dyngle:
operations:
show-cwd:
expressions:
current-dir: "getcwd()"
parent-dir: "str(PurePath(getcwd()).parent)"
steps:
- 'echo "Working in: {{current-dir}}"'
- 'echo "Parent: {{parent-dir}}"'
Data Serialization Functions
to_json()
Convert Python data structures to JSON strings:
dyngle:
operations:
create-config:
expressions:
config-data:
server: "format('{{host}}')"
port: "int(get('port'))"
enabled: "True"
json-output: "to_json(get('config-data'))"
steps:
- echo "{{json-output}}"
from_json()
Parse JSON strings into Python data structures:
dyngle:
operations:
parse-json:
expressions:
parsed: "from_json(json_string)"
server-host: "get('parsed')['server']"
steps:
- 'echo "Server: {{server-host}}"'
to_yaml()
Convert Python data structures to YAML strings:
dyngle:
operations:
create-yaml:
expressions:
config-data:
database: "format('{{db-name}}')"
timeout: "30"
yaml-output: "to_yaml(get('config-data'))"
steps:
- echo "{{yaml-output}}"
from_yaml()
Parse YAML strings into Python data structures:
dyngle:
operations:
parse-yaml:
expressions:
parsed: "from_yaml(yaml_string)"
db-name: "get('parsed')['database']"
steps:
- 'echo "Database: {{db-name}}"'
Python Features
Expressions support a subset of Python's features:
Built-in Types and Functions
str(),int(),float(),bool(),len(), etc.
Standard Library Modules
- datetime - Date and time operations (datetime.now(), datetime.date(), etc.)
- math - Mathematical functions (math.pi, math.sqrt(), etc.)
- re - Regular expression operations (re.match(), re.search(), re.sub(), etc.)
- PurePath() - Path manipulation operations (no file I/O)
- json - JSON serialization via to_json() and from_json() functions
- yaml - YAML serialization via to_yaml() and from_yaml() functions
- os - Environment and system operations via getenv() and getcwd() functions
Data Structures
- Lists, dictionaries, tuples
- List comprehensions
- Dictionary comprehensions
Operators
- Arithmetic: +, -, *, /, //, %, **
- Comparison: ==, !=, <, >, <=, >=
- Logical: and, or, not
- String: concatenation, formatting
Expression Examples
String Manipulation
dyngle:
expressions:
uppercase-name: "name.upper()"
initials: "'.'.join([word[0] for word in name.split()])"
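Since expressions are plain Python, the two snippets above can be checked outside Dyngle (the name value here is a stand-in for the context value):

```python
# Evaluating the two expressions above as plain Python,
# with a stand-in context value for name:
name = "Ada Lovelace"

uppercase_name = name.upper()                            # "ADA LOVELACE"
initials = '.'.join([word[0] for word in name.split()])  # "A.L"

print(uppercase_name, initials)
```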
Mathematical Operations
dyngle:
expressions:
circle-area: "math.pi * radius ** 2"
rounded: "round(get('circle-area'), 2)"
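The same expressions as plain Python, with a stand-in radius value:

```python
import math

# The two expressions above, evaluated with radius = 2:
radius = 2
circle_area = math.pi * radius ** 2  # 12.566...
rounded = round(circle_area, 2)

print(rounded)  # 12.57
```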
Date and Time
dyngle:
expressions:
now: "datetime.now()"
today: "get('now').date()"
formatted-date: "dtformat(get('now'), '%B %d, %Y')"
List Operations
dyngle:
values:
numbers: [1, 2, 3, 4, 5]
expressions:
doubled: "[n * 2 for n in get('numbers')]"
sum-numbers: "sum(get('numbers'))"
max-number: "max(get('numbers'))"
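The three list expressions above behave like this plain Python:

```python
# Stand-in for the numbers value from the configuration above
numbers = [1, 2, 3, 4, 5]

doubled = [n * 2 for n in numbers]  # [2, 4, 6, 8, 10]
total = sum(numbers)                # 15
largest = max(numbers)              # 5

print(doubled, total, largest)
```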
Conditional Logic
dyngle:
expressions:
environment: "get('env') if get('env') else 'development'"
log-level: "'DEBUG' if get('environment') == 'development' else 'INFO'"
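The fallback pattern above as plain Python; env stands in for get('env'):

```python
# With no env provided, the fallback kicks in:
env = None
environment = env if env else 'development'
log_level = 'DEBUG' if environment == 'development' else 'INFO'

print(environment, log_level)  # development DEBUG
```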
Nested Structure Syntax
Constants and expressions can contain YAML structures. This allows defining hierarchical data that can be referenced using context paths.
For Constants
Use nested YAML structures directly:
dyngle:
constants:
server:
host: api.example.com
port: 443
ssl: true
database:
name: mydb
connection:
- host: db.example.com
- port: 5432
operations:
connect:
steps:
- echo "Connecting to {{server.host}}:{{server.port}}"
- 'echo "Database: {{database.name}}"'
For Expressions
Strings within YAML structures are each evaluated as separate expressions:
dyngle:
expressions:
configure:
server:
host: "format('{{server-host}}')"
port: "int(get('server-port'))"
database:
name: "format('{{db-name}}')"
connection:
- "format('{{db-host}}')"
- "int(get('db-port'))"
Important Notes
- Each string in an expression structure is evaluated as Python code
- Numbers, booleans, and None pass through unchanged in both constants and expressions
- For string literals in expressions, use Python string syntax:
"'literal string'" - Access nested properties using context paths:
{{config.server.host}} - Access array elements in expressions using Python brackets:
get('coordinates')[get('location-index')]
Mixed Example
Combining constants and expressions with nested structures:
dyngle:
constants:
defaults:
timeout: 30
retries: 3
expressions:
runtime:
timestamp: "datetime.now()"
timeout: "get('defaults.timeout') * 2"
config:
retry-count: "get('defaults.retries')"
enabled: "True"
operations:
process:
steps:
- 'echo "Timeout: {{runtime.timeout}}"'
- 'echo "Retries: {{runtime.config.retry-count}}"'
Context paths
Both constants and expressions with nested structures can be accessed using context paths in:
- Templates: {{config.server.host}}
- Expressions: get('config.server.host'), or via variables if hyphen-free
- The returns: key: config.server.host
dyngle:
constants:
api:
endpoint: https://api.example.com
version: v1
operations:
call-api:
returns: api.endpoint
expressions:
full-url: "get('api.endpoint') + '/' + get('api.version')"
steps:
- echo "Calling {{full-url}}"
Next Steps
Sub-operations
Operations can call other operations as steps, enabling composability and code reuse. Sub-operation steps are the second type of operation step (alongside command steps).
Basic Usage
Use the sub: key to call another operation:
dyngle:
operations:
greet:
- echo "Hello!"
greet-twice:
steps:
- sub: greet
- sub: greet
Passing Data with send:
To pass data to a sub-operation, use the send: attribute:
dyngle:
operations:
greet-person:
steps:
- echo "Hello, {{name}}!"
main:
constants:
user:
name: Alice
steps:
- sub: greet-person
send: user # Pass data to child
The data's keys and values become inputs in the sub-operation's context.
Template Structures in send:
The send: attribute can accept template structures, allowing you to construct data inline:
dyngle:
operations:
deploy:
steps:
- echo "Deploying {{username}} to {{environment}}"
main:
constants:
user: alice
env: production
steps:
- sub: deploy
send:
username: $user
environment: $env
timestamp: "{{runtime.timestamp}}"
You can use either $variable or "{{variable}}" syntax for template references.
This is equivalent to defining a constant and passing it, but more concise:
# Equivalent to:
main:
constants:
user: alice
env: production
deploy-data:
username: "{{user}}"
environment: "{{env}}"
timestamp: "{{runtime.timestamp}}"
steps:
- sub: deploy
send: deploy-data
Template structures support:
- Nested properties: Reference values like {{config.host}}
- Non-string constants: Include integers, booleans, and floats directly
- Nested structures: Create complex nested dictionaries
Example with nested structure:
dyngle:
operations:
configure:
steps:
- echo "Server: {{server.host}}:{{server.port}}"
main:
constants:
host: example.com
port: 8080
steps:
- sub: configure
send:
server:
host: "{{host}}"
port: "{{port}}"
enabled: true
Capturing Results with receive:
When a sub-operation has a returns: key, capture its value with receive::
dyngle:
operations:
get-version:
returns: ver
steps:
- cat package.json -> jq -r '.version' => ver
tag-release:
steps:
- sub: get-version
receive: version # Capture return value
- git tag "v{{version}}"
If the sub-operation has no returns: key, receive: stores None.
send: and receive: Together
Combine both to create function-like operations:
dyngle:
operations:
double:
accepts:
num: { type: integer }
returns: result
expressions:
result: "num * 2"
steps:
- echo "Doubling {{num}}"
main:
constants:
params:
num: 5
steps:
- sub: double
send: params # Send input
receive: doubled # Capture output
- echo "Result: {{doubled}}"
Similarity to Command Steps
Note the similarity between the send: and receive: attributes in sub-operation steps and the -> (send) and => (receive) operators in command steps:
Command step:
- input-data -> command => output-data
Sub-operation step:
- sub: operation-name
send: input-data
receive: output-data
Both follow the same pattern: data flows in via send/->, gets processed, and flows out via receive/=>.
Input Validation with accepts:
Define what data an operation accepts using accepts::
dyngle:
operations:
process-user:
accepts:
user-id: { type: string }
email: { type: string }
steps:
- echo "Processing {{user-id}}: {{email}}"
main:
constants:
user-data:
user-id: "12345"
email: user@example.com
steps:
- sub: process-user
send: user-data # Validated before execution
If the sent data doesn't match the accepts: schema, the operation fails with a clear error message. See Inputs and interfaces for complete details.
Operation Context Scope
Sub-operations are isolated by default - they do not automatically see the parent operation's context. This isolation makes operations predictable and testable - they behave like pure functions.
Isolation by Default
dyngle:
operations:
child:
steps:
- echo "{{parent-val}}" # ERROR: parent-val not found
parent:
steps:
- echo "secret" => parent-val
- sub: child # child cannot see parent-val
To share data between parent and child, use explicit send: and receive: attributes.
Constants and Expressions
Both declared constants (constants:/expressions:) and variables (=> assignments) are operation-local due to isolation:
Constants and expressions - Each operation sees only its own declarations plus globals:
dyngle:
constants:
global-val: Available to all
operations:
child:
constants:
local-val: Child only
steps:
- echo "{{global-val}}" # OK - global
- echo "{{local-val}}" # OK - local to child
parent:
constants:
parent-val: Parent only
steps:
- echo "{{global-val}}" # OK - global
- echo "{{parent-val}}" # OK - local to parent
- sub: child
# child cannot see parent-val
Variables - Each operation maintains its own variables; => assignments don't cross boundaries:
dyngle:
operations:
child:
steps:
- echo "child-result" => data
- echo "Child data: {{data}}"
parent:
steps:
- echo "parent-result" => data
- echo "Parent data: {{data}}" # "parent-result"
- sub: child
- echo "After child: {{data}}" # Still "parent-result" (child's data isolated)
Complete Isolation Example
dyngle:
constants:
declared-val: global
operations:
child:
constants:
declared-val: child-local
steps:
- echo "{{declared-val}}" # "child-local" (own declaration)
- echo "result" => live-data
parent:
steps:
- echo "{{declared-val}}" # "global" (no local override)
- echo "parent" => live-data
- sub: child
- echo "{{declared-val}}" # Still "global"
- echo "{{live-data}}" # Still "parent" (child's data isolated)
Data Sharing
Use send: and receive: for explicit parent-child data flow:
dyngle:
operations:
child:
returns: result
steps:
- echo "Processing {{input-value}}"
- echo "done" => result
parent:
constants:
data:
input-value: hello
steps:
- sub: child
send: data # Explicitly share data
receive: output # Explicitly capture result
- echo "Got: {{output}}"
Use Cases
Build Pipeline
dyngle:
operations:
install-deps:
- npm install
compile:
- npm run build
test:
- npm test
build:
description: Full build pipeline
steps:
- sub: install-deps
- sub: compile
- sub: test
Reusable Components with Private Operations
dyngle:
operations:
setup-env:
access: private
steps:
- echo "Setting up environment..."
- export NODE_ENV=production
deploy-frontend:
description: Deploy frontend application
steps:
- sub: setup-env
- npm run deploy:frontend
deploy-backend:
description: Deploy backend services
steps:
- sub: setup-env
- npm run deploy:backend
Data Processing Pipeline
dyngle:
operations:
fetch-data:
returns: raw
steps:
- curl -s "https://api.example.com/data" => raw
transform-data:
accepts:
input: { type: string }
returns: output
steps:
- input -> jq '.items' => output
process-all:
returns: final
steps:
- sub: fetch-data
receive: data
- sub: transform-data
send: payload
receive: final
constants:
payload:
input: "{{data}}"
Helper Operations for Secrets
Prevent accidental exposure of operations that handle secrets:
dyngle:
operations:
get-api-token:
access: private
returns: token
steps:
- aws secretsmanager get-secret-value --secret-id api-token => secret
- secret -> jq -r '.SecretString' => token
call-api:
description: Make authenticated API call
steps:
- sub: get-api-token
receive: token
- curl -H "Authorization: Bearer {{token}}" https://api.example.com/data
This prevents running dyngle run get-api-token accidentally or exposing it through the MCP server.
Multi-Step Workflows with Composition
Build complex workflows from smaller private operations:
dyngle:
operations:
    install-dependencies:
      access: private
      steps:
        - npm install
    run-tests:
      access: private
      steps:
        - npm test
    build-artifacts:
      access: private
      steps:
        - npm run build
    upload-artifacts:
      access: private
      steps:
        - aws s3 sync ./dist s3://my-bucket/
ci-pipeline:
description: Run full CI/CD pipeline
steps:
- sub: install-dependencies
- sub: run-tests
- sub: build-artifacts
- sub: upload-artifacts
Users only see and can run ci-pipeline, not the internal helpers.
Best Practices
Use accepts: for Clear Contracts
Define what data your operations need:
dyngle:
operations:
deploy-service:
accepts:
service-name: { type: string }
version: { type: string }
environment: { type: string }
steps:
- echo "Deploying {{service-name}} v{{version}} to {{environment}}"
This serves as self-documentation and catches errors early.
Explicit is Better Than Implicit
Always use send: and receive: for data flow:
Good:
dyngle:
operations:
get-version:
returns: version
steps:
- cat package.json -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
receive: ver
- git tag "v{{ver}}"
Avoid (this won't work due to isolation):
dyngle:
operations:
get-version:
steps:
- cat package.json -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
- git tag "v{{version}}" # ERROR: version not found
Use Private Operations for Helpers
Mark helper operations as private to prevent direct execution:
dyngle:
operations:
deploy:
description: Deploy the application
steps:
- sub: validate
- sub: build
- sub: upload
validate:
access: private
steps:
- echo "Validating configuration..."
build:
access: private
steps:
- npm run build
upload:
access: private
steps:
- aws s3 sync ./dist s3://my-bucket/
Make Public Operations User-Facing
Public operations should represent complete, user-facing actions, while private operations are focused, reusable components.
Next Steps
Output Modes
Operations behave differently depending on whether they have a returns: key. This determines how output is handled and displayed.
The Two Modes
Script Mode (without returns:)
Operations without returns: behave like shell scripts - all command stdout goes to stdout:
dyngle:
operations:
build:
- echo "Starting build..."
- npm install
- npm run build
- echo "Build complete!"
All output is visible, making these ideal for build, deploy, and other workflow tasks where you want to see progress.
Function Mode (with returns:)
Operations with returns: behave like functions - command stdout is suppressed, and only the return value is provided as output:
dyngle:
operations:
get-temperature:
returns: temp
steps:
- echo "Fetching weather..." # stdout suppressed
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -r '.temperature' => temp
Only the return value is output, making these ideal for data queries and transformations.
Important:
- stderr is always displayed in both modes
- The => operator works in both modes (capturing stdout to a variable)
Output Destinations
Where the output goes depends on how the operation is invoked.
In the run Command
When running an operation with dyngle run, output goes to stdout.
Script mode - All command stdout is displayed:
dyngle run build
Output:
Starting build...
[npm install output...]
[npm run build output...]
Build complete!
Function mode - Only the return value is displayed:
dyngle run get-temperature
Output:
72
Output formatting depends on the return value type:
- Simple types (strings, numbers, booleans) - Printed as-is
- Dictionaries and lists - Formatted as YAML
Example with structured data:
dyngle:
operations:
get-user:
returns: user
steps:
- curl -s "https://api.example.com/user/123" => user
dyngle run get-user
Output:
name: Alice Smith
email: alice@example.com
role: admin
In Sub-operations
When an operation is called as a sub-operation, its return value is captured by the parent's receive: attribute:
dyngle:
operations:
get-version:
returns: version
steps:
- cat package.json -> jq -r '.version' => version
tag-release:
steps:
- sub: get-version
receive: ver
- git tag "v{{ver}}"
- git push origin "v{{ver}}"
The return value becomes a variable in the parent operation's context.
If no returns: is specified, receive: captures None.
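A minimal sketch of that case (operation names are illustrative):

```yaml
dyngle:
  operations:
    do-work:
      steps:
        - echo "side effects only"   # no returns: key
    main:
      steps:
        - sub: do-work
          receive: result            # result is None
```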
In the MCP Server
When operations are exposed via the MCP server, they return JSON responses.
Success with return value:
dyngle:
operations:
get-weather:
description: Get current weather for a city
returns: weather-info
accepts:
city: { type: string }
steps:
- curl -s "https://api.example.com/weather?city={{city}}" => weather-info
Response:
{
"result": {
"temperature": 72,
"conditions": "Sunny",
"humidity": 65
}
}
Success without return value:
dyngle:
operations:
run-backup:
description: Run the nightly backup process
steps:
- /usr/local/bin/backup.sh
Response:
{
"result": null
}
Error:
{
"error": "Operation failed: command not found"
}
This makes operations with return values particularly useful as AI assistant tools - they can return structured data that assistants can incorporate into responses.
Specifying Return Values
Use the returns: key to specify what value to return:
dyngle:
operations:
get-temperature:
returns: temp
steps:
- curl -s "https://api.example.com/weather" => weather-data
- weather-data -> jq -r '.temperature' => temp
The returns: key can reference anything in the operation context using context paths:
- Variables (set via => or receive:)
- Constants (from constants:)
- Expressions (from expressions:)
- Inputs (from stdin, send:, or MCP)
- Nested properties (using dot-separated context paths)
Examples:
dyngle:
operations:
# Return a variable
fetch-data:
returns: data
steps:
- curl -s "https://api.example.com" => data
# Return a constant
get-env:
returns: environment
constants:
environment: production
# Return an expression
get-timestamp:
returns: timestamp
expressions:
timestamp: "dtformat(datetime.now(), '%Y-%m-%d %H:%M:%S')"
# Return nested property
get-host:
returns: config.server.host
constants:
config:
server:
host: api.example.com
port: 443
Template Structures in returns:
The returns: key can also accept template structures, allowing you to construct return values inline:
dyngle:
operations:
get-build-info:
returns:
version: $ver
commit: $sha
timestamp: "{{runtime.timestamp}}"
steps:
- cat version.txt => ver
- git rev-parse HEAD => sha
You can use either $variable or "{{variable}}" syntax for template references.
This returns a dictionary with the specified structure. Template structures support:
- String templates: Use {{...}} syntax to reference context values
- Non-string constants: Include integers, booleans, and floats directly
- Nested structures: Create complex nested dictionaries and lists
- Lists: Return arrays of values
Example with nested structure:
dyngle:
operations:
get-config:
returns:
app:
name: $app-name
version: $version
server:
host: $host
port: 8080
enabled: true
constants:
app-name: MyApp
version: 1.0.0
host: example.com
steps:
- echo "Gathering configuration..."
Example with list:
dyngle:
operations:
get-tags:
returns:
- "{{app}}-{{version}}"
- "{{app}}-latest"
- "production"
constants:
app: myapp
version: 2.0
steps:
- echo "Building tag list..."
This is equivalent to using expressions but more concise for structured data:
# Equivalent to:
get-build-info:
expressions:
result:
version: "{{ver}}"
commit: "{{sha}}"
timestamp: "{{runtime.timestamp}}"
returns: result
steps:
- cat version.txt => ver
- git rev-parse HEAD => sha
Display Control
The --display option controls whether step commands are shown before execution, independently of output mode:
dyngle run build --display none
This suppresses step display but doesn't affect the mode (script vs function) or stdout handling.
See CLI Commands for details on the --display option.
Examples
Simple String Return
dyngle:
operations:
get-version:
returns: version
steps:
- cat package.json -> jq -r '.version' => version
dyngle run get-version
Output:
1.2.3
Structured Data Return
dyngle:
operations:
system-info:
returns: info
expressions:
info:
hostname: "get('runtime.args.0') or 'localhost'"
timestamp: "datetime.now()"
user: "'admin'"
dyngle run system-info myserver
Output:
hostname: myserver
timestamp: 2024-12-14 22:00:00
user: admin
Computed Return Value
dyngle:
operations:
calculate-total:
returns: total
steps:
- curl -s "https://api.example.com/items" => items
- items -> jq '[.[] | .price] | add' => total
Workflow with No Return
dyngle:
operations:
deploy:
description: Deploy to production
steps:
- echo "Starting deployment..."
- sub: build
- aws s3 sync ./dist s3://my-bucket/
- echo "Deployment complete!"
All output is visible during execution.
Best Practices
Use Function Mode for Data Operations
Operations that query or transform data should use returns::
dyngle:
operations:
get-status:
description: Check deployment status
returns: status
steps:
- curl -s "https://api.example.com/status" => status
Use Script Mode for Workflows
Operations that perform tasks should omit returns::
dyngle:
operations:
deploy:
description: Deploy application
steps:
- sub: build
- sub: test
- sub: upload
- echo "Deployment complete!"
Return Structured Data for MCP
Operations exposed via MCP should return meaningful structured data:
dyngle:
operations:
check-health:
description: Check application health
returns: health
steps:
- curl -s "https://api.example.com/health" => health
This allows AI assistants to understand and use the returned information.
Next Steps
MCP Server
Dyngle can run as an MCP (Model Context Protocol) server, exposing operations as tools that AI assistants like Claude can execute.
What is MCP?
The Model Context Protocol (MCP) is a standardized protocol that allows AI assistants to discover and use external tools. When Dyngle runs as an MCP server, your configured operations become tools that AI assistants can call to perform tasks.
Starting the Server
Use the mcp command with the --config option to specify a configuration file:
dyngle --config ./.dyngle.mcp.yml mcp
This starts a server using the stdio (standard input/output) transport, which is ideal for integration with Claude Desktop and other AI assistants that support local MCP servers.
Recommended convention: Use .dyngle.mcp.yml as your MCP-specific configuration file to separate MCP-exposed operations from your regular workflow operations.
Filtering Operations
Use the --operations option to selectively expose only specific operations as MCP tools:
dyngle --config ./.dyngle.mcp.yml mcp --operations op1,op2,op3
This is useful when:
- You have many operations but only want to expose a subset via MCP
- You want different MCP server instances exposing different operations
- You need to restrict which operations are available to AI assistants
Example: Expose only read-only operations:
dyngle --config ./.dyngle.mcp.yml mcp --operations get-status,list-items,read-log
The option accepts a comma-separated list of operation keys. Whitespace around commas is ignored. If any operation key is not found in the configuration, the command will fail with an error before starting the server.
If --operations is not specified, all public operations are exposed (the default behavior).
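The comma-and-whitespace handling described above can be sketched in plain Python (an illustration of the documented behavior, not Dyngle's actual implementation):

```python
# Split a comma-separated --operations value, ignoring whitespace
# around commas and dropping empty entries.
def parse_operation_keys(value):
    return [key.strip() for key in value.split(",") if key.strip()]
```

For example, `parse_operation_keys("get-status, list-items ,read-log")` yields `["get-status", "list-items", "read-log"]`.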
Passing Arguments to Operations
You can pass positional arguments when starting the MCP server. These arguments are available to all operations throughout the session:
dyngle --config ./.dyngle.mcp.yml mcp arg1 arg2 arg3
Operations can access these arguments via the args list in expressions:
dyngle:
expressions:
first_arg: "args[0]"
second_arg: "args[1]"
operations:
process-with-args:
description: Process data using session arguments
accepts:
input: { type: string }
returns: result
steps:
- echo "Processing {{input}} with {{first_arg}} and {{second_arg}}" => result
Use Cases
Environment-specific configuration:
# Development
dyngle --config ./.dyngle.mcp.yml mcp dev localhost:3000
# Production
dyngle --config ./.dyngle.mcp.yml mcp prod api.example.com
Shared credentials or tokens:
dyngle --config ./.dyngle.mcp.yml mcp $API_TOKEN
Claude Desktop Configuration with Arguments
{
"mcpServers": {
"myapp-dev": {
"command": "dyngle",
"args": ["--config", "/path/to/.dyngle.mcp.yml", "mcp", "dev", "localhost:3000"]
},
"myapp-prod": {
"command": "dyngle",
"args": ["--config", "/path/to/.dyngle.mcp.yml", "mcp", "prod", "api.example.com"]
}
}
}
This allows you to run multiple instances of the same MCP server with different configurations.
How Operations Become Tools
When the MCP server starts:
- Each public operation becomes an MCP tool
- Private operations are not exposed
- Tool input parameters depend on the operation's accepts: definition:
  - With accepts: the accepts: fields become the tool's input parameters
  - Without accepts: the tool has no input parameters
Tool Response Format
Tools return JSON responses:
Success:
{"result": <value>}
Where <value> is the operation's return value (if specified), or null if no return value.
Failure:
{"error": "<message>"}
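A client consuming these responses might unwrap them like this (a hypothetical helper, assuming only the {"result": ...} and {"error": ...} shapes shown above):

```python
import json

# Unwrap an MCP tool response: return the result value on success,
# raise on the {"error": ...} shape.
def unwrap_tool_response(response_text):
    payload = json.loads(response_text)
    if "error" in payload:
        raise RuntimeError(payload["error"])
    return payload.get("result")
```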
Example: Operation with Interface
dyngle:
operations:
get-weather:
description: Get current weather for a city
accepts:
city:
type: string
returns: weather-info
steps:
- curl -s "https://api.example.com/weather?city={{city}}" => weather-info
An AI assistant can call this tool with the interface parameters:
{
"tool": "get-weather",
"city": "San Francisco"
}
And receive:
{
"result": {
"temperature": 72,
"conditions": "Sunny",
"humidity": 65
}
}
Example: Operation without Interface
dyngle:
operations:
run-backup:
description: Run the nightly backup process
returns: result
steps:
- /usr/local/bin/backup.sh => result
An AI assistant can call this tool with no parameters:
{
"tool": "run-backup"
}
Configuring Claude Desktop
To use Dyngle operations with Claude Desktop, configure the MCP server in Claude's configuration file.
macOS Configuration
Edit or create ~/Library/Application Support/Claude/claude_desktop_config.json:
{
"mcpServers": {
"dyngle": {
"command": "dyngle",
"args": ["--config", "/absolute/path/to/your/project/.dyngle.mcp.yml", "mcp"]
}
}
}
Important:
- Use absolute paths for the configuration file
- Restart Claude Desktop completely after editing (not just close the window)
- Tools appear in Claude's "Search and tools" interface
Example with Project-Specific Config
For a project at /Users/alice/projects/myapp:
{
"mcpServers": {
"myapp": {
"command": "dyngle",
"args": ["--config", "/Users/alice/projects/myapp/.dyngle.mcp.yml", "mcp"]
}
}
}
Configuration File Search
If you don't specify a --config option, Dyngle follows the standard configuration file search order. See Configuration for details. However, it's recommended to always use --config with MCP to ensure the correct operations are exposed.
Design Considerations
Use Descriptions and Interfaces
Operations exposed via MCP should have clear descriptions and, where appropriate, explicit interface definitions:
dyngle:
operations:
deploy-app:
description: Deploy the application to a specified environment
accepts:
environment:
type: string
version:
type: string
steps:
- sub: build
- aws s3 sync ./dist s3://{{environment}}-bucket/{{version}}/
The description helps AI assistants understand when to use the tool, and the interface makes the expected inputs explicit. See Operations for full interface syntax details.
Return Values and Interfaces
Operations used as tools should return meaningful values and use interfaces to define expected inputs:
dyngle:
operations:
check-status:
description: Check deployment status
accepts:
service:
type: string
returns: status
steps:
- curl -s "https://api.example.com/status?service={{service}}" => status
This allows AI assistants to understand what inputs are needed and incorporate the result into their responses.
Private Operations for Secrets
Use private operations to protect sensitive operations:
dyngle:
operations:
get-credentials:
access: private
returns: creds
steps:
- aws secretsmanager get-secret-value --secret-id api-creds => creds
make-api-call:
description: Call the API with authentication
steps:
- sub: get-credentials
receive: creds
- curl -H "Authorization: {{creds}}" https://api.example.com/data
The get-credentials operation won't be exposed to AI assistants.
Example Configuration
Create a .dyngle.mcp.yml file with operations designed for AI assistant use:
dyngle:
operations:
# Development workflow
run-tests:
description: Run the test suite for a specific module
accepts:
module: { type: string }
returns: test-results
steps:
- pytest {{module}} --json-report => results
- results -> jq '.summary' => test-results
# Information queries
get-version:
description: Get version from package.json
accepts:
package: { type: string }
returns: version
steps:
- cat {{package}}/package.json => pkg
- pkg -> jq -r '.version' => version
# System operations
check-service:
description: Check the status of a specific application service
accepts:
service: { type: string }
returns: status
steps:
- systemctl status {{service}} => output
- echo "Service {{service}} is running" => status
See Use Cases for more examples of operations suitable for MCP integration.
Troubleshooting
Server Not Showing Up
- Check JSON syntax - Validate claude_desktop_config.json
- Verify Dyngle is in PATH - Run which dyngle (macOS/Linux) or where dyngle (Windows)
- Use full path - Try the full path to the dyngle executable in the command field
- Restart Claude Desktop - Use Cmd+Q (macOS) or quit from the system tray (Windows)
Checking Logs (macOS)
Claude Desktop writes logs to ~/Library/Logs/Claude/:
tail -n 20 -f ~/Library/Logs/Claude/mcp*.log
Next Steps
Configuration
Dyngle reads configuration from YAML files that define operations, expressions, constants, and other settings.
Configuration File Location
Dyngle searches for configuration files in the following order (first match wins):
- Command-line option: the --config parameter
- Environment variable: DYNGLE_CONFIG
- Current directory: .dyngle.yml
- Home directory: ~/.dyngle.yml
Examples
Using a specific config file:
dyngle --config /path/to/config.yml run my-operation
Using an environment variable:
export DYNGLE_CONFIG=/path/to/config.yml
dyngle run my-operation
Configuration Structure
A basic configuration file has this structure:
dyngle:
operations:
# Your operations here
expressions:
# Global expressions (optional)
constants:
# Global constants (optional)
Imports
Configuration files can import other configuration files, allowing you to organize your operations across multiple files and share common configurations.
Basic Import
Import other configuration files using the imports: key:
dyngle:
imports:
- ~/.dyngle.yml
- ./common-operations.yml
operations:
# Operations defined here
Import Behavior
- Imports are loaded in the order specified
- Later imports override earlier ones in case of name conflicts
- Operations and expressions in the main file override all imports
- Imports are recursive - imported files can import other files
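For example (file and operation names are illustrative), if both an imported file and the main file define a build operation, the main file's definition wins:

```yaml
# shared.yml
dyngle:
  operations:
    build:
      - npm run build

# .dyngle.yml -- this build overrides the one imported from shared.yml
dyngle:
  imports:
    - ./shared.yml
  operations:
    build:
      - npm run build -- --production
```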
Use Cases
User-level configuration:
Create a ~/.dyngle.yml with common operations, then import it in project-specific configs:
# ~/.dyngle.yml
dyngle:
operations:
cleanup:
- rm -rf .venv
- rm -rf node_modules
# project/.dyngle.yml
dyngle:
imports:
- ~/.dyngle.yml
operations:
build:
- npm install
- npm run build
Shared team operations:
# team-shared.yml
dyngle:
operations:
deploy-staging:
- sub: build
- aws s3 sync ./dist s3://staging-bucket/
# developer's local config
dyngle:
imports:
- ./team-shared.yml
operations:
dev:
- npm run dev
Next Steps
CLI Commands
Dyngle provides several commands for working with operations.
run
Execute a named operation from your configuration.
Syntax
dyngle run <operation-name> [optional arguments...]
Examples
Basic execution:
dyngle run hello
Pass command-line arguments that become available throughout the entire operation tree:
dyngle run greet Alice Bob
The run command supports positional command-line arguments as a convenience for common desktop operations. They can be accessed using the runtime.args context path.
dyngle:
operations:
greet:
expressions:
name: "get('runtime.args.0') or 'World'"
steps:
- echo "Hello {{name}}!"
With data from stdin:
echo "name: Alice" | dyngle run hello
With a specific config file:
dyngle --config /path/to/.dyngle.yml run deploy
Options
--display <mode>
Control step display behavior:
- steps (default) - Show each step before executing
- none - Suppress step display for cleaner output
dyngle run build --display none
When to Use Each Mode
Use steps mode when:
- Debugging operations - See exactly what commands are being executed
- Learning - Understand what's happening during execution
- Development - Verify that template substitution is working correctly
- Interactive use - Get visual confirmation of progress
Use none mode when:
- Scripting - Cleaner output for parsing or processing
- Production workflows - Reduce noise in logs
- Return value focused - When you only care about the final result
- Automated systems - CI/CD environments where step display is unnecessary
Examples
Development workflow with step display:
dyngle run test --display steps
Useful for seeing exactly what test commands are being run.
Production deployment with clean output:
dyngle run deploy --display none
Keeps deployment logs focused on command output without displaying each step.
Combining with return values:
When an operation has a return value, none mode is particularly useful:
dyngle:
operations:
get-version:
returns: version
steps:
- cat package.json => pkg
- pkg -> jq -r '.version' => version
With steps mode:
$ dyngle run get-version --display steps
Output:
$ cat package.json => pkg
$ pkg -> jq -r '.version' => version
1.2.3
With none mode:
$ dyngle run get-version --display none
Output:
1.2.3
Important Notes
- The --display option controls whether step commands are shown before execution
- This works independently of whether the operation has a returns: key (which controls stdout suppression)
- stderr is always displayed regardless of the --display setting
- If you don't specify --display, it defaults to steps mode
list-operations
List all available public operations with their descriptions.
Syntax
dyngle list-operations
Output Format
The command outputs a YAML-formatted list of operations:
operations:
build: Build the project for production
test: Run the test suite
deploy: Deploy to production
Behavior
- Shows only public operations (not those with access: private)
- Includes the description: attribute if present
- Operations without descriptions show empty descriptions
See Operations for information about public vs private operations.
mcp
Start Dyngle as an MCP (Model Context Protocol) server, exposing operations as tools for AI assistants.
Syntax
dyngle mcp
This starts a server using the stdio (standard input/output) transport, which is suitable for integration with Claude Desktop and other AI assistants.
Configuration File
Specify a configuration file for the MCP server:
dyngle --config /path/to/.dyngle.yml mcp
See MCP Server for complete setup and usage information.
Global Options
These options work with any command:
--config <path>
Specify a configuration file:
dyngle --config ./custom.yml run hello
Next Steps
Use Cases
Real-world examples demonstrating how to use Dyngle for various tasks. These examples show practical patterns for building workflows, processing data, and integrating with AI assistants.
Development Workflows
Build Pipeline
Compose multiple build steps into a single operation:
dyngle:
operations:
install-deps:
access: private
steps:
- npm install
compile:
access: private
steps:
- npm run build
test:
access: private
steps:
- npm test
build:
description: Full build pipeline
steps:
- sub: install-deps
- sub: compile
- sub: test
Users run dyngle run build to execute the complete pipeline, while individual steps remain private.
Testing Module
Expose a test operation via MCP for AI assistants to run tests:
dyngle:
operations:
run-tests:
description: Run the test suite for a specific module
accepts:
module: { type: string }
returns: test-results
steps:
- pytest {{module}} --json-report => results
- results -> jq '.summary' => test-results
An AI assistant can run tests and understand the results to help debug issues.
Continuous Integration
Build a complete CI/CD pipeline:
dyngle:
operations:
install-dependencies:
access: private
steps:
- npm install
run-tests:
access: private
steps:
- npm test
build-artifacts:
access: private
steps:
- npm run build
upload-artifacts:
access: private
steps:
- aws s3 sync ./dist s3://my-bucket/
ci-pipeline:
description: Run full CI/CD pipeline
steps:
- sub: install-dependencies
- sub: run-tests
- sub: build-artifacts
- sub: upload-artifacts
Version Management
Query and manage versions across a project:
dyngle:
operations:
get-version:
description: Get version from package.json
returns: version
steps:
- cat package.json -> jq -r '.version' => version
tag-release:
description: Tag a release with the current version
steps:
- sub: get-version
receive: ver
- git tag "v{{ver}}"
- git push origin "v{{ver}}"
Data Processing
API Data Pipeline
Fetch and process data from an API:
dyngle:
operations:
get-user-emails:
description: Extract email addresses from user API
returns: emails
steps:
- curl -s "https://api.example.com/users" => users
- users -> jq -r '.[].email' => emails
Log Analysis
Process log files to extract insights:
dyngle:
operations:
analyze-logs:
description: Count errors in today's logs
returns: summary
steps:
- curl -s "https://logs.example.com/today" => logs
- logs -> grep "ERROR" => errors
- errors -> wc -l => error-count
- echo "Found {{error-count}} errors" => summary
Data Transformation
Transform JSON data through multiple stages:
dyngle:
operations:
fetch-data:
access: private
returns: raw
steps:
- curl -s "https://api.example.com/data" => raw
transform-data:
access: private
accepts:
input: { type: string }
returns: output
steps:
- input -> jq '.items | map({id, name})' => output
process-all:
description: Fetch and transform data
returns: final
steps:
- sub: fetch-data
receive: data
- sub: transform-data
send: payload
receive: final
constants:
payload:
input: "{{data}}"
Operations and Deployment
Environment Deployment
Deploy to different environments with reusable setup:
dyngle:
operations:
setup-env:
access: private
steps:
- echo "Setting up environment..."
- export NODE_ENV=production
deploy-frontend:
description: Deploy frontend application
steps:
- sub: setup-env
- npm run deploy:frontend
deploy-backend:
description: Deploy backend services
steps:
- sub: setup-env
- npm run deploy:backend
Multi-Environment Deployment
Deploy with environment-specific configuration:
dyngle:
operations:
deploy:
description: Deploy application to specified environment
accepts:
environment:
type: string
version:
type: string
steps:
- sub: build
- aws s3 sync ./dist s3://{{environment}}-bucket/{{version}}/
- echo "Deployed v{{version}} to {{environment}}"
build:
access: private
steps:
- npm run build
Service Health Check
Check service status and return structured information:
dyngle:
operations:
check-service:
description: Check the status of a specific application service
accepts:
service: { type: string }
returns: status
steps:
- systemctl status {{service}} => output
- echo "Service {{service}} is running" => status
Perfect for MCP integration - AI assistants can check service health.
Secret Management
Secure Credential Access
Protect operations that handle secrets:
dyngle:
operations:
get-api-token:
access: private
returns: token
steps:
- aws secretsmanager get-secret-value --secret-id api-token => secret
- secret -> jq -r '.SecretString' => token
call-api:
description: Make authenticated API call
steps:
- sub: get-api-token
receive: token
- curl -H "Authorization: Bearer {{token}}" https://api.example.com/data
The get-api-token operation is private so it can't be run directly or exposed via MCP, preventing accidental token exposure.
Database Credentials
Safely manage database credentials:
dyngle:
operations:
get-db-credentials:
access: private
returns: creds
steps:
- aws secretsmanager get-secret-value --secret-id db-creds => creds
backup-database:
description: Create database backup
steps:
- sub: get-db-credentials
receive: creds
- pg_dump -h {{creds.host}} -U {{creds.user}} > backup.sql
AI Assistant Integration (MCP)
Code Analysis
Expose operations that help AI assistants understand codebases:
dyngle:
operations:
get-package-info:
description: Get package information from package.json
returns: info
steps:
- cat package.json => pkg
- pkg -> jq '{name, version, description, dependencies}' => info
get-dependencies:
description: List all project dependencies
returns: deps
steps:
- cat package.json -> jq '.dependencies | keys' => deps
Project Information
Provide project context to AI assistants:
dyngle:
operations:
project-summary:
description: Get a summary of the project structure
returns: summary
expressions:
summary:
name: "get('pkg')['name']"
version: "get('pkg')['version']"
files: "get('file-count')"
steps:
- cat package.json => pkg
- find . -type f | wc -l => file-count
Development Environment Status
Help AI assistants understand the current development state:
dyngle:
operations:
dev-status:
description: Check development environment status
returns: status
expressions:
status:
git-branch: "get('branch')"
uncommitted-changes: "get('changed-files')"
node-version: "get('node-ver')"
steps:
- git branch --show-current => branch
- git status --short | wc -l => changed-files
- node --version => node-ver
Content Management
Document Processing
Process markdown or text files:
dyngle:
operations:
count-words:
description: Count words in a markdown file
accepts:
file: { type: string }
returns: count
steps:
- cat {{file}} => content
- content -> wc -w => count
extract-headings:
description: Extract headings from markdown
accepts:
file: { type: string }
returns: headings
steps:
- cat {{file}} => content
- content -> grep "^#" => headings
Site Building
Build and deploy static sites:
dyngle:
operations:
build-site:
access: private
steps:
- hugo build
deploy-site:
description: Build and deploy the site
steps:
- sub: build-site
- aws s3 sync ./public s3://my-site-bucket/
- aws cloudfront create-invalidation --distribution-id DIST123 --paths "/*"
System Administration
Backup Operations
Automated backup with composition:
dyngle:
operations:
backup-files:
access: private
steps:
- tar -czf backup-$(date +%Y%m%d).tar.gz ./data
upload-backup:
access: private
steps:
- aws s3 cp backup-*.tar.gz s3://backup-bucket/
cleanup-old:
access: private
steps:
- find . -name "backup-*.tar.gz" -mtime +7 -delete
backup:
description: Complete backup workflow
steps:
- sub: backup-files
- sub: upload-backup
- sub: cleanup-old
Service Management
Manage application services:
dyngle:
operations:
restart-app:
description: Restart application services
steps:
- systemctl stop myapp
- sleep 5
- systemctl start myapp
- systemctl status myapp
Best Practices Demonstrated
Composition Over Complexity
Break complex operations into smaller, focused operations:
Good:
dyngle:
operations:
validate:
access: private
steps:
- python validate.py
transform:
access: private
steps:
- python transform.py
upload:
access: private
steps:
- aws s3 sync ./output s3://bucket/
process:
description: Complete processing workflow
steps:
- sub: validate
- sub: transform
- sub: upload
Avoid:
dyngle:
operations:
process:
- python validate.py
- python transform.py
- aws s3 sync ./output s3://bucket/
While the "avoid" version works, the composed version is more testable, maintainable, and allows reuse of individual steps.
Clear Interfaces
Define explicit interfaces for operations:
dyngle:
operations:
deploy-service:
description: Deploy a service to an environment
accepts:
service-name: { type: string }
environment: { type: string }
version: { type: string }
steps:
- echo "Deploying {{service-name}} v{{version}} to {{environment}}"
This serves as documentation and provides validation.
Return Meaningful Data
Operations exposed via MCP should return structured data:
dyngle:
operations:
analyze-project:
description: Analyze project health
returns: analysis
expressions:
analysis:
test-coverage: "get('coverage')"
lint-errors: "get('errors')"
build-status: "get('status')"
steps:
- pytest --cov --json-report => cov-report
- cov-report -> jq '.coverage' => coverage
- eslint . --format json => lint-report
- lint-report -> jq '.[] | .errorCount' => errors
- echo "healthy" => status
Next Steps
Best Practices
Guidelines for writing effective Dyngle operations. These practices help create operations that are reliable, maintainable, and work correctly across different execution contexts.
Operation Design
Use hyphen-separated keys: Always use hyphen-separated strings (like my-variable) for YAML keys in expressions, constants, and operation names. Never use underscore-separated strings (like my_variable). This maintains consistency across your configuration.
Return structured results: Operations should return results as dicts with named fields. Add nested dicts and lists as appropriate to represent the data structure clearly.
Parse string outputs: If a command step produces a string containing structured data, parse it in expressions to return a structured result rather than a raw string.
Accept inputs via schema: Define operation inputs using the accepts: attribute to validate and default input data.
Document field purposes: Add description: to each field in your accepts: schema. Field descriptions are passed through to MCP tools and help users understand parameters in dynamic contexts like APIs or MCP servers.
Output troubleshooting info: Operations may optionally write useful troubleshooting information to stdout during execution.
Expression Usage
Use expressions for logic: Use Dyngle expressions for logic and string or data manipulation. Avoid bash -c and python -c command steps.
Break up long expressions: Avoid long multi-line expressions. Break them into multiple expressions that reference each other using get(), without losing logic or functionality.
Handle missing values: The get() function returns a blank string if the value isn't found. Use the or operator to specify defaults: get('value') or 'default'.
Reference expressions with get(): When one expression references another expression's value, always use get('expression-name'). Direct variable references only work for input values and context data, not for other expressions.
Use YAML objects over Python dicts: When defining structured data in expressions, use YAML object syntax instead of Python dict strings. YAML objects are easier to read, parse, and maintain. For example, instead of result: "{'path': get('abs-path'), 'items': get('items')}", use:
result:
path: get('abs-path')
items: get('items')
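The defaulting pattern get('value') or 'default' is ordinary Python; here is a self-contained sketch, with get() as a stand-in for Dyngle's lookup function, which per the note above returns an empty string for missing values:

```python
# Stand-in context lookup mirroring the documented behavior:
# missing values come back as an empty string, which is falsy,
# so the `or` operator supplies the default.
context = {"name": "Alice"}

def get(path):
    return context.get(path, "")

greeting_name = get("name") or "World"      # present: uses the context value
fallback_name = get("missing") or "World"   # missing: falls back to the default
```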
Control Flow
Use named expressions for conditions: Always define conditions as named expressions rather than inline. This makes operations more readable and testable. For example:
expressions:
is-production: "environment == 'production'"
steps:
- if: is-production
then:
- echo "Production deployment"
Prefer conditionals over complex expressions: When logic becomes complex, use conditional steps with if/then/else rather than trying to handle all cases in expressions.
Keep conditional blocks focused: Each conditional block should handle a single logical decision. Use nested conditionals for multiple decision points rather than combining logic into complex expressions.
Path Management
Prefer absolute paths: Operations may run in daemons, cron jobs, or other non-interactive contexts. Prefer absolute paths over relative paths for file operations.
Convert relative to absolute: Many command steps interpret path arguments relative to the current working directory. Operations should contain logic to convert relative paths to absolute paths.
Use path functions: Use getcwd() and PurePath() for path manipulation in expressions.
Handle environment variables carefully: Environment variables like HOME can be accessed via getenv() when needed, but operations should not assume they exist in all execution contexts.
Example
This example demonstrates several best practices: accepting inputs with defaults, converting relative paths to absolute, parsing command output into structured data, and returning a well-structured result.
list-directory:
description: List contents of a directory with name, type, and size for each item (includes hidden files)
accepts:
path:
type: string
default: "."
description: Directory path to list (relative or absolute)
expressions:
# Convert relative path to absolute path
is-absolute: "get('path').startswith('/')"
relative-to-abs: "str(PurePath(getcwd()) / get('path'))"
abs-path: "get('path') if get('is-absolute') else get('relative-to-abs')"
# Parse ls output into structured data
# Each line format: "permissions links owner group size month day time name"
lines: "get('raw-listing').strip().split('\\n')"
filtered-lines: "[line for line in get('lines') if line.strip() and not line.startswith('total')]"
# Extract names from each line (last field)
names: "[line.split()[-1] for line in get('filtered-lines')]"
# Determine types (directory or file based on first character)
types: "['directory' if line.startswith('d') else 'file' for line in get('filtered-lines')]"
# Extract sizes (5th field, 0-indexed position 4)
sizes: "[int(line.split()[4]) if line.split()[4].isdigit() else 0 for line in get('filtered-lines')]"
# Combine into list of dicts
items: "[{'name': n, 'type': t, 'size': s} for n, t, s in zip(get('names'), get('types'), get('sizes'))]"
result:
path: get('abs-path')
items: get('items')
steps:
# Use ls -la to get detailed listing including hidden files
# -l: long format with details
# -a: include hidden files (starting with .)
- ls -la {{abs-path}} => raw-listing
returns: result
Note how the example:
- Uses
accepts:with a default value for the path input - Converts relative paths to absolute using expressions
- Uses hyphen-separated keys consistently throughout
- Breaks complex parsing into multiple small expressions
- Returns a structured dict with named fields using YAML object syntax instead of Python dict strings