Operation Lifecycle
Understanding how Dyngle executes operations helps you design effective workflows.
Execution Steps
When you run an operation, Dyngle follows this lifecycle:
1. Load Configuration
Dyngle locates and loads the configuration file using a specific search order.
See Configuration for the complete search order and details.
2. Process Imports
If the configuration contains imports:, Dyngle loads the imported files recursively.
3. Load Input Data
If stdin is not a tty (e.g., data is piped in), Dyngle parses it as YAML and adds it to the data context.
echo "name: Alice" | dyngle run greet
4. Find the Operation
Dyngle looks up the named operation in the configuration. If not found, it returns an error.
5. Check Access Control
For direct execution via dyngle run, Dyngle verifies the operation is public. Private operations can only be called as sub-operations.
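The access check might look something like this sketch (the attribute name comes from the "Access Control Errors" section below; the function name, the operation-as-dict shape, and the public default are assumptions):

```python
# Hedged sketch of step 5: reject direct execution of private operations.
# The "access" attribute name matches the docs; defaulting to public and
# representing an operation as a dict are assumptions for illustration.
def check_access(operation, direct=True):
    if direct and operation.get("access", "public") == "private":
        raise PermissionError(
            "operation is private; call it as a sub-operation"
        )
```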
6. Initialize Data Context
Dyngle builds the initial data context by merging (in order of precedence):
- Global values from dyngle: values:
- Global expressions from dyngle: expressions:
- Operation-specific values
- Operation-specific expressions
- Input data from stdin
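As a sketch, the merge can be thought of as successive dict updates over the sources listed above. This assumes later sources override earlier ones on key collisions; the function and parameter names are hypothetical.

```python
# Illustrative sketch of step 6: merge the context sources in list order,
# assuming later sources take precedence on duplicate keys.
def build_context(global_values, global_exprs, op_values, op_exprs, stdin_data):
    context = {}
    for source in (global_values, global_exprs, op_values, op_exprs, stdin_data):
        context.update(source)  # later sources win on key collisions
    return context
```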
7. Execute Steps
For each step in the operation:
Template Rendering
Dyngle renders templates in the step, replacing {{variable}} syntax with values from the data context.
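A minimal sketch of this substitution, assuming a simple regex-based engine (Dyngle's actual template engine may differ) and raising on undefined variables as described under "Template Errors":

```python
import re

# Hedged sketch of template rendering: replace {{variable}} with values
# from the data context, failing loudly on an undefined variable.
def render(template, context):
    def substitute(match):
        name = match.group(1)
        if name not in context:
            raise KeyError(f"undefined variable: {name}")
        return str(context[name])
    return re.sub(r"\{\{\s*(\w+)\s*\}\}", substitute, template)
```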
Parse Step
Dyngle parses the step to identify:
- Data input operator (->)
- Command and arguments
- Data assignment operator (=>)
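Parsing these three parts could be sketched as below. The exact step grammar is an assumption here: this version splits off an optional `input ->` prefix and an optional `=> name` suffix, leaving the command in the middle.

```python
# Simplified sketch of step parsing (the precise grammar is an assumption):
# "input -> command => name", where the input and assignment are optional.
def parse_step(step):
    stdin_value = None
    assign_to = None
    if "->" in step:
        stdin_value, step = step.split("->", 1)
        stdin_value = stdin_value.strip()
    if "=>" in step:
        step, assign_to = step.rsplit("=>", 1)
        assign_to = assign_to.strip()
    return stdin_value, step.strip(), assign_to
```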
Execute Command
Dyngle runs the command using Python's subprocess.run():
- If there's a data input operator, the value is passed to stdin
- stdout and stderr are captured
- The exit code is checked
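Since the docs state that commands run via Python's subprocess.run(), the execution step can be sketched like this. The function name, the shlex-style argument splitting, and raising RuntimeError on failure are simplifying assumptions.

```python
import shlex
import subprocess

# Sketch of step 7's command execution using subprocess.run(), per the docs.
# Argument splitting and the exact error behavior are assumptions.
def run_step(command, stdin_value=None):
    result = subprocess.run(
        shlex.split(command),
        input=stdin_value,      # data input operator value, if any
        capture_output=True,    # capture stdout and stderr
        text=True,
    )
    if result.returncode != 0:  # a non-zero exit code stops the operation
        raise RuntimeError(result.stderr)
    return result.stdout
```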
Update Data Context
If there's a data assignment operator, stdout from the command is stored in the data context with the specified name.
8. Handle Return Value
If the operation has a return: key:
- Dyngle looks up the specified value in the data context
- For dyngle run, it formats and displays the value
- For sub-operations, the value is made available to the parent
- For MCP tools, the value is returned as {"result": <value>}
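The return-value dispatch might be sketched as follows; the function name and the caller labels are hypothetical, while the `{"result": <value>}` shape for MCP tools comes from the docs above.

```python
# Illustrative sketch of step 8: look up the return: key in the final data
# context and shape the result per caller (caller labels are assumptions).
def handle_return(context, return_key, caller="cli"):
    if return_key not in context:
        raise KeyError(f"unknown return value: {return_key}")
    value = context[return_key]
    if caller == "mcp":
        return {"result": value}  # MCP tools get a wrapped payload
    return value                  # CLI and sub-operations get the raw value
```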
Data Context Evolution
The data context changes throughout execution:
dyngle:
values:
base: Initial
operations:
example:
values:
local: Added
expressions:
computed: "'Calculated'"
steps:
- echo "first" => step1 # Adds 'step1' to context
- echo "second" => step2 # Adds 'step2' to context
Data context evolution:
- Start: {base: "Initial"}
- After init: {base: "Initial", local: "Added", computed: "Calculated"}
- After step 1: {base: "Initial", local: "Added", computed: "Calculated", step1: "first"}
- After step 2: {base: "Initial", local: "Added", computed: "Calculated", step1: "first", step2: "second"}
Sub-operation Lifecycle
When a step calls a sub-operation, scoping rules determine what data is shared between parent and child operations.
See Sub-operations for detailed explanation of declared values vs live data scoping.
Error Handling
Command Failures
If a command returns a non-zero exit code:
- Dyngle stops execution
- stderr is displayed
- The operation returns an error
- For MCP tools, returns {"error": "<message>"}
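Put together, the failure path could be sketched like this (function name and the exit-code fallback message are assumptions; the `{"error": ...}` shape for MCP tools is from the list above):

```python
import shlex
import subprocess

# Hedged sketch of command-failure handling: stop on a non-zero exit code,
# surface stderr, and shape the MCP error payload described in the docs.
def run_or_error(command):
    result = subprocess.run(
        shlex.split(command), capture_output=True, text=True
    )
    if result.returncode != 0:
        message = result.stderr.strip() or f"exit code {result.returncode}"
        return {"error": message}
    return {"result": result.stdout}
```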
Template Errors
If a template references an undefined variable:
- Dyngle raises an error
- Execution stops
- The error message identifies the missing variable
Access Control Errors
If you try to run a private operation directly:
- Dyngle checks the access: attribute
- If private, returns an error immediately
- Execution never starts
Performance Considerations
Command Execution
Each step spawns a new subprocess:
- Minimal overhead for simple commands
- Consider batching operations when possible
Template Rendering
Templates are rendered for each step:
- Efficient for small datasets
- Large data structures are passed by reference in the context
Data Flow
Data captured with => is held in memory:
- Suitable for typical command output
- Be cautious with very large outputs (e.g., large files)