The 'Dynamic Pipeline' Pattern: Mutable Method Chaining for Real-time Processing
What is the Dynamic Pipeline?
The Dynamic Pipeline is a method‑chaining pattern that lets you add, remove, and update processing steps at runtime while keeping a clean, fluent API.
                                   Processor (Mutable)
                          +------------------------------------+
                          |                                    |
Input ((Data)) ---------> |   Filter A → Filter B → Filter C   | ---------> Output ((Data))
                          |                                    |
                          +------------------------------------+
                                 ^          ^          ^
                                 |          |          |
User Settings / Events ----------+----------+----------+
                          (add / update / remove)
Example usage
// Incrementally add processing steps
const processor = new Processor()
.addFilterA(param1) // + Process A
.addFilterB(param2) // + Process B
.addFilterC(param3); // + Process C
// Update parameters later
processor.updateFilterA(newParam);
// Subtract (remove) a process
processor.removeFilter('B');
Key Characteristics
| Characteristic | Description |
|---|---|
| Addition (add) | New processing steps can be appended at any time. |
| Subtraction (remove) | Existing steps can be detached dynamically. |
| Update (update) | Parameters for specific steps can be modified without rebuilding the whole pipeline. |
| Order‑sensitive | The sequence of addition determines the execution order. |
| Immediate use | No .build() call is required; the processor is always in an executable state. |
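To make the last two rows concrete, here is a minimal sketch, assuming an execute method like the ones shown in the implementation section below and some rawData input:

// No build() step: the processor is executable immediately after each call.
// Execution order follows the order in which the filters were added.
const resultAB = new Processor().addFilterA(1).addFilterB(2).execute(rawData);
const resultBA = new Processor().addFilterB(2).addFilterA(1).execute(rawData);
// resultAB and resultBA can differ, because A→B and B→A are different pipelines.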
A Metaphor: Human Growth and Experience
const person = new Person()
.addEducation('University')
.addSkill('Programming')
.addExperience('Living abroad')
.addTrauma('Major failure');
Just as humans evolve, you “add” abilities and experiences. Life then shapes us further:
// Forgetting a skill (remove)
person.removeSkill('Programming');
// Leveling up through practice (update)
person.updateSkill('Programming', { level: 'expert' });
// Losing something precious (remove)
person.removeExperience('Living abroad');
Depending on the context, you can exercise, update, or let go of these attributes.
Comparison with Existing Patterns
| Pattern | Operation | Order relevance | Runtime mutation |
|---|---|---|---|
| Builder | Configure | Irrelevant | Immutable after .build() |
| Decorator | Wrap | Relevant | Difficult to “unwrap” once applied |
| Middleware | Register | Relevant | Usually static after initial registration |
| RxJS pipe | Transform | Relevant | Immutable (always returns a new instance) |
| Chain of Responsibility | Link | Relevant | “One handles and stops the chain” |
| Dynamic Pipeline | Add / Remove / Update | Relevant | Fully mutable |
The advantage of the Dynamic Pipeline is that it balances a structured, ordered pipeline with the flexibility of runtime adjustments.
How I Used It: Stroke Stabilization
In a drawing application, stroke stabilization applies smoothing filters (e.g., convolution-based Gaussian smoothing or Kalman filtering) to raw pointer input. These filters must react to user interaction.
const pointer = new StabilizedPointer()
.addNoiseFilter(1.5)
.addKalmanFilter(0.1)
.addGaussianFilter(5);
// Adjusting settings via the UI
settingsPanel.onChange(settings => {
pointer.updateNoiseFilter(settings.noiseThreshold);
if (!settings.useSmoothing) {
pointer.removeFilter('gaussian');
}
});
// Dynamic adjustments based on pen velocity
pointer.onVelocityChange(velocity => {
if (velocity > 500) {
// Prioritize performance by removing heavy filters during high‑speed motion
pointer.removeFilter('gaussian');
} else {
pointer.addGaussianFilter(5);
}
});
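One detail to watch in a handler like this: while the velocity stays below the threshold, addGaussianFilter runs on every event. A small guard avoids piling up duplicate steps; hasFilter here is a hypothetical helper, not part of the API shown above:

pointer.onVelocityChange(velocity => {
  if (velocity > 500) {
    pointer.removeFilter('gaussian');
  } else if (!pointer.hasFilter('gaussian')) {
    // Re-add the Gaussian filter only if it is not already in the pipeline.
    pointer.addGaussianFilter(5);
  }
});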
The pattern is not limited to this domain; any situation that needs a mutable, ordered processing chain can benefit.
Possible Implementation Strategies
1. Array‑based Pipeline Management
Maintain the pipeline in a simple array. This naturally preserves the order of addition and makes iteration straightforward.
class Processor {
constructor() {
this.steps = []; // [{ id: 'A', fn: filterA }, …]
}
addFilterA(param) {
this.steps.push({ id: 'A', fn: data => filterA(data, param) });
return this;
}
// …other add/remove/update methods…
execute(input) {
return this.steps.reduce((data, step) => step.fn(data), input);
}
}
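As one possible shape for the elided methods, remove and update can operate directly on the array. These would slot into the Processor class above (filterA is the same assumed filter function):

removeFilter(id) {
  // Drop every step whose id matches; the order of the remaining steps is preserved.
  this.steps = this.steps.filter(step => step.id !== id);
  return this;
}
updateFilterA(param) {
  // Swap the stored function so the new parameter takes effect on the next execute().
  const target = this.steps.find(step => step.id === 'A');
  if (target) target.fn = data => filterA(data, param);
  return this;
}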
2. Identification via Type or Unique ID
Each step can be identified by a type string ('validation', 'transform') or a unique identifier (UUID). Coupled with a map from id to position, locating the step to remove or update becomes an O(1) lookup (the array splice itself is still linear).
class Processor {
constructor() {
this.steps = []; // ordered list
this.index = new Map(); // id → position in steps
}
addStep(id, fn) {
this.steps.push({ id, fn });
this.index.set(id, this.steps.length - 1);
return this;
}
removeStep(id) {
const pos = this.index.get(id);
if (pos !== undefined) {
this.steps.splice(pos, 1);
this.index.delete(id);
// re‑index subsequent items…
}
return this;
}
// updateStep, execute, etc.
}
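The two elided pieces are mechanical; here is one possible sketch of each (both would slot into the Processor class above):

// Inside removeStep, right after the splice: every later step shifted left by one.
for (let i = pos; i < this.steps.length; i++) {
  this.index.set(this.steps[i].id, i);
}

// updateStep reuses the index for an O(1) lookup and swaps the function in place.
updateStep(id, fn) {
  const pos = this.index.get(id);
  if (pos !== undefined) {
    this.steps[pos].fn = fn;
  }
  return this;
}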
3. Mutable-feeling API over an Immutable Core
Expose a fluent façade that reads like the mutable version while keeping the internal representation immutable (copy-on-write): every operation returns a new Processor instead of mutating in place. This gives the ergonomics of method chaining without the pitfalls of shared mutable state.
class Processor {
constructor(steps = []) {
this._steps = steps; // never mutated directly
}
addFilterA(param) {
const newStep = data => filterA(data, param);
return new Processor([...this._steps, { id: 'A', fn: newStep }]);
}
// remove / update return new Processor instances
}
You can then decide whether you want true mutability (as in the first two strategies) or a functional style that still feels mutable to the caller.
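From the caller's side, the functional style differs only in that each call's result must be kept. A rough illustration, assuming addFilterB follows the same copy-on-write shape as addFilterA:

// Copy-on-write variant: every operation returns a fresh Processor,
// so the caller reassigns instead of mutating in place.
let processor = new Processor();
processor = processor.addFilterA(1.5);
processor = processor.addFilterB(0.2);
// Earlier instances are untouched, so keeping a reference to one of them
// gives a free snapshot (handy for undo/redo or comparisons).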
Summary
The Dynamic Pipeline pattern gives you:
- Fluent, chainable API – easy to read and write.
- Runtime mutability – add, remove, or update steps on the fly.
- Order preservation – the sequence of calls defines execution order.
- Immediate usability – no separate “build” step is required.
It sits somewhere between the static nature of a Builder and the flexibility of a Middleware/Decorator stack, making it a handy tool for any domain that needs a mutable, ordered processing chain (graphics, data validation, event handling, etc.).
Managing Types with Unique IDs
Using unique identifiers for each type can make the system more robust. Instead of relying on ad‑hoc string comparisons, you can store the identifier alongside the object and look it up when needed.
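A minimal sketch of that idea: define the identifiers once (as string constants or Symbols) and route both registration and lookup through them, so every part of the code shares a single definition of each identifier rather than scattering raw string literals that can drift out of sync. The addFilter signature and gaussianSmooth are illustrative placeholders, not the API from the examples above:

// Hypothetical filter identifiers, defined once and reused everywhere.
const FILTERS = Object.freeze({
  NOISE: 'noise',
  KALMAN: 'kalman',
  GAUSSIAN: 'gaussian',
});

// Registration and lookup both go through the same constants.
// gaussianSmooth stands in for the real filter implementation.
pointer.addFilter(FILTERS.GAUSSIAN, data => gaussianSmooth(data, 5));
pointer.removeFilter(FILTERS.GAUSSIAN);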
Caching the Pipeline as a Single Function
For high‑frequency execution scenarios, it can be beneficial to cache the entire pipeline as a single function whenever the configuration changes. Doing so may reduce runtime overhead, although this approach has not been extensively tested in practice.
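As a rough sketch of that idea (CachedProcessor is just an illustrative name), the composed function can be cached and invalidated whenever the step list changes; whether the indirection pays off depends on how often the pipeline runs versus how often it changes:

class CachedProcessor {
  constructor() {
    this.steps = [];
    this._compiled = null; // cached composition of the whole pipeline
  }
  addStep(id, fn) {
    this.steps.push({ id, fn });
    this._compiled = null; // any structural change invalidates the cache
    return this;
  }
  removeStep(id) {
    this.steps = this.steps.filter(step => step.id !== id);
    this._compiled = null;
    return this;
  }
  execute(input) {
    if (!this._compiled) {
      // Fold all step functions into one function, preserving insertion order.
      this._compiled = this.steps.reduce(
        (composed, step) => data => step.fn(composed(data)),
        data => data
      );
    }
    return this._compiled(input);
  }
}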
Final Thoughts
I’m not certain whether this qualifies as a formal “pattern”—it may be closer to an architectural idiom or simply a common‑sense technique that many developers already employ intuitively.
That said, I have found this approach helpful for my specific use case of bridging a clean, readable API with the need for dynamic, real‑time adjustments.
If there is an established name for this approach, or if you see any potential pitfalls in maintaining a mutable pipeline, I would sincerely appreciate your insights in the comments. Thank you for reading.