Solved: From Dart to TypeScript: ...
Source: Dev.to
Executive Summary
TL;DR: A Dart developer's streaming JSON parser in TypeScript, built with event-emitter callbacks, is not idiomatic for modern TS. This article guides the transition from callback-based patterns to Async Iterators (`for await … of` loops), offering adapter patterns or full refactors for cleaner, more maintainable code.
- Modern TypeScript favors Async Iterators over event-emitter callbacks (`onValue`, `onError`, `onDone`) for handling data streams, resulting in flatter, easier-to-read code.
- The Adapter Pattern lets you wrap an existing callback-based streaming parser with an `AsyncGenerator`, instantly providing an idiomatic `for await … of` API without a full rewrite.
- Refactoring to a native `async function*` is the most idiomatic solution: you can `yield` parsed values directly and use standard `try/catch` for error handling.
- A senior engineer breaks down the shift from event-driven patterns to modern async iterators, guiding a Dart developer on how to write truly "idiomatic" TypeScript for a streaming JSON parser.
A Story Worth Remembering
I recall a 2 AM incident call. The auth-service-v2 was melting down, and the on-call junior dev was completely lost. The service was written in Node.js, but it was structured like a Spring Boot application, complete with dependency-injection containers and factory patterns that felt alien.
The original author was a Java dev who transplanted their patterns into Node. The code worked, but nobody on the team knew how to debug it. It was a ghost ship in our own fleet.
That's the exact feeling I get when I see code that's technically correct but culturally foreign. It's not about being "wrong"; it's about being maintainable for the team you're on.
From Callbacks to Async Iterators
You've come from Dart and built a slick streaming parser in TypeScript. You used an event-emitter style API with `onValue`, `onError`, and `onDone` callbacks. This is a classic, battle-tested pattern, and it's how things were done in Node.js for years.
However, the JavaScript/TypeScript world has evolved significantly with the introduction of async/await. The modern, idiomatic way to handle streams of data isn't through callbacks, but through Async Iterators. They let you treat a stream of data just like an array, using a simple `for await … of` loop. This makes the code cleaner, easier to reason about, and avoids the nesting that can come with callbacks.
Your API isn't bad; it just speaks an older dialect. Let's get you fluent in the modern tongue.
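To make that concrete, here is a minimal sketch (the `numberStream` and `sumStream` names are invented for illustration) showing how a `for await … of` loop consumes an async generator exactly as if it were an array:

```typescript
// A tiny async generator standing in for any data stream
async function* numberStream(): AsyncGenerator<number> {
  yield 1;
  yield 2;
  yield 3;
}

// Consuming it reads just like looping over an array
async function sumStream(): Promise<number> {
  let total = 0;
  for await (const n of numberStream()) {
    total += n;
  }
  return total;
}
```

No callbacks, no listener bookkeeping: the loop pulls one value at a time and ends when the generator is exhausted.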
Three Ways to Tackle This
1️⃣ Quick Patch: Adapter (Non-Destructive)
You don't always have time for a full rewrite, especially if the core logic is complex. The fastest way to make your existing parser feel more idiomatic is to wrap it. Create a function that instantiates your event-based parser and returns an `AsyncGenerator`. This adapter translates the "old" event style into the "new" iterator style without touching the core implementation.
```typescript
import { Readable } from 'node:stream';

// Your existing parser class (simplified)
class StreamingParser {
  constructor() { /* ... */ }
  write(chunk: Buffer | string): void { /* ... */ }
  end(): void { /* ... */ }
  on(event: string, callback: (arg?: unknown) => void): void { /* ... */ }
}

// The adapter function
export async function* parseJsonStream(readable: Readable): AsyncGenerator<unknown> {
  const parser = new StreamingParser();

  // A little queue to buffer values between consumer pulls
  const queue: unknown[] = [];
  let done = false;
  let error: Error | null = null;
  let resolvePromise: () => void = () => {};

  parser.on('value', (value) => {
    queue.push(value);
    resolvePromise();
  });
  parser.on('error', (err) => {
    error = err as Error;
    resolvePromise();
  });
  parser.on('done', () => {
    done = true;
    resolvePromise();
  });

  // Pipe the source readable stream into the parser
  readable.on('data', (chunk) => parser.write(chunk));
  readable.on('end', () => parser.end());

  // Loop forever (not `while (!done)`) so values queued just before
  // the 'done' event are still drained and yielded.
  while (true) {
    while (queue.length > 0) {
      yield queue.shift();
    }
    if (error) throw error;
    if (done) break;
    // Wait for the next event
    await new Promise<void>((resolve) => {
      resolvePromise = resolve;
    });
  }
}
```
Pro Tip: Ship the adapter immediately to provide an idiomatic API for new consumers, while planning a deeper refactor of the core class for a future release. It keeps everyone happy.
2️⃣ Full Refactor: Native Async Generator
The goal is to refactor the parser's internal logic to be a native `async function*`. This eliminates the need for managing event listeners and state manually. The `yield` keyword effectively "pauses" your function and hands a value back to the consumer, resuming only when the consumer asks for the next item in the `for await … of` loop.
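A small, hypothetical demo of that pull-based behavior (the names are invented for illustration): the generator body only runs when the consumer asks for a value, so stopping early means the remaining work never happens.

```typescript
// Records when each value is actually produced
const produced: number[] = [];

async function* lazyNumbers(): AsyncGenerator<number> {
  for (let i = 1; i <= 3; i++) {
    produced.push(i); // runs only when the consumer pulls
    yield i;
  }
}

async function takeOne(): Promise<number | undefined> {
  const it = lazyNumbers();
  const first = await it.next(); // body runs only up to the first yield
  await it.return?.(undefined);  // stop early; the rest never executes
  return first.value;
}
```

After `takeOne()` resolves, `produced` holds a single value: the second and third iterations of the loop were never reached.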
Consumer Code Before (Callback Style)
```typescript
const parser = new StreamingParser();

parser.on('value', (val) => {
  console.log('Got a value:', val);
});
parser.on('error', (err) => {
  console.error('Oh no:', err);
});
parser.on('done', () => {
  console.log('All done!');
});

stream.pipe(parser);
```
Consumer Code After (Idiomatic Async Iterator)
```typescript
try {
  for await (const value of parseJsonStream(stream)) {
    console.log('Got a value:', value);
  }
  console.log('All done!');
} catch (err) {
  console.error('Oh no:', err);
}
```
The refactored consumer code is flat, uses standard try/catch for error handling, and is much easier to follow.
Core Parser Refactor (Conceptual)
```typescript
export async function* parseJsonStream(
  stream: AsyncIterable<Buffer | string>
): AsyncGenerator<unknown> {
  let buffer = '';
  // ... other state variables ...

  for await (const chunk of stream) {
    buffer += chunk.toString();

    // Loop to find and parse complete JSON objects from the buffer
    while (true) {
      const result = findAndParseJsonObject(buffer);
      if (result) {
        yield result.value;                     // Send a value to the consumer
        buffer = buffer.slice(result.endIndex); // Consume the parsed portion
      } else {
        break; // Need more data
      }
    }
  }
  // Handle any remaining data in the buffer...
}
```
3️⃣ Hybrid Approach: Multiple Streams
Sometimes you're not just parsing a single stream; you're combining several. The same async-generator pattern scales nicely: await each source in turn, or interleave them by racing their pending reads with Promise.race. The adapter can be reused for each source, keeping the public API consistent.
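One sketch of such a merge (the `merge` helper is an assumption for illustration, not a standard API): keep one pending `next()` promise per source and `Promise.race` over them, yielding whichever value arrives first.

```typescript
// Merge several async iterables into one stream, yielding values
// in whatever order the sources produce them.
export async function* merge<T>(
  ...sources: AsyncIterable<T>[]
): AsyncGenerator<T> {
  const iters = sources.map((s) => s[Symbol.asyncIterator]());
  // One pending next() promise per live iterator, tagged with its index
  const pending = new Map<number, Promise<{ i: number; res: IteratorResult<T> }>>();
  iters.forEach((it, i) => pending.set(i, it.next().then((res) => ({ i, res }))));

  while (pending.size > 0) {
    // Wait for whichever source resolves first
    const { i, res } = await Promise.race(pending.values());
    if (res.done) {
      pending.delete(i); // this source is exhausted
    } else {
      pending.set(i, iters[i].next().then((r) => ({ i, res: r })));
      yield res.value;
    }
  }
}
```

The consumer still sees a single `for await … of` loop, which is exactly the point: the merging complexity stays hidden behind the same idiomatic API.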
Closing Thoughts
- Adapter = quick win, low-risk, immediate ergonomics.
- Full refactor = long-term maintainability, idiomatic TS, flatter call stacks.
- Hybrid = best of both worlds when dealing with multiple inputs.
Pick the path that matches your team's bandwidth and release cadence. Either way, moving to Async Iterators will make your streaming JSON parser feel native to modern TypeScript, and far easier for the whole team to understand, debug, and extend.
Streaming Data with Async Iterators vs. RxJS
When dealing with multiple streams of data (filtering, mapping, debouncing, etc.), the choice of abstraction matters.
Async Iterators
- Great for simple, linear pipelines.
- Works well when you have a single source of data (e.g., reading a file from prod-db-01).
- Can become unwieldy for real-time analytics pipelines that need to coordinate several streams.
RxJS
RxJS gives you a richer vocabulary (Observables, Operators) for handling complex asynchronous event streams. Think of it as the "enterprise-grade" solution.
```typescript
import { fromEvent } from 'rxjs';
import { map, filter, takeUntil } from 'rxjs/operators';

const parser = new StreamingParser(); // Your original class

// Convert events to observables
// (fromEvent expects an EventEmitter-style target)
const stream$ = fromEvent(parser, 'value');
const done$ = fromEvent(parser, 'done');
const error$ = fromEvent(parser, 'error');

stream$
  .pipe(
    takeUntil(done$), // Stop when 'done' fires
    // ... other powerful operators like filter(), map(), debounceTime() ...
  )
  .subscribe({
    next: (value) => console.log('Got value:', value),
    error: (err) => console.error('Stream error:', err),
    complete: () => console.log('Stream complete!'),
  });

// Surface errors from the 'error' event
error$.subscribe((err) => { throw err; });
```
Warning: Don't reach for RxJS first. It's powerful but has a steep learning curve and adds a significant dependency. Using it for a simple single-stream parser is like wielding a sledgehammer to hang a picture frame. However, when you need to orchestrate multiple complex streams, it becomes a lifesaver.
Recommendation
For your situation, Solution #2 (the native async generator) is the destination. It yields the most idiomatic, maintainable, and modern TypeScript code: something any developer on your team will immediately understand.
Start with Solution #1 if you need to ship a better API today without a big refactor, then complete the migration to the native async generator when bandwidth allows, reaching for RxJS only if multi-stream complexity grows.
"Idiomatic" code isn't just about following rules. It's about empathy: writing code that aligns with the expectations and patterns of the ecosystem so the next person on call at 2 AM (who might be you!) can solve the problem instead of fighting the code's dialect. Welcome to TypeScript; we're glad to have you.
Read the original article on TechResolve.blog
Support my work
If this article helped you, you can buy me a coffee.