Solved: From Dart to TypeScript: ...

Published: March 2, 2026 at 03:59 AM EST
7 min read
Source: Dev.to

🚀 Executive Summary

TL;DR: A Dart developer’s streaming JSON parser in TypeScript, using event‑emitter callbacks, is not idiomatic for modern TS. This article guides the transition from callback‑based patterns to Async Iterators (for await … of loops), offering adapter patterns or full refactors for cleaner, more maintainable code.

  • Modern TypeScript favors Async Iterators over event‑emitter callbacks (onValue, onError, onDone) for handling data streams, resulting in flatter, easier‑to‑read code.
  • The Adapter Pattern lets you wrap an existing callback‑based streaming parser with an AsyncGenerator, instantly providing an idiomatic for await … of API without a full rewrite.
  • Refactoring to a native async function* is the most idiomatic solution: you can yield parsed values directly and use standard try/catch for error handling.
  • A senior engineer breaks down the shift from event‑driven patterns to modern async iterators, guiding a Dart developer on how to write truly “idiomatic” TypeScript for a streaming JSON parser.

A Story Worth Remembering

I recall a 2 AM incident call. The auth-service-v2 was melting down, and the on‑call junior dev was completely lost. The service was written in Node.js, but it was structured like a Spring Boot application—complete with dependency‑injection containers and factory patterns that felt alien.

The original author was a Java dev who transplanted their patterns into Node. The code worked, but nobody on the team knew how to debug it. It was a ghost ship in our own fleet.

That’s the exact feeling I get when I see code that’s technically correct but culturally foreign. It’s not about being “wrong”; it’s about being maintainable for the team you’re on.


From Callbacks to Async Iterators

You’ve come from Dart and built a slick streaming parser in TypeScript. You used an event‑emitter style API with onValue, onError, and onDone callbacks. This is a classic, battle‑tested pattern—how things were done in Node.js for years.

However, the JavaScript/TypeScript world has evolved significantly with the introduction of async/await. The modern, idiomatic way to handle streams of data isn’t through callbacks, but through Async Iterators. They let you treat a stream of data just like an array, using a simple for await … of loop. This makes the code cleaner, easier to reason about, and avoids the nesting that can come with callbacks.

Your API isn’t bad; it just speaks an older dialect. Let’s get you fluent in the modern tongue.
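To make that concrete, here is a minimal, self-contained sketch (the `numbers()` generator is purely illustrative, not part of the parser): an async generator is consumed with the same mental model as iterating an array.

```typescript
// An async generator: values arrive asynchronously, but the consumer
// reads them with a flat, sequential loop instead of callbacks.
async function* numbers(): AsyncGenerator<number> {
  for (const n of [1, 2, 3]) {
    // Simulate an async source (e.g. a network chunk arriving).
    await new Promise((resolve) => setTimeout(resolve, 0));
    yield n;
  }
}

async function main(): Promise<number[]> {
  const seen: number[] = [];
  for await (const n of numbers()) {
    seen.push(n); // No nesting, no onValue/onDone bookkeeping.
  }
  return seen;
}
```

The loop suspends at each iteration until the next value is ready, which is exactly the behavior the callback API had to model by hand.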


Three Ways to Tackle This

1ļøāƒ£ Quick Patch – Adapter (Non‑Destructive)

You don’t always have time for a full rewrite, especially if the core logic is complex. The fastest way to make your existing parser feel more idiomatic is to wrap it. Create a function that instantiates your event‑based parser and returns an AsyncGenerator. This adapter translates the “old” event style into the “new” iterator style without touching the core implementation.

// Your existing parser class (simplified)
class StreamingParser {
  constructor() { /* ... */ }
  write(chunk: Buffer | string) { /* ... */ }
  end() { /* ... */ }
  on(event: string, callback: (...args: any[]) => void) { /* ... */ }
}

// The adapter function
export async function* parseJsonStream(readable: NodeJS.ReadableStream) {
  const parser = new StreamingParser();

  // A little queue to buffer values between parser events and iterator pulls
  const queue: unknown[] = [];
  let done = false;
  let error: Error | null = null;
  let resolvePromise: () => void = () => {};

  parser.on('value', (value) => {
    queue.push(value);
    resolvePromise();
  });

  parser.on('error', (err) => {
    error = err;
    resolvePromise();
  });

  parser.on('done', () => {
    done = true;
    resolvePromise();
  });

  // Pipe the source readable stream into the parser
  readable.on('data', (chunk) => parser.write(chunk));
  readable.on('end', () => parser.end());
  readable.on('error', (err) => {
    error = err;
    resolvePromise();
  });

  while (true) {
    // Drain everything we have before checking for completion,
    // so values emitted just before 'done' are never lost
    while (queue.length > 0) {
      yield queue.shift();
    }

    if (error) {
      throw error;
    }

    if (done) {
      return;
    }

    // Wait for the next event
    await new Promise<void>((resolve) => {
      resolvePromise = resolve;
    });
  }
}

Pro Tip: Ship the adapter immediately to provide an idiomatic API for new consumers, while planning a deeper refactor of the core class for a future release. It keeps everyone happy.
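The bridge the adapter relies on can be demonstrated end to end with a toy source. Everything here (`tinySource`, `toAsyncIterable`, the `Callbacks` shape) is hypothetical and stands in for your real parser, but the queue‑and‑wake technique is the same one shown above.

```typescript
// A hypothetical callback-based source, standing in for the parser.
type Callbacks<T> = {
  onValue: (v: T) => void;
  onDone: () => void;
  onError: (e: Error) => void;
};

function tinySource(cb: Callbacks<number>): void {
  // Emit three values asynchronously, then signal completion.
  setTimeout(() => cb.onValue(1), 0);
  setTimeout(() => cb.onValue(2), 5);
  setTimeout(() => { cb.onValue(3); cb.onDone(); }, 10);
}

// Generic adapter: callbacks in, AsyncGenerator out.
async function* toAsyncIterable<T>(
  start: (cb: Callbacks<T>) => void
): AsyncGenerator<T> {
  const queue: T[] = [];
  let done = false;
  let error: Error | null = null;
  let wake: () => void = () => {};

  start({
    onValue: (v) => { queue.push(v); wake(); },
    onDone: () => { done = true; wake(); },
    onError: (e) => { error = e; wake(); },
  });

  while (true) {
    while (queue.length > 0) yield queue.shift()!;
    if (error) throw error;
    if (done) return;
    await new Promise<void>((resolve) => { wake = resolve; });
  }
}

async function collect(): Promise<number[]> {
  const out: number[] = [];
  // Consumers get the idiomatic loop, never seeing the callbacks.
  for await (const v of toAsyncIterable(tinySource)) out.push(v);
  return out;
}
```

Because events only fire between iterator pulls (JavaScript is single‑threaded), the drain‑then‑await loop cannot miss a value.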


2ļøāƒ£ Full Refactor – Native Async Generator

The goal is to refactor the parser’s internal logic to be a native async function*. This eliminates the need for managing event listeners and state manually. The yield keyword effectively “pauses” your function and hands a value back to the consumer, resuming only when the consumer asks for the next item in the for await … of loop.

Consumer Code Before (Callback Style)

const parser = new StreamingParser();

parser.on('value', (val) => {
  console.log('Got a value:', val);
});
parser.on('error', (err) => {
  console.error('Oh no:', err);
});
parser.on('done', () => {
  console.log('All done!');
});

// Feed the source stream into the parser
stream.on('data', (chunk) => parser.write(chunk));
stream.on('end', () => parser.end());

Consumer Code After (Idiomatic Async Iterator)

try {
  for await (const value of parseJsonStream(stream)) {
    console.log('Got a value:', value);
  }
  console.log('All done!');
} catch (err) {
  console.error('Oh no:', err);
}

The refactored code is flat, uses standard try/catch for error handling, and is much easier to follow.

Core Parser Refactor (Conceptual)

export async function* parseJsonStream(stream) {
  let buffer = '';
  // ... other state variables ...

  for await (const chunk of stream) {
    buffer += chunk.toString();

    // Loop to find and parse complete JSON objects from the buffer
    while (true) {
      const result = findAndParseJsonObject(buffer);

      if (result) {
        yield result.value;               // Send a value to the consumer
        buffer = buffer.slice(result.endIndex); // Consume buffer
      } else {
        break; // Need more data
      }
    }
  }

  // Handle any remaining data in the buffer...
}
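For completeness, here is one hedged way the `findAndParseJsonObject` helper assumed above might look. A real streaming JSON scanner has to track string boundaries and escape sequences; this sketch assumes newline‑delimited JSON (NDJSON), which sidesteps that complexity.

```typescript
interface ParseResult {
  value: unknown;   // The parsed JSON value
  endIndex: number; // How much of the buffer was consumed
}

// NDJSON simplification: one complete JSON value per line.
function findAndParseJsonObject(buffer: string): ParseResult | null {
  const newline = buffer.indexOf("\n");
  if (newline === -1) return null; // Incomplete record: need more data.
  const line = buffer.slice(0, newline);
  // Assumes well-formed input; a production parser would
  // handle blank lines and report syntax errors with position info.
  return { value: JSON.parse(line), endIndex: newline + 1 };
}
```

The `null` return is what lets the caller’s inner `while` loop break and wait for the next chunk.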

3ļøāƒ£ Hybrid Approach – Multiple Streams

Sometimes you’re not just parsing a single stream; you’re combining several. The same async‑generator pattern scales nicely—just await each source in turn or merge them with Promise.race/Promise.all. The adapter can be reused for each source, keeping the public API consistent.
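The merging idea can be sketched with Promise.race, as the paragraph suggests. This `merge` combinator is illustrative, not a library API: each source’s next() is raced, the first value to settle is yielded, and that source is re‑primed.

```typescript
// Merge several async iterables into one, yielding values in
// whatever order the sources produce them.
async function* merge<T>(...sources: AsyncIterable<T>[]): AsyncGenerator<T> {
  const iters = sources.map((s) => s[Symbol.asyncIterator]());
  // One in-flight next() per source, keyed by source index.
  const pending = new Map<number, Promise<{ key: number; res: IteratorResult<T> }>>();
  const prime = (key: number): void => {
    pending.set(key, iters[key].next().then((res) => ({ key, res })));
  };
  iters.forEach((_, i) => prime(i));

  while (pending.size > 0) {
    // Whichever source settles first wins the race.
    const { key, res } = await Promise.race(pending.values());
    pending.delete(key);
    if (!res.done) {
      yield res.value;
      prime(key); // Ask that source for its next value.
    }
    // An exhausted source is simply not re-primed.
  }
}
```

A usage sketch: `for await (const v of merge(streamA, streamB)) { ... }` gives consumers the same flat loop even when the data comes from several places at once.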


Closing Thoughts

  • Adapter = quick win, zero‑risk, immediate ergonomics.
  • Full refactor = long‑term maintainability, idiomatic TS, flatter call‑stacks.
  • Hybrid = best of both worlds when dealing with multiple inputs.

Pick the path that matches your team’s bandwidth and release cadence. Either way, moving to Async Iterators will make your streaming JSON parser feel native to modern TypeScript—and far easier for the whole team to understand, debug, and extend.

Streaming Data with Async Iterators vs. RxJS

When dealing with multiple streams of data—filtering, mapping, debouncing, etc.—the choice of abstraction matters.

Async Iterators

  • Great for simple, linear pipelines.
  • Works well when you have a single source of data (e.g., reading a file from prod‑db‑01).
  • Can become unwieldy for real‑time analytics pipelines that need to coordinate several streams.

RxJS

RxJS gives you a richer vocabulary (Observables, Operators) for handling complex asynchronous event streams. Think of it as the “enterprise‑grade” solution.

import { fromEvent, merge, throwError } from 'rxjs';
import { mergeMap, takeUntil } from 'rxjs/operators';

// Your original class. Note: fromEvent expects an EventEmitter-style
// target (addListener/removeListener, as on Node's EventEmitter).
const parser = new StreamingParser();

// Convert events to observables
const value$ = fromEvent(parser, 'value');
const done$  = fromEvent(parser, 'done');

// Route 'error' events into the observable pipeline instead of
// throwing inside a subscriber
const error$ = fromEvent(parser, 'error').pipe(
  mergeMap((err) => throwError(() => err))
);

merge(value$, error$)
  .pipe(
    takeUntil(done$),               // Complete when 'done' fires
    // ... other powerful operators like filter(), map(), debounceTime() ...
  )
  .subscribe({
    next:     value => console.log('Got value:', value),
    error:    err   => console.error('Stream error:', err),
    complete: ()    => console.log('Stream complete!')
  });

Warning: Don’t reach for RxJS first. It’s powerful, but it has a steep learning curve and adds a significant dependency. Using it for a simple linear pipeline is like swinging a sledgehammer to hang a picture frame. When you genuinely need to orchestrate multiple complex streams, however, it becomes a lifesaver.

Recommendation

For your situation, Solution #2 (the full refactor to a native async generator) is the destination. It yields the most idiomatic, maintainable, and modern TypeScript code—something any developer on your team will immediately understand.

Start with Solution #1 if you need to ship a better API today without a big refactor, then complete the full refactor as bandwidth allows. Reach for RxJS only if you later find yourself coordinating many complex streams.

ā€œIdiomaticā€ code isn’t just about following rules. It’s about empathy—writing code that aligns with the expectations and patterns of the ecosystem so the next person on call at 2 AM (who might be you!) can solve the problem instead of fighting the code’s dialect. Welcome to TypeScript—we’re glad to have you.


👉 Read the original article on TechResolve.blog

