You can use generator functions to achieve the exact same thing without magic strings or bundles. And as a bonus, you can debug it.

At Resonate, we are using generators (both for the TypeScript SDK and the Python SDK) to implement Distributed Async Await, Resonate's Durable Execution Framework. Because generators transfer control back to the caller, you can essentially build your own event loop and weave distributed coordination (RPCs and callbacks) and distributed recovery (restarts) right into the execution with fairly few lines of code.
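
To make the "build your own event loop" idea concrete, here's a minimal sketch (invented names, not Resonate's actual API) of a trampoline that drives a generator: each `yield` hands a command back to the loop, which performs the effect and resumes the generator with the result.

```python
def call(fn, *args):
    # Hypothetical command constructor: "please run fn(*args) for me".
    return ("call", fn, args)

def workflow(n):
    # User code reads like straight-line async code, but each step is a yield.
    doubled = yield call(lambda x: x * 2, n)
    total = yield call(lambda x: x + 1, doubled)
    return total

def run(gen):
    # The "event loop": pull commands out of the generator, execute them,
    # and feed results back in until the generator returns.
    result = None
    while True:
        try:
            op, fn, args = gen.send(result)
        except StopIteration as done:
            return done.value
        result = fn(*args)

print(run(workflow(20)))  # → 41
```

In a real system the loop would dispatch commands to remote workers or a scheduler instead of calling them inline, but the control-flow inversion is the same.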

Disclaimer: I'm the CEO of Resonate

I don't believe you can. I believe what they're trying to do is rewrite the underlying function into separate functions that can be called in any order and combination. That's not possible with generators. With a generator, I can pause between steps, but I can't, say, retry the third step four times until it succeeds, suspend the entire process, and then jump straight to the fourth step. Or I can't run different steps in different processes on different machines depending on load balancing concerns.

I suspect that's why they need to transform the function into individual step functions, and for that they need to do static analysis. And if they're doing static analysis, then all the concerns detailed here and elsewhere apply and you're basically just picking your favourite of a bunch of syntaxes with different tradeoffs. I don't like magic strings, but at least they clearly indicate that magic is happening, which is useful for the developer reading the code.

> With a generator, I can pause between steps, but I can't, say, retry the third step four times until it succeeds, suspend the entire process, and then jump straight to the fourth step.

If I recall correctly, other solutions in this space work by persisting & memoizing the results of the steps as they succeed, so the whole thing can be rerun and anything already completed uses the memoized result.

That's what we do at Resonate: We built Distributed Async Await, basically async await with durability provided by Durable Promises. A Durable Promise is the checkpoint (the memoization device). When the generator restarts, the generator skips what has already been done.
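
A rough sketch of that replay/memoization idea (assumed names, not the actual Resonate API): completed steps are checkpointed, and on restart the generator is re-run from the top, with already-completed yields resolving from the record instead of re-executing.

```python
def durable_run(gen_fn, args, store):
    # `store` maps step index -> result; it stands in for the durable record.
    gen = gen_fn(*args)
    step, result = 0, None
    while True:
        try:
            action = gen.send(result)
        except StopIteration as done:
            return done.value
        if step in store:
            result = store[step]   # replay: skip work already done
        else:
            result = action()      # execute the step for real
            store[step] = result   # checkpoint the outcome
        step += 1

def pipeline(x):
    a = yield (lambda: x + 1)
    b = yield (lambda: a * 10)
    return b

store = {}
durable_run(pipeline, (4,), store)  # first run executes both steps
durable_run(pipeline, (4,), store)  # "restart": both steps come from the store
```

In practice the store would be persisted, so the second call could happen in a fresh process after a crash.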

We don't have workflows and steps though; like async await, it's just functions all the way down.

Disclaimer: I'm the CEO of Resonate

You are partly correct: if each step is another generator function, you can retry the steps until they succeed or fail, and even build a small framework around it.
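
A toy illustration of that (all names made up): the generator yields individual steps as callables, and the driver retries each one a fixed number of times before giving up.

```python
def run_with_retries(gen, attempts=4):
    result = None
    while True:
        try:
            step = gen.send(result)
        except StopIteration as done:
            return done.value
        last_err = None
        for _ in range(attempts):
            try:
                result = step()   # run the step; stop retrying on success
                break
            except Exception as e:
                last_err = e      # remember the failure and try again
        else:
            raise last_err        # all attempts exhausted

flaky_calls = {"n": 0}
def flaky():
    # Fails twice, then succeeds, to simulate a transient error.
    flaky_calls["n"] += 1
    if flaky_calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

def job():
    value = yield flaky
    return value

print(run_with_retries(job()))  # → ok (succeeded on the third attempt)
```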

That still requires the process running the outer generator to stay active the entire time. If that fails, you can't retry from step three. You need some way of starting the outer function part way through.

You can do that manually by defining a bunch of explicit steps, their outputs and inputs, and how to transform the data around. And you could use generators to create a little DSL around that. But I think the idea here is to transform arbitrary functions into durable pipelines.
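
One way to spell that explicit approach out (purely illustrative, not a real framework): steps as data, each reading from and writing to a shared state dict, so a runner can start at any step given persisted state.

```python
steps = [
    # Each step declares what it computes from the accumulated state.
    {"name": "fetch", "run": lambda state: {"raw": state["url"].upper()}},
    {"name": "parse", "run": lambda state: {"parsed": state["raw"].split(":")}},
]

def run_from(steps, state, start=0):
    # Resume partway through: skip steps before `start`, assuming their
    # outputs are already present in `state`.
    for step in steps[start:]:
        state.update(step["run"](state))
    return state

# Fresh run from the beginning:
state = run_from(steps, {"url": "https://example.com"})

# "Restart" in a new process, jumping straight to step 1 with persisted state:
state2 = run_from(steps, {"url": "https://example.com",
                          "raw": "HTTPS://EXAMPLE.COM"}, start=1)
```

The explicitness is the point: the inputs and outputs of each step are visible, at the cost of writing everything as data rather than as an ordinary function.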

Personally, I'd prefer the more explicit DSL-based approach, but I can see why alternatives like this would appeal to some people.

When we built Distributed Async Await, we went a step further: Every time the generator awaits, we "kill" the generator instance (you cannot really kill a generator, but you can let it go out of scope) and create a new one when the awaited computation completes. So in essence, we built resume semantics on top of restart. We were inspired by the paper Crash-Only Software https://dslab.epfl.ch/pubs/crashonly.pdf
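
A compressed sketch of that "resume on top of restart" idea (my assumptions, not the real implementation): whenever a step completes, the old generator instance is dropped and a brand-new one is replayed against the log of results, so recovery after a crash is the same code path as making normal progress.

```python
def step_once(gen_fn, args, log):
    # Fresh instance every time: replay the log, run exactly one new step,
    # then let the generator go out of scope (our stand-in for "killing" it).
    gen = gen_fn(*args)
    result = None
    for recorded in log:          # fast-forward through recorded history
        gen.send(result)
        result = recorded
    try:
        action = gen.send(result)  # first unreplayed yield
    except StopIteration as done:
        return True, done.value
    log.append(action())           # run one real step and record it
    return False, None

def drive(gen_fn, args):
    log = []
    while True:
        done, value = step_once(gen_fn, args, log)
        if done:
            return value

def flow(x):
    a = yield (lambda: x * 3)
    b = yield (lambda: a - 2)
    return b

print(drive(flow, (5,)))  # → 13
```

Because every invocation of `step_once` starts from a fresh generator plus the log, a crash between steps loses nothing: restarting the driver with the persisted log is indistinguishable from the next iteration of the loop.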

Yeah, that's what I meant by a tiny framework: you can add features to it and have it implement a DSL of sorts.

These alternatives just hide the implementation and make debugging, extending, and configuring unavailable.

Yeah, I completely agree that the implementation hiding makes me very uncomfortable with this sort of approach. It's the same with a lot of Vercel's work - it's very easy to do simple things on the happy path, but the more you stray from that path, the more complex things become.