There is a big "WARNING: This is a beta. Work in progress" message in https://github.com/cloudflare/workerd
Specifically, half of the services run locally and the other half require CF services. I mostly develop with Claude Code, and it often struggles to reproduce the CF environment locally, so I ended up creating a separate worker on CF just for local development.
Initially, the idea was to use CF for my side projects since it's much easier than K8S, but after wrestling with it for a month I decided it wasn't worth that much investment, and I moved back to K8S with FluxCD instead, even though that's overkill as well.
> There is a big "WARNING: This is a beta. Work in progress"
Ughhhh that is because nobody ever looks at the readme so it hasn't been updated basically since workerd was originally released. Sorry. I should really fix that.
> Specifically, half of the services operate locally, and the other half require CF services.
workerd itself is a runtime for Workers and Durable Objects, but it is not intended to provide implementations of other services like KV, D1, etc. Wrangler / Miniflare provide implementations of most of these for local testing purposes, but those aren't really meant for production.
But workers + DO alone is enough to do a whole lot of things...
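To make that concrete, here's the sort of thing that runs on workerd alone with no other services involved (just a sketch; the binding name and the RPC-style stub call assume a recent compatibility date):

```ts
import { DurableObject } from "cloudflare:workers";

// A tiny counter Durable Object: all state lives in the DO's own
// storage, no KV/D1/queues needed.
export class Counter extends DurableObject {
  async increment(): Promise<number> {
    const current = (await this.ctx.storage.get<number>("count")) ?? 0;
    await this.ctx.storage.put("count", current + 1);
    return current + 1;
  }
}

// Worker entrypoint: routes every request to a single DO instance.
export default {
  async fetch(
    request: Request,
    env: { COUNTER: DurableObjectNamespace<Counter> },
  ): Promise<Response> {
    const stub = env.COUNTER.get(env.COUNTER.idFromName("global"));
    return new Response(`count = ${await stub.increment()}`);
  },
};
```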
Thanks a ton for the quick response! I totally get that workerd isn't meant to be an emulator for every CF service, but the fact that I still need an external dependency for local development, and that the code I write can't be used outside the CF environment, makes me feel locked into the platform.
I'm mostly using terminal agents to write and deploy code. I made a silly mistake, not reviewing the code before merging it into main (side project, zero users): my Durable Object alarms got into an infinite loop, and I got a $400 bill in an hour. There was no way to set rate limits for the AI binding in Workers, and I didn't get any notification. I created a support ticket two months ago, and it still hasn't been answered.
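For what it's worth, the failure mode looked roughly like this: an alarm handler that makes a paid AI call and then unconditionally re-arms itself (a reconstructed sketch, not the actual code; the model name and binding are illustrative):

```ts
import { DurableObject } from "cloudflare:workers";

interface Env {
  AI: Ai; // Workers AI binding (type from @cloudflare/workers-types)
}

// Reconstructed sketch of the bug, not the real code.
export class Poller extends DurableObject<Env> {
  async alarm(): Promise<void> {
    // Every wake-up makes a paid Workers AI call...
    await this.env.AI.run("@cf/meta/llama-3.1-8b-instruct", {
      prompt: "summarize the latest items",
    });

    // ...and then unconditionally re-arms the alarm one second out,
    // so the loop never ends and nothing on the billing side stops it.
    await this.ctx.storage.setAlarm(Date.now() + 1000);
  }
}
```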
That was enough for me to move off CF as a long-time user (>10 years) and believer (CF is still one of my biggest stock holdings). In a world where AI writes most of the code, it's scary to be required to deploy to a cloud that gives you no way to set rate limits.
I learned the hard way that I need to use AI Gateway in this situation, but authentication is harder with it, and agents prefer embedded auth, which makes them pick the AI binding over AI Gateway. K8S isn't easy to maintain, but at least I can fully control the costs there without worrying about the price of experimentation.
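For anyone hitting the same wall: routing the binding's calls through AI Gateway is roughly a one-option change, and rate limits, caching, and logging then live on the gateway rather than in the Worker (a sketch, assuming the binding's gateway option works as documented; the gateway id is made up):

```ts
interface Env {
  AI: Ai; // Workers AI binding
}

export default {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // Same AI binding call, but routed through an AI Gateway
    // ("my-side-project-gateway" is a made-up id). Limits and
    // observability are then configured on the gateway itself.
    const answer = await env.AI.run(
      "@cf/meta/llama-3.1-8b-instruct",
      { prompt: "hello" },
      { gateway: { id: "my-side-project-gateway" } },
    );
    return Response.json(answer);
  },
};
```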