I really like the idea of it. My dream has always been to work with "types" first and foremost across any and all languages (yep it is a dream). And little tools like these are really nice to see push that boundary.
One piece of feedback - if you are truly comparing with "other" tools - you should be looking at gRPC and protoc plugins. I have used them to great effect for things like:
1. Generating wasm bindings for grpc services
2. Generating "data access layer" types so you can choose how an API proto is transformed to a data domain type and vice versa
3. MCP bindings for APIs
4. GraphQL/BFF bindings with multiple services
5. All of the above "across" languages.
The tooling is fantastic and extensible - if you are ok to start with a proto view of your world. It sounds weird, and like an N+1 problem, but once you are used to it it is surprisingly fun (ok, we may have different ideas of fun)
I totally agree - a proto-first approach to your types can pay dividends if you need to serialize over different wires.
This project admittedly was developed to solve a specific need in an existing codebase with a lot of existing types.
The codebase is also mostly maintained by the backend Golang engineers. Letting them use their native type system increases adoption and buy-in.
Totally - The other really nice thing about the Golang "type-system" ecosystem is the native AST in the stdlib. You can do so many amazing things from there. In fact, if you pledge your life to Go (which I think I have, at least for now), starting from Go and generating everything outwards is not necessarily a bad strategy.
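To make the "native AST in the stdlib" point concrete, here is a minimal sketch (the struct source is made up for illustration): parse Go source with `go/parser` and walk it with `go/ast` to list struct fields - the same starting point a Go -> TypeScript generator builds on.

```go
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

// Example input: a tiny API type, as a source string.
const src = `package api

type User struct {
	ID    int64  ` + "`json:\"id\"`" + `
	Email string ` + "`json:\"email\"`" + `
}`

// structFields parses Go source and collects the field names of
// every struct type it finds, in declaration order.
func structFields(source string) []string {
	fset := token.NewFileSet()
	file, err := parser.ParseFile(fset, "api.go", source, 0)
	if err != nil {
		panic(err)
	}
	var fields []string
	ast.Inspect(file, func(n ast.Node) bool {
		if st, ok := n.(*ast.StructType); ok {
			for _, f := range st.Fields.List {
				for _, name := range f.Names {
					fields = append(fields, name.Name)
				}
			}
		}
		return true
	})
	return fields
}

func main() {
	fmt.Println(structFields(src)) // → [ID Email]
}
```

From there, a code generator just maps each field's type expression and `json` tag to the target language's syntax.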
On this “types first across languages”, I’ve been hacking on something in that vein called Kumi (typed calculation schemas compiled to Ruby/JS). Tiny demo here https://kumi-play-web.fly.dev
Hot damn. I'd love to hear the origin story of this.
I am still thinking about a blog post on the whole journey, but I have never written one of those. In the meantime, here is a write-up I did in a reddit post: https://www.reddit.com/r/Compilers/s/osaVyUgvUf
GraphQL has been my holy grail for this. Easy to grok, easy to build types for various languages and keep everything in sync.
GraphQL is awesome if you are a frontend engineer. GraphQL is terrible if you are a backend engineer - https://bessey.dev/blog/2024/05/24/why-im-over-graphql
Eh, a lot of those problems have solutions, and some are more theoretical.
Yes, if you design your graph to allow something like `tags -> relatedTags -> relatedTags` ad infinitum you can let your clients build a performance problem. So why not have just a top-level `getRelatedTags(tagName)` query? Or a `getRelatedArticles(articleId)` query? Just because you can have things nested doesn't mean you need to have all the things nested.
The bulk of our REST API is backed by some Rails gem, which allows for a `fields` param. Those `fields` can be deep. As in you could write `getComments?id=1234&fields=user.orders`, but do you know what we do? Don't expose `orders` as a field on `User`.
> Don't expose `orders` as a field on `User`.
Why not use Open API then?
- I've never found any OpenAPI/Swagger generated client as easy to use as a GraphQL client
- GraphQL solves a lot of other problems when querying microservices
- GraphQL gives us PR-level alerts on whether any schema changes are safe
- GraphQL actually manages our front-end query caching/updating quite nicely
Type-first is cool. But I think I'll always aim to avoid gRPC, at least in part because grpc-web was so completely broken. I also have an instinctive aversion to binary formats. YMMV, just my PoV.
I’ve had a lot of success with grpc web. Had to patch a couple of things along the way. My biggest misgiving is thinking having bigints would be a good idea (it is not a good idea). Aside from that though, I’ve been happy with it. What felt broken to you?
One thing I still struggle with to this day is the float/long conversion between JSON and proto. It somehow works, and I still can't untangle the feeling of magic.
Its generated "TypeScript"... wasn't.
+1. A couple of things I really hate about proto: no generics/templates, and no composition of services or mixins (you do have composition in messages, but that feels very limited). Also the clunkiness around having to declare more things (try a repeated map field, or a map of repeated fields).
My comment about protos was just about the spec (and was separating the binary formats out as a different concern). But your concerns are pretty valid.
We've been using guts basically since it was published on GitHub (almost a year now), and it's so nice! We have a "custom POST-based JSON-RPC"-style API, so we have request and response bodies defined as Go types, and are generating the whole TS schema from them.
It basically lets you generate typescript types from your Go types. However, it's very customizable - you can post-process the AST. In our case, we have a custom generic Go type that indicates an optional (not nullable) field, and we can easily translate it to optional TS types (e.g. for sparse updates).
All in all, great tool/library, thanks for building it!
Disclaimer: I know a developer at Coder (not the author), who also recommended me guts back then, but am unaffiliated other than that.
Thrilled to see you got value out of it!
Why not use OpenAPI for this - https://ashishb.net/programming/openapi/? OpenAPI supports a lot of languages, not just 2.
Using Golang as the source is easier for a Golang backend team. Native types are always going to be the most convenient.
The theory when we came out with guts was that Golang's type system is very simple (guts predates Go generics). So how hard could it be to write a simple converter?
The first iteration of guts was written in an afternoon. Eventually I moved it to its own repo, mainly for personal use.
So honestly it really comes down to making it as easy as possible to write backend code. Using another spec that auto-generates the Golang would be an extra step we did not feel we needed.
> Using another spec that auto-generates the Golang would be an extra step we did not feel we needed.
No need to. You can start with a Golang server and generate the spec from it as well.
I prefer to use Zod or JSON Schema as the source of truth. Then I use QuickType [1] in the build process to generate code in different languages. This lets me share data structures. I mostly do this in Tauri apps to keep the same data shape in TypeScript and Rust. I also use it to define protocols, like for REST APIs.
Here are the advantages of this approach compared to using the guts lib:
- I get validation with clear error messages.
- It supports many languages out of the box.
- I don't need to maintain a custom library.
- JSON Schema is well supported in LLMs (for example, with structured output or vibe coding).
[1] https://quicktype.io/
I know there exist more generic typing tools. One idea I had was to make a "guts" to convert Golang -> Generic Typing -> All languages.
The reason to keep Golang as the source of truth is that these types originate from the backend. So when you are writing the backend, it is just the easiest place to write and maintain things. No obstacles.
Golang types are also very simple. I imagine almost all Go types can be converted because of that.
Nice, this looks interesting.
Somewhat related is a project we worked on within Golang community in Malawi: https://github.com/golang-malawi/geneveev
It supports converting types to Zod schemas and Dart classes. We never got around to TypeScript, and it would be cool to see if we could add support for guts.
If you output Zod you've basically solved Typescript, a Zod schema more or less is a Typescript type and can be explicitly made so with the `infer` capability
I remember someone telling me at a conference that they write all program types (where possible) in Protocol Buffers, since this guarantees that there's a reasonably efficient way of serializing and deserializing anything that they need to in basically any language/platform that they could realistically write software in.
I don't know if I would go that far, but I kind of find the idea interesting; if everything can be encoded and decoded identically, then the choice of language stops mattering so much.
There's also https://github.com/tkrajina/typescriptify-golang-structs
Which is used for example in the Go GUI framework Wails: https://github.com/wailsapp/wails/tree/v2.11.0/v2/internal/b...
Another one: https://github.com/gzuidhof/tygo
(Shoutout to Guido!)
That thing was too big to be called a library. Too big, too thick, too heavy, and too rough, it was more like a large hunk of code.
In the same vein: https://typespec.io/