Cool, that does sound like what I was after.

To expand a bit on this, what I'm thinking about is a modular monolith architecture. It's a pragmatic approach for when you don't need to split into separate (micro)services yet, but you still want things to be modular and splittable later if need be.

While things are still in the same monolith there's no point actually doing the serialise/deserialise step to enable integration between modules, so you can just have modules call each other's services directly. With the automatic DTOs, a service could be as simple as:

    def get_all() -> Iterable[ModelDTO]:
        for obj in Model.objects.as_dtos():
            yield obj

This same service could then be used from the router, which is where the extra serialisation step actually happens, and that step would of course work fine on the very same DTOs.
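A minimal sketch of that split, assuming Pydantic v2; `ModelDTO` and the in-memory "queryset" are illustrative stand-ins for the auto-generated DTO and `Model.objects.as_dtos()`:

```python
from typing import Iterable

from pydantic import BaseModel


class ModelDTO(BaseModel):
    # Hypothetical DTO, standing in for the auto-generated one.
    id: int
    name: str


# In-memory stand-in for Model.objects.as_dtos().
_FAKE_DB = [ModelDTO(id=1, name="a"), ModelDTO(id=2, name="b")]


def get_all() -> Iterable[ModelDTO]:
    # Service layer: other modules in the monolith call this directly
    # and receive live DTO objects, with no serialisation round-trip.
    yield from _FAKE_DB


def list_endpoint() -> list[dict]:
    # Router layer: the only place serialisation actually happens.
    return [dto.model_dump() for dto in get_all()]
```

In-process callers consume `get_all()` as-is; only the HTTP edge pays the `model_dump()` cost.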

I tend to find the architecture of a "clean" domain model (using e.g. attrs or dataclasses), SQLAlchemy in classic mapping mode for db persistence, and a separate serialisation layer (using e.g. cattrs) more elegant than shipping serialisation and persistence around with every object. But I know people struggle with such a rigid up-front architecture and most prefer Django, so I'm always looking for a pragmatic middle ground.
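To make the layering concrete, here is a minimal sketch using only stdlib dataclasses, with `dataclasses.asdict` standing in for cattrs' unstructure step (in a real project SQLAlchemy's classic mapper would separately map `User` to a table; the names are illustrative):

```python
from dataclasses import dataclass, asdict


@dataclass
class User:
    # "Clean" domain model: no ORM base class, no serialisation mixins.
    id: int
    email: str

    def display_name(self) -> str:
        # Domain logic lives on the plain object.
        return self.email.split("@")[0]


def to_wire(user: User) -> dict:
    # Serialisation lives at the edge of the system;
    # asdict stands in for cattrs.unstructure here.
    return asdict(user)
```

The point is that `User` itself knows nothing about databases or JSON; those concerns are bolted on at the boundaries.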

The way I see it, having everything as Pydantic makes this natural. Your DB model, your request schema, your response DTO are all BaseModels. Converting between them is just model_dump() and model_validate(), or plain inheritance. No adapters, no mapping layers. So when you need to split things apart, it's straightforward rather than painful.
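A small sketch of those conversions, assuming Pydantic v2 (the model names are illustrative; by default `model_validate` ignores extra keys, which is what drops the sensitive field here):

```python
from pydantic import BaseModel


class UserDB(BaseModel):
    # Persistence-shaped model.
    id: int
    email: str
    password_hash: str


class UserOut(BaseModel):
    # Response DTO: a subset of the DB model's fields.
    id: int
    email: str


db_user = UserDB(id=1, email="ada@example.com", password_hash="x")

# Converting between shapes is just model_dump() + model_validate();
# the unknown password_hash key is ignored by default.
user_out = UserOut.model_validate(db_user.model_dump())
```

No adapter classes or mapping layer: each boundary model is just another `BaseModel`, so splitting a module out later only means moving these classes, not rewriting the plumbing.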