Coupling these things together is short-sighted, but I get that in simple CRUD backends they really do sometimes stay the same for a long time. As long as there is an easy and obvious way out, then it's probably fine.

The big problem with ActiveRecord style ORMs, though, is the big ball of mud you end up with when anything from anywhere in the code can call `save()` on an object at any time to serialise and persist it in the db. It requires constant vigilance to prevent this, but many people don't even try in the first place.
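To make the hazard concrete, here's a toy sketch (all the names — `Order`, `apply_discount`, the dict-backed "db" — are made up for illustration) of how a function far away from the persistence layer can quietly write to the db:

```python
# Hypothetical ActiveRecord-style object: it knows how to persist itself.
class Order:
    _db: dict[int, int] = {}  # stands in for the real database table

    def __init__(self, id: int, total: int) -> None:
        self.id = id
        self.total = total

    def save(self) -> None:
        # the object serialises and persists itself, callable from anywhere
        Order._db[self.id] = self.total


def apply_discount(order: Order) -> int:
    """Lives deep in some unrelated module, yet nothing stops it writing."""
    order.total = order.total * 9 // 10
    order.save()  # a silent db write from a function that "just calculates"
    return order.total


order = Order(1, 100)
apply_discount(order)
print(Order._db[1])  # 90 -- the discount was quietly persisted
```

Multiply that by every helper function in the codebase and you get the ball of mud: there's no single place where writes happen.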

What would be ideal is to have an automatically generated DTO-type object that you can pass to other parts of the code that shouldn't be calling `save()` themselves. It could be a "real" object, or just annotated using a Protocol, such that calling `save()` would be a type error. Django models unfortunately don't work well with Protocols; mypy isn't able to detect them as supporting the Protocol (see: https://github.com/python/mypy/issues/5481). But having an automatically generated DTO could work too.
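For illustration, a rough sketch of the Protocol version (names hypothetical): consuming code is typed against a read-only Protocol that deliberately omits `save()`, so under mypy any attempt to call it is flagged, while at runtime any object with the right fields passes.

```python
from dataclasses import dataclass
from typing import Protocol, runtime_checkable


@runtime_checkable
class UserDTO(Protocol):
    """A read-only view: deliberately declares no save() or delete()."""
    id: int
    email: str


@dataclass(frozen=True)
class UserRow:
    """Stands in for data pulled out of an ORM model."""
    id: int
    email: str


def send_welcome(user: UserDTO) -> str:
    # user.save()  # mypy would reject this: "UserDTO" has no attribute "save"
    return f"hello {user.email}"


print(send_welcome(UserRow(1, "a@example.com")))  # hello a@example.com
print(isinstance(UserRow(1, "a@example.com"), UserDTO))  # True
```

(`issubclass` checks aren't allowed on data Protocols, but `isinstance` on a `@runtime_checkable` one works by checking attribute presence.)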

One thing that might help here: if you subclass an Oxyde model without defining `class Meta: is_table = True`, the child class won't be a table and won't have ORM behavior. So you can inherit the fields and validation but without `save()`/`delete()`. Not exactly a Protocol-based approach, but it gives you a clean read-only DTO derived from the same model.

Cool, that does sound like what I was after.

To expand a bit on this, what I'm thinking about is a modular monolith architecture. It's also a pragmatic approach where you don't need to split into separate (micro)services yet, but you still want things to be modular and able to split later if need be.

While things are still in the same monolith there's no point actually doing the serialise/deserialise step to enable integration between modules, so you can just have modules call each other's services directly. Having the automatic DTO means in a service you could just do something like:

    def get_all() -> Iterable[ModelDTO]:
        for obj in Model.objects.as_dtos():
            yield obj

This same service could then be used in the router, which would perform the extra serialisation step; that would, of course, still work fine on the very same DTOs.
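Roughly what I mean, as a self-contained sketch: stdlib dataclasses stand in for the generated DTO, a list stands in for the db, and `json` stands in for the router's serialiser (all names hypothetical).

```python
import json
from dataclasses import asdict, dataclass
from typing import Iterable


@dataclass(frozen=True)
class ModelDTO:
    """Stand-in for the automatically generated DTO."""
    id: int
    name: str


# stand-in for the database, so the sketch runs on its own
_ROWS = [ModelDTO(1, "first"), ModelDTO(2, "second")]


def get_all() -> Iterable[ModelDTO]:
    """The service: other modules call this directly, no serialisation."""
    yield from _ROWS


def list_endpoint() -> str:
    """The router: the only place that pays the serialise cost."""
    return json.dumps([asdict(dto) for dto in get_all()])


print(list_endpoint())  # [{"id": 1, "name": "first"}, {"id": 2, "name": "second"}]
```

In-process callers consume `get_all()` directly; only the HTTP boundary ever touches `list_endpoint()`, so splitting the module out later just means moving that boundary.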

I tend to find the combination of a "clean" domain model (using e.g. attrs or dataclasses), SQLAlchemy for db persistence (in classic mapping mode), and separate serialisation (using e.g. cattrs) a more elegant architecture than shipping serialisation and persistence around with every object. But I know people struggle with such a rigid up-front architecture and most prefer Django, so I'm always looking for a pragmatic middle ground.
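To sketch that separation with just the stdlib (dataclasses standing in for attrs, hand-written sqlite3 calls for SQLAlchemy's classic mapping, and `asdict` for cattrs — the table and names are hypothetical):

```python
import json
import sqlite3
from dataclasses import asdict, dataclass


# The "clean" domain model: no base class, no save(), no ORM awareness.
@dataclass(frozen=True)
class Customer:
    id: int
    name: str


# Persistence lives at the edge, not on the object.
def save_customer(conn: sqlite3.Connection, c: Customer) -> None:
    conn.execute("INSERT INTO customer VALUES (?, ?)", (c.id, c.name))


def load_customer(conn: sqlite3.Connection, id: int) -> Customer:
    row = conn.execute(
        "SELECT id, name FROM customer WHERE id = ?", (id,)
    ).fetchone()
    return Customer(*row)


# Serialisation is likewise a separate concern (cattrs does this generically).
def to_json(c: Customer) -> str:
    return json.dumps(asdict(c))


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
save_customer(conn, Customer(1, "Ada"))
print(to_json(load_customer(conn, 1)))  # {"id": 1, "name": "Ada"}
```

The domain object can't persist itself even if it wanted to, which is the whole point — but yes, it's more up-front ceremony than `Model.save()`.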

The way I see it, having everything as Pydantic makes this natural. Your DB model, your request schema, and your response DTO are all BaseModels. Converting between them is just model_dump() and model_validate(), or plain inheritance. No adapters, no mapping layers. So when you need to split things apart, it's straightforward rather than painful.
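For example (hypothetical models, assuming Pydantic v2's `model_dump()`/`model_validate()`; extra fields are dropped because Pydantic ignores undeclared keys by default):

```python
from pydantic import BaseModel


class UserDB(BaseModel):
    """Stands in for the table-backed model."""
    id: int
    email: str
    password_hash: str


class UserOut(BaseModel):
    """The response DTO: a subset of the DB fields."""
    id: int
    email: str


db_user = UserDB(id=1, email="a@example.com", password_hash="x")

# Converting between the layers is just dump + validate;
# password_hash is silently dropped since UserOut doesn't declare it.
out = UserOut.model_validate(db_user.model_dump())
print(out.model_dump())  # {'id': 1, 'email': 'a@example.com'}
```

When the modules later become services, the same `model_dump()` output goes over the wire as JSON and `model_validate()` runs on the other side.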