Oh that's interesting: it uses two similarly shaped Git repos with one being transparently encrypted.

I take it that this way you can deduplicate backups.

My own backup procedure uses Git too, but I don't deduplicate. For every Git repo, my backup procedure (which runs from an OCI container and mounts the Git repo volume read-only) does the following:

    - git clone the entire repo
    - clean any mess
    - run git fsck (you never know, you know)
    - tar + compress
    - encrypt
    - add a cryptographic checksum to the resulting encrypted file
    - deduplicate (but this only works if no changes were made to the Git repo since the last backup)
    - send if not a duplicate
    - on the remote end: verify the cryptographic hash, decrypt, uncompress/untar, re-run git fsck (you never know, you know)

This way I'm reasonably sure the backup is not corrupted.
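
In rough shell terms it looks something like the sketch below. The paths, GPG recipient and remote host are placeholders, the "clean any mess" step is left out, and the dedup check here simply fingerprints the refs before the tar/encrypt steps, which is just one way to detect "no change since the last backup":

    #!/bin/sh
    # Rough sketch only: REPO, REMOTE, KEY and STATE are placeholders.
    set -eu

    REPO=/repos/myproject            # mounted read-only in the container
    REMOTE=backup@backup.example     # hypothetical remote host
    KEY=backup@example.org           # hypothetical GPG recipient
    STATE=/var/lib/backup/last-refs  # fingerprint of the previous backup
    WORK=$(mktemp -d)
    OUT="$WORK/myproject-$(date +%Y%m%d-%H%M%S).tar.gz"

    git clone --mirror "$REPO" "$WORK/clone.git"   # clone the entire repo
    git -C "$WORK/clone.git" fsck --full           # you never know, you know

    # Deduplicate: skip the backup if the refs are identical to last time.
    FPRINT=$(git -C "$WORK/clone.git" for-each-ref | sha256sum | awk '{print $1}')
    if [ -f "$STATE" ] && [ "$FPRINT" = "$(cat "$STATE")" ]; then
        echo "no change since last backup, skipping"
        exit 0
    fi

    tar -C "$WORK" -czf "$OUT" clone.git                         # tar + compress
    gpg --batch --encrypt --recipient "$KEY" --output "$OUT.gpg" "$OUT"
    sha256sum "$OUT.gpg" | awk '{print $1}' > "$OUT.gpg.sha256"  # cryptographic checksum

    scp "$OUT.gpg" "$OUT.gpg.sha256" "$REMOTE:backups/"          # send
    echo "$FPRINT" > "$STATE"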

FWIW the cryptographic hash allowed me to catch a bit flip, once.
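
On the receiving end the check is just a digest comparison before anything else; a mismatch there is what surfaces that kind of flipped bit. Something like this, with the same placeholder names as above:

    #!/bin/sh
    # Remote-end sketch: verify, decrypt, untar, re-run git fsck.
    set -eu

    F=backups/myproject-20240101-000000.tar.gz.gpg   # hypothetical file name
    WORK=$(mktemp -d)

    # Recompute the digest and compare with the one shipped alongside.
    [ "$(sha256sum "$F" | awk '{print $1}')" = "$(cat "$F.sha256")" ] \
        || { echo "checksum mismatch" >&2; exit 1; }

    gpg --batch --decrypt --output "$WORK/backup.tar.gz" "$F"   # decrypt
    tar -C "$WORK" -xzf "$WORK/backup.tar.gz"                   # uncompress/untar
    git -C "$WORK/clone.git" fsck --full                        # you never know, you know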

So my files are encrypted too, but my main issue is that any change to a Git repo means a new full backup has to be made.

I can see how something automated like myba could help.