You're screwed either way. If you don't update, your container has a ton of known security issues; if you do, the build is no longer reproducible. Reproducibility is neat and has some useful security benefits, but it becomes a non-goal once the container is more than a month old; a day might even be a better maximum age.

Why is there a need for a package manager inside a container at all? Aren't they supposed to be minimal?

Build your container/vm image elsewhere and deploy updates as entirely new images or snapshots or whatever you want.
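One way to sketch that "build elsewhere, deploy immutable images" flow is a CI job that tags every build with the commit SHA, so a deploy is always a new image rather than a mutated one. This is only an illustration, assuming GitHub Actions; the registry path and action versions are placeholders:

```yaml
# .github/workflows/build.yml -- illustrative; registry/repo names are assumptions
name: build-image
on:
  push:
    branches: [main]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/build-push-action@v5
        with:
          push: true
          # every build produces a new, immutable tag keyed to the commit
          tags: ghcr.io/${{ github.repository }}:${{ github.sha }}
```

Deploying then means pointing the runtime at the new tag, never patching a running container.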

Personally I prefer Buildroot, and I treat the VM as just another target for embedded OS images.

So if I have a Docker container that needs a handful of packages, how would you handle it?

I handle it by using a slim Debian or Ubuntu base image, then using apt to install those packages along with their dependencies.
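A minimal sketch of that approach; the package names and version strings below are placeholders, not recommendations. Pinning exact versions keeps the apt step predictable between rebuilds:

```dockerfile
# Illustrative only; package names and versions are placeholders
FROM debian:bookworm-slim

# Pinning versions makes the install step repeatable between rebuilds;
# removing the apt lists afterwards keeps the layer small.
RUN apt-get update \
 && apt-get install -y --no-install-recommends \
      curl=7.88.1-10+deb12u5 \
      ca-certificates \
 && rm -rf /var/lib/apt/lists/*
```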

For anything easy, like a single basic binary, I use the most minimal image available. But as soon as setup and maintenance get even a little bit annoying, I switch to apt and a nightly rebuild of the image.
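The nightly rebuild can be driven by a scheduled CI trigger; a sketch of the relevant fragment for GitHub Actions (the cron time is arbitrary):

```yaml
# Illustrative workflow fragment for a nightly image rebuild
on:
  schedule:
    - cron: '0 3 * * *'   # rebuild every night at 03:00 UTC
  workflow_dispatch: {}    # allow manual rebuilds too
```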

I update my Docker containers regularly, but I do it in a reproducible, auditable, predictable way.

Could you explain how you achieve this?

If you are on GitHub/GitLab, Renovate bot is a good option for automating dependency updates via PRs while still keeping pinned versions in your source.
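A minimal `renovate.json` sketch, assuming Renovate's recommended preset; as I understand it, the `docker:pinDigests` preset pins base images to immutable digests, with Renovate opening PRs to bump them:

```json
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": [
    "config:recommended",
    "docker:pinDigests"
  ]
}
```

That way the Dockerfile stays fully pinned in source, and updates arrive as reviewable diffs instead of silent base-image drift.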

Chainguard, Docker Inc’s DHI etc. There’s a whole industry for this.