I really know almost nothing about complex analysis, but this sure feels like what physicists call observational entropy applied to mathematics: what counts as "order" in ℂ depends on the resolution of your observational apparatus.
The algebraic conception, with its wild automorphisms, exhibits a kind of multiplicative chaos: small changes in perspective (which automorphism you apply) cascade into radically different views of the structure. With choice, any transcendental number can be sent to any other by some automorphism, so the bare field structure cannot even distinguish e from π. Meanwhile, the analytic/smooth conception, by fixing the topology, tames this chaos into something with only two symmetries: the identity and complex conjugation. The topology acts as a damping mechanism, converting multiplicative sensitivity into additive stability.
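For concreteness, the "only two symmetries" bit is a standard field-theory dichotomy; here's a minimal LaTeX sketch of it (my summary, not anything from the linked post):

    \[
      \operatorname{Aut}_{\mathrm{cont}}(\mathbb{C}) \;=\; \{\, \mathrm{id},\ z \mapsto \bar{z} \,\},
      \qquad
      \lvert \operatorname{Aut}(\mathbb{C}) \rvert \;=\; 2^{2^{\aleph_0}} \ \text{(assuming AC)}.
    \]
    % Requiring continuity (i.e. fixing the topology) collapses the automorphism
    % group to the identity and conjugation; without it, Zorn's lemma extends any
    % bijection of transcendence bases over Q to a "wild" automorphism of C.

So fixing the topology really is the damping step: it kills every symmetry except those two.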
I'll just add that if transformers are implementing a renormalization group flow, then the models' failure on the automorphism question is predictable: systems trained on compressed representations of mathematical knowledge will default to the conception with the lowest "synchronization" cost, i.e. the one most commonly used in practice.
https://www.symmetrybroken.com/transformer-as-renormalizatio...