A language that uses the fewest punctuation tokens should be the safest from most categories of hallucination, and should give each context window the most usable space for vector manipulation heading into self-attention before the model suffers from "vector-clouded judgment" in an overcrowded latent space.
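
To make the token-budget part of that claim concrete, here is a minimal sketch that counts how many tokens a tokenizer spends purely on punctuation. It assumes OpenAI's tiktoken library with the cl100k_base encoding, and the two sample sentences are invented for illustration; a different tokenizer or corpus would give different numbers.

```python
# Rough sketch: how much of the token budget goes to punctuation alone?
# Assumes tiktoken with the cl100k_base encoding; sample text is hypothetical.
import string
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

samples = {
    "punctuation-heavy": 'He said, "Wait -- really?!" (Then, pausing, he left.)',
    "punctuation-light": "He said wait really then pausing he left",
}

for label, text in samples.items():
    tokens = enc.encode(text)
    # A token counts as punctuation-only if it decodes to nothing but
    # punctuation characters (after stripping whitespace).
    punct_only = [
        t for t in tokens
        if enc.decode([t]).strip()
        and all(ch in string.punctuation for ch in enc.decode([t]).strip())
    ]
    print(f"{label}: {len(tokens)} tokens total, {len(punct_only)} on punctuation")
```

The difference between the two counts is a rough proxy for how much context-window space punctuation conventions consume before any semantic content reaches self-attention.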