Tangentially: do LLMs pick up new languages that have little internet discussion and that develop rapidly after knowledge-cutoff dates? To naysayers, AIs are supposed to generate hands with six fingers and ossify language and framework versions.

Maybe if it's completely distinct. Otherwise definitely not, unless, maybe, the model is fine-tuned. I had a discussion about this with my dad, whose work involves developing in a non-mainstream Smalltalk dialect, where it doesn't work at all.

I have so far not been able to get a major LLM to generate fully functional Typst code, no matter how much context I put into it. The models do not currently seem to understand Typst's concept of modes (code, markup, and math), and in code mode especially they suffer from heavy hallucination of syntax and/or semantics.
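For anyone unfamiliar, here's a rough sketch of what those three modes look like (my understanding of current Typst syntax; exact details may vary by version):

```typst
// Markup mode is the default: plain text with light formatting.
Hello, *world*!

// A leading `#` switches into code mode (a small scripting language):
#let count = 3
#for i in range(count) [item #i, ]

// Dollar signs enter math mode:
$ x^2 + y^2 = z^2 $
```

The tricky part is that the modes nest inside each other (markup inside code via `[...]`, code inside markup via `#`), which is exactly the kind of context-switching the models seem to trip over.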

I suppose it also depends on the specific LLM; the output of a free/low-cost model will likely be very different from that of a $200/month o1-pro.