> Without the transformer architecture they invented by funding basic research, there would be no modern LLMs.

One could argue that transformers are nothing without the attention mechanism, which was not invented at Google; attention was introduced by Bahdanau, Cho, and Bengio in 2014 for neural machine translation, before "Attention Is All You Need".