Sure. This was the paper[0]. Here are a few more you might find interesting: Google's Transformer GAN[1] (it isn't a transformer at all resolutions) and Diffusion-GAN[2], which is a hybrid architecture. Remember that the GAN process can technically sit on top of any underlying architecture; arguably, some of the training steps in LLMs are GANs. I think this one[3] is interesting in a similar respect. Before Papers With Code went down, StyleSAN[4] was the SOTA on FFHQ; IIRC it doesn't change the architecture, so it should work on top of the other architectures too (it adds compute costs, but I think only at training time; it's been a while since I read it).
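To make the architecture-agnostic point concrete: the adversarial setup only assumes some generator and some discriminator, not any particular network. Here's a minimal PyTorch sketch of one GAN training step. Everything in it (the names G, D, latent_dim, and the tiny MLPs) is illustrative and not from any of the papers above; swap in a CNN, a transformer, or a StyleGAN-style network with compatible shapes and the loop is unchanged.

    # Sketch: the GAN objective only needs two differentiable functions,
    # a generator G and a discriminator D. Any architecture can fill
    # either role. All names here are illustrative placeholders.
    import torch
    import torch.nn.functional as F

    def gan_step(G, D, opt_G, opt_D, real, latent_dim):
        batch = real.size(0)
        ones = torch.ones(batch, 1, device=real.device)
        zeros = torch.zeros(batch, 1, device=real.device)

        # Discriminator update: push D(real) toward 1, D(fake) toward 0.
        z = torch.randn(batch, latent_dim, device=real.device)
        fake = G(z).detach()  # detach so this step doesn't backprop into G
        loss_D = (F.binary_cross_entropy_with_logits(D(real), ones)
                  + F.binary_cross_entropy_with_logits(D(fake), zeros))
        opt_D.zero_grad()
        loss_D.backward()
        opt_D.step()

        # Generator update: push D(G(z)) toward 1 (fool the discriminator).
        z = torch.randn(batch, latent_dim, device=real.device)
        loss_G = F.binary_cross_entropy_with_logits(D(G(z)), ones)
        opt_G.zero_grad()
        loss_G.backward()
        opt_G.step()
        return loss_D.item(), loss_G.item()

    # Tiny MLPs standing in for whatever architecture you actually use:
    G = torch.nn.Sequential(torch.nn.Linear(64, 128), torch.nn.ReLU(),
                            torch.nn.Linear(128, 32))
    D = torch.nn.Sequential(torch.nn.Linear(32, 128), torch.nn.ReLU(),
                            torch.nn.Linear(128, 1))
    opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
    opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
    real = torch.randn(16, 32)  # stand-in for a batch of real data
    print(gan_step(G, D, opt_G, opt_D, real, latent_dim=64))

Nothing in gan_step inspects the modules, which is why the adversarial training process transfers across architectures.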
[0] https://arxiv.org/abs/2211.05770
[1] https://arxiv.org/abs/2106.07631
[2] https://arxiv.org/abs/2206.02262
Thank you for taking the time to answer my questions.