Maybe read up on how transformers, their encoders and decoders, and the attention mechanism work?

https://arxiv.org/abs/1706.03762
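
To give a quick taste of what the paper covers, here is a minimal NumPy sketch of the scaled dot-product attention it defines, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. This is just the single-head core, not the full multi-head or masked variants:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in Vaswani et al. (2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                      # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)         # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over the keys
    return weights @ V                                   # weighted sum of values

# Toy self-attention: 3 tokens with embedding dimension 4, Q = K = V.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4): one output vector per token
```

Once that equation clicks, the encoder/decoder stacks in the paper are mostly this operation repeated with learned projections.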