Exactly: the 1M-token context window is largely marketing, since relatively little training was actually done at that sequence length.