That `llms-full.txt` is huge. Wouldn't that completely fuck up your context window, since you'd have to include it in every request? Even with prompt caching, it still takes up the same number of tokens, no?
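
For a rough sense of what I mean, here's a back-of-the-envelope check (the URL is a placeholder, and I'm using tiktoken's `cl100k_base` as a stand-in tokenizer, so actual counts will vary by model):

```python
# Rough token count for an llms-full.txt file.
# URL is a placeholder; cl100k_base is a stand-in tokenizer, not any specific model's.
import urllib.request

import tiktoken

url = "https://example.com/llms-full.txt"  # placeholder, not a real endpoint
text = urllib.request.urlopen(url).read().decode("utf-8")

enc = tiktoken.get_encoding("cl100k_base")
tokens = len(enc.encode(text))

print(f"{len(text):,} chars ≈ {tokens:,} tokens")
# Cached or not, those tokens still occupy the context window on every request.
```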