I am currently mass-translating millions of records with short descriptions. Somehow tokens are being consumed extremely fast. I have 3 Max memberships, and all 3 of them hit the 5-hour limit in about 5 to 10 minutes. I still don't understand why this is happening.

Unless you're clearing the context between descriptions or processing them in parallel with subagents, your context window grows with every short description added to it. Each new request re-sends the entire accumulated context, so the per-request token cost keeps climbing, which is why you're hitting those hour limits so fast.
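A rough back-of-the-envelope sketch (the record count and tokens-per-record numbers are just made up for illustration) of why this matters: with an ever-growing context, total input tokens grow roughly quadratically with the number of records, while clearing the context per record keeps it linear.

```python
# Illustrative arithmetic only -- assumed sizes, not real API accounting.
# Each request re-sends whatever is already in the context plus the new record.
def tokens_used(num_records: int, tokens_per_record: int, clear_context: bool) -> int:
    total = 0
    context = 0
    for _ in range(num_records):
        total += context + tokens_per_record
        if clear_context:
            context = 0                       # fresh session per record
        else:
            context += tokens_per_record      # history keeps accumulating
    return total

# 1,000 records of ~50 tokens each (hypothetical numbers):
print(tokens_used(1000, 50, clear_context=True))   # 50,000 input tokens
print(tokens_used(1000, 50, clear_context=False))  # 25,025,000 input tokens
```

Same workload, roughly a 500x difference in input tokens, which lines up with limits evaporating in minutes.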