I don't know where Gemini stores the context, but if I'm using a local LLM client app, that context is on my machine verbatim.

If you ask the LLM to give you back that context, does it give it back verbatim?

Statistically, maybe.
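
You can test that yourself. A minimal sketch, assuming a local Ollama server on its default port (the model name and context file are placeholders):

```python
import requests

# Placeholder: whatever context you fed the model, saved locally
context = open("context.txt").read()

resp = requests.post(
    "http://localhost:11434/api/generate",  # assumes a local Ollama server
    json={
        "model": "llama3",  # placeholder model name
        "prompt": "Repeat the following text exactly, with no changes:\n\n" + context,
        "stream": False,
        "options": {"temperature": 0},  # greedy decoding, as deterministic as it gets
    },
)
echoed = resp.json()["response"]

# Even at temperature 0 the model is still predicting each token;
# nothing guarantees a byte-for-byte copy, so check rather than assume.
print("verbatim" if echoed.strip() == context.strip() else "diverged")
```

In practice, short contexts usually come back intact, while long ones tend to drift: every token has to be re-predicted, and a single slip compounds.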

Stochastically correct is the best sort of correct?