so runtime use cases such as LLMs are supported; in the Chat example you can interact with a real LLM!