>The winners will not be the ones who maintain the biggest vector databases, but the ones who design the smartest agents to traverse abundant context and connect meaning across documents.

So if one were building, say, a memory system for an AI chatbot, how would you save all the data related to a user? Mother's name, favorite meals, allergies? If not a vector database like Pinecone, then what? Just a big .txt file per user?

That is what Claude Sonnet 4.5 is doing: https://youtu.be/pidnIHdA1Y8?si=GqNEYBFyF-3Klh4-

Any kind of database is far too efficient for an LLM; just take all your markdown and turn it into less markdown.

Exactly. Just a markdown file per user. Anthropic recommends that.
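For what it's worth, here's a minimal sketch of what "a markdown file per user" can look like in practice. Everything here is illustrative — the directory name, section headings, and helper names are my assumptions, not anything Anthropic specifies:

```python
from pathlib import Path

# One markdown file per user; the whole file is dumped into the prompt
# as context on each turn. No embeddings, no vector index.
MEMORY_DIR = Path("user_memories")  # assumed location, adjust to taste

def remember(user_id: str, section: str, fact: str) -> None:
    """Append a fact as a bullet under a markdown heading in the user's file."""
    MEMORY_DIR.mkdir(exist_ok=True)
    path = MEMORY_DIR / f"{user_id}.md"
    text = path.read_text() if path.exists() else f"# Memory for {user_id}\n"
    heading = f"## {section}"
    if heading not in text:
        text += f"\n{heading}\n"
    lines = text.splitlines()
    # Find the end of this section (next "## " heading or end of file)
    # and insert the new fact there, so related facts stay grouped.
    idx = lines.index(heading)
    end = idx + 1
    while end < len(lines) and not lines[end].startswith("## "):
        end += 1
    lines.insert(end, f"- {fact}")
    path.write_text("\n".join(lines) + "\n")

def recall(user_id: str) -> str:
    """Return the whole file; it goes straight into the system prompt."""
    path = MEMORY_DIR / f"{user_id}.md"
    return path.read_text() if path.exists() else ""

remember("alice", "Family", "Mother's name is Carol")
remember("alice", "Food", "Allergic to peanuts")
print(recall("alice"))
```

The upside of this shape is that the model can read and edit its own memory with ordinary file tools; the obvious limit is that the file must fit in the context window, at which point you need summarization or smarter traversal rather than a bigger index.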