I started to suspect a few paragraphs in that this post was written with a lot of AI assistance, but I continued to read to the end because the content was interesting to me. Here's one point that resonated in particular:

"There’s a missing primitive here: a secure, portable memory layer that works across apps, usable by the user, not locked inside the provider. No one’s nailed it yet. One panelist said if he weren’t building his current startup, this would be his next one."

Thanks. I used AI, but aren't we all? I thought the point of AI was to make us more productive. But that was only after I came up with the questions for the speakers, wrote a draft of the blog post, and had the panelists read it and add comments before I published. It seems I'm getting a lot of hate here for it, but I'm happy with the number of engineers and founders sharing feedback that it was useful to them. I'm not forcing anyone to read my content; if people want to spend their time hating on it, that's their choice.

[deleted]

Isn’t that markdown files?

I was thinking about consumer-facing AI products, where md files controlled by the user presumably wouldn’t fly.

I find it annoying that, when prompting ChatGPT, Claude, Gemini, etc. on personal tasks through their chat interfaces, I have to provide the same context about myself and my job again and again to the different providers.

The memory functions of the individual providers now reduce some of that repetition, but it would be nice to have a portable personal-memory context (under my control, of course) that is shared with and updated semiautomatically by any AI provider I interact with.
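Roughly what I have in mind, as a toy sketch: a plain file on my machine that gets prepended as system context no matter which provider I'm talking to. All the names here (`memory.md`, `load_memory`, `remember`, `send_to_provider`) are made up for illustration, not any existing tool.

```python
# Toy sketch of a user-controlled, portable memory file shared across providers.
# File name, paths, and function names are illustrative, not a real product.
from pathlib import Path

MEMORY_FILE = Path.home() / ".ai-memory" / "memory.md"  # hypothetical location

def load_memory() -> str:
    """Read the user's portable memory file, if it exists."""
    return MEMORY_FILE.read_text() if MEMORY_FILE.exists() else ""

def build_messages(user_prompt: str) -> list[dict]:
    """Prepend the shared memory as system context for any chat-style API."""
    return [
        {"role": "system", "content": "Known context about the user:\n" + load_memory()},
        {"role": "user", "content": user_prompt},
    ]

def remember(fact: str) -> None:
    """Append a new fact; a real tool might have the model propose facts and the user approve them."""
    MEMORY_FILE.parent.mkdir(parents=True, exist_ok=True)
    with MEMORY_FILE.open("a") as f:
        f.write(f"- {fact}\n")

# send_to_provider stands in for whichever SDK you happen to use (OpenAI, Anthropic, Gemini, ...);
# the point is that build_messages() is the same regardless of provider.
# reply = send_to_provider(build_messages("Draft an email to my team about the launch."))
```

The file never leaves my machine; each provider only ever sees the rendered system prompt, and I can read and edit what they know about me at any time.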

As isoprophlex suggests in a sister comment, though, that would be hard to monetize.

Brb going to squat openmemory.org

Edit: Aaaand it’s gone.

Sheesh how ever will you monetize a text file

Will someone please think of the MRR!