I skimmed it; for me it was this: https://github.com/barddoo/zedis/blob/87321b04224b2e2e857b67...
There seems to be a fair amount of stigma around using LLMs, and many people who use them are uncomfortable talking about it.
It's a weird world. Depending on who is at the wheel, whether an LLM is used _can_ make no difference.
But the problem is, you can have no idea what you're doing and still make something that feels like it was carefully hand-crafted - a really great project - except there are hidden problems or outright lies about functionality, often to the surprise of the author. They weren't trying to mislead; they just didn't take the time to check whether it actually did everything the LLM said it did.
Agree. I used it mostly for getting ideas - the memory management, for example: Gemini listed so many different ways of managing memory that I didn't even know existed. I knew I wanted to pre-allocate memory like TigerBeetle does, so the hybrid approach was perfect. Essentially it has three different allocators: a huge one for the cache, an arena allocator for context and intermediate state like pub/sub, and a temporary one for requests. It was 100% Gemini's idea.
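For anyone curious what that split can look like in Zig, here is a minimal sketch assuming the layout described above: one fixed, pre-allocated buffer for the cache (the TigerBeetle-style "grab everything up front" idea) plus two arenas. The names (`Allocators`, `cache_bytes`, `resetRequest`) are illustrative, not the actual zedis code:

```zig
const std = @import("std");

// Hypothetical sketch of a three-allocator layout; names are
// illustrative, not taken from zedis.
pub const Allocators = struct {
    // 1. One big region pre-allocated up front for the cache itself,
    //    served through a FixedBufferAllocator so it can never grow
    //    past its budget.
    cache_buffer: []u8,
    cache_fba: std.heap.FixedBufferAllocator,

    // 2. An arena for longer-lived intermediate state (e.g. pub/sub),
    //    freed in bulk rather than per-object.
    context_arena: std.heap.ArenaAllocator,

    // 3. A per-request arena, wiped after every request.
    request_arena: std.heap.ArenaAllocator,

    pub fn init(backing: std.mem.Allocator, cache_bytes: usize) !Allocators {
        const buf = try backing.alloc(u8, cache_bytes);
        return .{
            .cache_buffer = buf,
            .cache_fba = std.heap.FixedBufferAllocator.init(buf),
            .context_arena = std.heap.ArenaAllocator.init(backing),
            .request_arena = std.heap.ArenaAllocator.init(backing),
        };
    }

    pub fn resetRequest(self: *Allocators) void {
        // Free all per-request allocations in one shot, keep the pages.
        _ = self.request_arena.reset(.retain_capacity);
    }

    pub fn deinit(self: *Allocators, backing: std.mem.Allocator) void {
        self.request_arena.deinit();
        self.context_arena.deinit();
        backing.free(self.cache_buffer);
    }
};
```

Each subsystem would then take `cache_fba.allocator()`, `context_arena.allocator()`, or `request_arena.allocator()` instead of a global allocator, which is what makes the memory budgets enforceable.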
3 months ago I was vibe coding an idea, and for some reason (and luck) I went to check a less important part of the code and saw that the LLM had swapped out the env variable for an API key and hard-coded the key explicitly in the code. That was scary. I'm glad I caught it before the PR and shit like that.
I generally do not think it is a bad thing. I use LLMs too, and I know what I am doing, so I am not sure it would even qualify as vibe coding.
I think using LLMs is not inherently a bad thing - it only becomes one if you have absolutely no clue what you are doing. But even then, if the project is usable and works as advertised, why not? *shrugs*
As for the link, that is exactly the code that caught my eye, besides the README.md itself. The LRU eviction is what GPT (and possibly other LLMs) always comes up with in my experience, so he could at least have had it properly implemented. :D
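For what it's worth, a proper LRU is not much code: a hash map for O(1) lookup plus a doubly linked list ordered by recency. A minimal Zig sketch (illustrative only, not the zedis implementation; it borrows key/value slices instead of duplicating them and assumes capacity > 0):

```zig
const std = @import("std");

// Minimal LRU cache sketch: hash map for O(1) lookup, doubly linked
// list for recency order. Keys and values are borrowed slices.
fn LruCache(comptime capacity: usize) type {
    return struct {
        const Self = @This();

        const Node = struct {
            key: []const u8,
            value: []const u8,
            prev: ?*Node = null,
            next: ?*Node = null,
        };

        allocator: std.mem.Allocator,
        map: std.StringHashMap(*Node),
        head: ?*Node = null, // most recently used
        tail: ?*Node = null, // least recently used, i.e. next to evict
        len: usize = 0,

        pub fn init(allocator: std.mem.Allocator) Self {
            return .{
                .allocator = allocator,
                .map = std.StringHashMap(*Node).init(allocator),
            };
        }

        pub fn deinit(self: *Self) void {
            var it = self.head;
            while (it) |n| {
                it = n.next;
                self.allocator.destroy(n);
            }
            self.map.deinit();
        }

        fn detach(self: *Self, node: *Node) void {
            if (node.prev) |p| p.next = node.next else self.head = node.next;
            if (node.next) |n| n.prev = node.prev else self.tail = node.prev;
            node.prev = null;
            node.next = null;
        }

        fn pushFront(self: *Self, node: *Node) void {
            node.next = self.head;
            if (self.head) |h| h.prev = node;
            self.head = node;
            if (self.tail == null) self.tail = node;
        }

        pub fn get(self: *Self, key: []const u8) ?[]const u8 {
            const node = self.map.get(key) orelse return null;
            self.detach(node);
            self.pushFront(node); // bump to most recently used
            return node.value;
        }

        pub fn put(self: *Self, key: []const u8, value: []const u8) !void {
            if (self.map.get(key)) |node| {
                node.value = value;
                self.detach(node);
                self.pushFront(node);
                return;
            }
            if (self.len == capacity) {
                const lru = self.tail.?; // evict the least recently used
                self.detach(lru);
                _ = self.map.remove(lru.key);
                self.allocator.destroy(lru);
                self.len -= 1;
            }
            const node = try self.allocator.create(Node);
            node.* = .{ .key = key, .value = value };
            try self.map.put(key, node);
            self.pushFront(node);
            self.len += 1;
        }
    };
}
```

The only invariant to maintain is that every get and put moves the touched node to the front of the list, so the tail is always the eviction candidate.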
Edit: I am glad the author confirmed the use of an LLM. :P