I love genetic algorithms and find using LLMs as part of them super compelling. I always find the fitness functions to be the most difficult part: the algorithm naturally exploits any little gap you leave for cheating. The best part is not needing backpropagation to solve a problem. However, that's also the worst part, since the solutions are all just one level above a random walk. The LLM augmentation really helps give it a gradient and intelligence beyond random chaos. Love the idea of it being applied to hardware.
> I love genetic algorithms
Is this a "genetic algorithm" though? Beyond "select the best performing run", it doesn't seem to involve crossover, mutation, etc. at all, just "select best", which makes it seem less like a genetic algorithm to me. Might just be me being confused about what counts as a "genetic algorithm" vs. not, though; I won't claim to be an expert in the field exactly.
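For contrast, a textbook genetic algorithm layers crossover and mutation on top of selection. Here's a minimal sketch on the toy "one-max" problem (maximize the number of 1s in a bit string); all names and parameters are illustrative, not from the project being discussed:

```python
import random

TARGET_LEN = 20     # length of each bit-string individual
POP_SIZE = 30
GENERATIONS = 50
MUTATION_RATE = 0.02

def fitness(individual):
    # one-max: fitness is simply the count of 1 bits
    return sum(individual)

def tournament_select(pop, k=3):
    # selection: the fittest of k randomly sampled individuals wins
    return max(random.sample(pop, k), key=fitness)

def crossover(a, b):
    # single-point crossover: splice two parents at a random cut point
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(individual):
    # mutation: flip each bit with small probability
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in individual]

def evolve():
    pop = [[random.randint(0, 1) for _ in range(TARGET_LEN)]
           for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop = [mutate(crossover(tournament_select(pop),
                                tournament_select(pop)))
               for _ in range(POP_SIZE)]
    return max(pop, key=fitness)

best = evolve()
print(fitness(best))
```

Strip out `crossover` and `mutate` and you're left with pure selection, which is closer to "keep the best run" than to a GA.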
I think there's a lack of verbiage for "optimization that doesn't involve gradient descent".
I think there might be a conflation of "problems a genetic algorithm could be used for" vs. "using a genetic algorithm to solve the problem".