Agree. It's a context/memory issue. Soon LLMs will have a 10M context window and they won't need to search. Most codebases are less than 10M tokens.
I don't know if that's true once dependencies are factored in. A quick back-of-the-envelope count (sketch below) would settle it for a given repo.
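For what it's worth, here's a rough sketch to compare the two counts on a local checkout. It's my own heuristic, not a benchmark: ~4 characters per token, a hand-picked set of source extensions, and the dependency directory names (node_modules, vendor, etc.) are just illustrative guesses.

```python
# Rough estimate of a repo's token count, with and without vendored dependencies.
# Assumptions: ~4 chars per token, extension list and dep-dir names are illustrative.
import os
import sys

CODE_EXTS = {".py", ".js", ".ts", ".go", ".java", ".c", ".cpp", ".h", ".rs", ".rb"}
DEP_DIRS = {"node_modules", "vendor", ".venv", "site-packages"}  # illustrative only

def estimate_tokens(root: str, include_deps: bool = False) -> int:
    total_chars = 0
    for dirpath, dirnames, filenames in os.walk(root):
        if not include_deps:
            # Skip directories that typically hold third-party code
            dirnames[:] = [d for d in dirnames if d not in DEP_DIRS]
        for name in filenames:
            if os.path.splitext(name)[1] in CODE_EXTS:
                try:
                    with open(os.path.join(dirpath, name),
                              encoding="utf-8", errors="ignore") as f:
                        total_chars += len(f.read())
                except OSError:
                    pass  # unreadable file, ignore
    return total_chars // 4  # crude chars-to-tokens conversion

if __name__ == "__main__":
    root = sys.argv[1] if len(sys.argv) > 1 else "."
    print("excluding deps:", estimate_tokens(root))
    print("including deps:", estimate_tokens(root, include_deps=True))
```

On a lot of projects the "including deps" number dwarfs the project's own code, which is the point: whether 10M tokens is enough depends heavily on whether the model needs the dependencies in context too.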