I'd take tests over docs but that's a false dilemma.
What does the (Copilot) /tests command actually do, compared to a prompt like "Generate tests for #symbolname, run them, and modify the function under test (FUT), re-running the tests in a loop until they pass"?
Documentation is probably key to the Django web framework's success, for example.
Resources useful for learning to write great docs: https://news.ycombinator.com/item?id=23945815
"Ask HN: Tools to generate coverage of user documentation for code" https://news.ycombinator.com/item?id=30758645
Effective context limits (regardless of the advertised hard limits) are a showstopper IMO: the models completely fail assignments on codebases of roughly 30k LoC or more.
You're better off feeding them a few files to work with, in isolation, if you can.
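One way to sanity-check that "a few files" actually fit: estimate token counts before pasting them in. A rough sketch using the common ~4-characters-per-token heuristic (the file selection and the heuristic itself are assumptions, not a real tokenizer):

```python
# Estimate whether a hand-picked set of source snippets fits in a model's
# context window, using the rough rule of thumb of ~4 chars per token.
def estimate_tokens(text: str) -> int:
    """Crude token estimate; real tokenizers vary by model."""
    return len(text) // 4

def fits_in_context(snippets: dict, budget_tokens: int) -> bool:
    """Sum the estimates across all selected files and compare to budget."""
    total = sum(estimate_tokens(body) for body in snippets.values())
    return total <= budget_tokens

# Hypothetical selection: file name -> file contents already read in.
selected = {
    "models.py": "class User:\n    pass\n" * 50,
    "views.py": "def index(request):\n    return None\n" * 50,
}
```

For real use you'd read the files from disk and use the model's actual tokenizer, but even this crude check avoids silently overflowing the window.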