What method do you propose to compare the performance of different programming languages?
While still far from perfect, benchmarks that compare actual, practical usage, like the number of requests served by a web server, are much better indicators imo.
https://web-frameworks-benchmark.netlify.app/result
Yeah, everyone says that and then stops there, which is absolutely useless. Benchmarks are at least an objective way to measure something, and there are no "correct" benchmarks. Unless you have better metrics or another way to prove things, please stop repeating these meaningless words.
I've literally pasted benchmarks measuring an actual job (webserver requests per second) in a comment below.
But besides, the critique isn't meaningless even without offering a better alternative: if your benchmark measures things that are trivial in any language (like stack-based operations) but ignores things that actually differ meaningfully (like handling of heap objects), then criticizing that approach is a perfectly fair and valid objection.
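To make the stack-vs-heap point concrete, here's a minimal sketch (the workload shapes and sizes are my own illustrative choices, not from any benchmark suite): two micro-benchmarks that stress very different parts of a runtime. A suite built only from the first kind tells you little about how a language handles allocation and garbage collection.

```python
import timeit

def stack_heavy():
    # Arithmetic on local variables: cheap in almost any compiled
    # language, so it barely differentiates implementations.
    total = 0
    for i in range(10_000):
        total += i * i
    return total

def heap_heavy():
    # Churns many short-lived heap objects, stressing the allocator
    # and garbage collector, which is where languages actually diverge.
    objs = [{"id": i, "payload": [i] * 4} for i in range(10_000)]
    return len(objs)

t_stack = timeit.timeit(stack_heavy, number=100)
t_heap = timeit.timeit(heap_heavy, number=100)
print(f"stack-ish: {t_stack:.3f}s  heap-ish: {t_heap:.3f}s")
```

The absolute numbers are beside the point; the point is that a benchmark dominated by the first loop can rank languages in a way that says nothing about workloads dominated by the second.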