Ran the tests again with some more files, this time decompressing the PDFs in advance. I picked widely available PDFs to make the experiment reproducible.
| file | raw (bytes) | zstd (% of raw) | brotli (% of raw) |
|---|---|---|---|
| gawk.pdf | 8,068,092 | 1,437,529 (17.8%) | 1,376,106 (17.1%) |
| shannon.pdf | 335,009 | 68,739 (20.5%) | 65,978 (19.6%) |
| attention.pdf | 24,742,418 | 367,367 (1.4%) | 362,578 (1.4%) |
| learnopengl.pdf | 253,041,425 | 37,756,229 (14.9%) | 35,223,532 (13.9%) |
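For reference, the percentage columns are just compressed size divided by raw size. A quick sanity check of the ratios (sizes in bytes copied from the table; the last digit may differ slightly from the table due to rounding):

```python
# Compression ratio = compressed size / raw size, for each codec.
# Sizes are copied from the benchmark table above.
sizes = {
    "gawk.pdf": (8_068_092, 1_437_529, 1_376_106),
    "shannon.pdf": (335_009, 68_739, 65_978),
    "attention.pdf": (24_742_418, 367_367, 362_578),
    "learnopengl.pdf": (253_041_425, 37_756_229, 35_223_532),
}
for name, (raw, zstd, brotli) in sizes.items():
    print(f"{name}: zstd {zstd / raw:.2%}, brotli {brotli / raw:.2%}")
```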
For learnopengl.pdf I also tested decompression performance, since it is such a large file, and got the following (less surprising) results using `perf stat -r 5`:

zstd:   0.4532 +- 0.0216 seconds time elapsed ( +- 4.77% )
brotli: 0.7641 +- 0.0242 seconds time elapsed ( +- 3.17% )
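To put the two timings in relation (mean elapsed times taken from the perf output above):

```python
# Mean elapsed times in seconds from the perf stat runs above.
zstd_s, brotli_s = 0.4532, 0.7641
print(f"brotli takes {brotli_s / zstd_s:.2f}x as long to decompress")      # ~1.69x
print(f"i.e. it decompresses at {zstd_s / brotli_s:.0%} of zstd's speed")  # ~59%
```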
The conclusion seems consistent with what brotli's authors have said: brotli achieves slightly better compression, at the cost of decompressing at a little over half of zstd's speed.