Typst looks really promising, especially because it has common templates (like the IEEE one) that produce output identical to LaTeX.

My biggest gripe with LaTeX is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn't work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.

I still have no idea what was going on, and most makefiles out there seem to be obscenely complex and simply parse the output and run the same commands again if a certain set of errors occurred.

The definition of insanity is doing the same thing twice and expecting different results.

By coincidence, this is the basic way to compile LaTeX.

TBF, Typst internally also recompiles a number of times until a fixpoint is reached; however, it is designed to limit which parts can depend on previous iterations and to reuse previous results for parts that definitely didn't change.

My makefiles ran it four times, I think. I still preferred it to Word.

Anything is preferable to Microsoft Word.

Last time I checked, flipping a coin twice gave different results.

Did you flip the coin in the exact same way? Probably not.

If you flip it an infinite number of times, you will get the same results anyway. Call when you're finished :P

Isn't getting different results the norm in programming anyway? Developers usually don't make the effort to ensure idempotency and make their builds reproducible.

Normally, if you compile the same code twice on the same machine, you'll get the same result, even if it's not truly reproducible across machines or large gaps in time. And differences between machines or across time are usually small enough that they don't impact the observed behavior of the code, especially if you pin your dependencies.

However, with LaTeX, the output of the first run is often an input to the second run, so you get notably different results if you only compile it once vs. compiling twice. When I last wrote LaTeX about ten years ago, I usually encountered this with page numbers and tables of contents, since the page numbers couldn't be determined until the layout was complete. So the first pass would get the bulk of the layout and content in place, and then the second pass would do it all again, but this time with real page numbers. You would never expect to see something like this in a modern compiler, at least not in a way that's visible to the user.

(That said, it's been ten years, and I never compiled anything as long or complex as a PhD thesis, so I could be wrong about why you have to compile twice.)

I wrote my PhD (physics) in LaTeX and I indeed needed to compile twice (at least) to have a correct DVI file.

That was 25 years ago, though, and apparently this part has not changed.

That said, I was at least sure that I would get an excellent result and not end up like my friend who used MS Word: one day his file was "locked", he could not add a single letter to it, and he had to retype everything.

Compared to that my concern about where a figure would land in the final document was nothing.

That's not the definition of insanity, that's the definition of practicing.

Almost every compiler is a multipass compiler.

But in this case the passes are manual!

so?

Dunno - to me it feels like the LaTeX compiler should just run whatever it needs to, however many times, until the output is done/usable, like basically all other compilers?

> My biggest gripe with LaTeX is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn't work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.

I always feel like I'm doing something wrong when I have to deal with LaTeX and lose hours fighting with the tooling. Even with a clean install on a new machine, it feels like something fails to work.

The last time I had to change a document, I went through what felt like 100 different search results from people with the same issue before I found one with an actual resolution, and even that was completely obscure. I tried to help out by reposting the answer to a couple of other locations, but I was so exhausted that I swore off LaTeX for any future work unless absolutely unavoidable.

I still dislike the idea that my document formatting and layout system really needs a build environment. Because let's be real, almost nobody actually needs it for genuine typesetting. I think the problem with LaTeX is that it's too flexible.

It reminds me a little bit of the problem of Linux distributions. Linux is supposed to be a system with the bazaar model instead of the cathedral model. Except what you actually end up with is that each distribution becomes its own cathedral, because building a whole system now requires major decisions to be made. LaTeX class files feel like the same thing.

Many people use LaTeX via Overleaf (a website, cf. https://overleaf.com) rather than installing it locally.

That also solves the problem of having to locally install various extension packages or fonts - everything is already there, and after writing a paper you may submit it directly to some conferences or journals from that web GUI instead of having to email it or upload it to a third site.

Absolutely not a perfect solution, and maybe you're already using it within your Makefiles, but for anyone who doesn't yet know about it there's Latexmk[1] which is supposed to automate all of this hassle. I think at least on Debian it's included with texlive-full. In addition it has some nice flags like `-outdir` which lets you send all the crazy LaTeX intermediate build/aux files to a separate directory that's easy to gitignore.
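
For anyone who hasn't used it, the invocation is roughly like this (just a sketch; `main.tex` and the `build` directory name are placeholders):

    # build a PDF, rerunning pdflatex/bibtex as often as needed,
    # and keep all the intermediate files in a separate directory
    latexmk -pdf -interaction=nonstopmode -outdir=build main.tex

    # or watch the sources and rebuild on every change
    latexmk -pdf -pvc -outdir=build main.tex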

https://mgeier.github.io/latexmk.html#running-latexmk

LaTeX needs several passes to compile because it was designed with minicomputers of the 80s in mind, i.e. tiny memory constraints.

Latexmk is one way to address this problem. A good IDE like AUCTeX can also figure out how many times the compiler should be invoked.

Good IDEs will also provide other invaluable assistance, like SyncTeX (jumping from source to exact point at PDF, and back).

> LaTeX needs several passes to compile because it was designed with minicomputers of the 80s in mind, i.e. tiny memory constraints.

That's certainly part of it, but any typesetting program will need multiple passes to properly handle tables of contents—you can't know a section's page number until you've compiled everything before that section (including the table of contents), but adding a new section to the contents could push everything ahead by another page. The only unique thing about LaTeX here is that it directly exposes these multiple passes to the user.

As long as it's deterministic it should be fine: just run the commands in a makefile. It doesn't sound like that's the case, though.

I think I used to understand this, but it's been a long time since I had to write any serious LaTeX, so I don't anymore. I found this snippet in my personal _quick-build-latex_ script from over a decade ago:

    if [ -z "$(find . -name '*.bib' -print)" ]; then
        # Just two runs, to cover TOC building, etc.
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    else
        # With a bibliography: one run to collect citations, then bibtex,
        # then two more runs so references and the TOC settle.
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        bibtex "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    fi
So I guess if you're using bibtex, then you need to run it three times, but otherwise only twice?

This is to say... I'm glad those days are gone.

There can still be cases where a fourth run is necessary, theoretically a fifth run. There are even cases where you get into an infinite loop, for example if you use the vref package. It will "cleverly" replace references to things like "figure 3 on the next page" or "figure 3 on page 8". When the reference is expanded, it might cause the figure to move to the following page, which means the reference is then contracted to "on page 8", which means the figure moves back to the original place again, in which case the reference must be updated, and so on ...

LaTeX will usually tell you by including a warning in the output ("LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right."), which no one reads because the output is so verbose. Not having that warning is not a guarantee that the result is stable either, so our Makefile actually compares the PDF files, minus variable bytes like timestamps, to know whether the build has converged.
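
Roughly this idea, as a hypothetical sketch (the real Makefile is more involved, and the exact fields filtered out here are only illustrative):

    # rerun pdflatex until the PDF stops changing, ignoring the embedded
    # timestamps and document ID that differ between otherwise identical runs
    prev=""
    for pass in 1 2 3 4 5; do
        pdflatex -interaction=nonstopmode main.tex
        cur=$(grep -av '/CreationDate\|/ModDate\|/ID' main.pdf | md5sum)
        if [ "$cur" = "$prev" ]; then break; fi
        prev="$cur"
    done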

Why didn't you use latexmk? It deals with the recompiling for you.

Just use Tectonic nowadays for compiling LaTeX source. It automatically handles these cases of compiling multiple times.
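
Basic usage is a single command (sketch; `main.tex` is a placeholder), and it fetches any missing packages on demand:

    # keeps rerunning the engine until cross-references stabilize
    tectonic main.tex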

+1 for tectonic, it's miles better than the alternatives for every single case, in my experience. Development seems to have slowed down lately, which is a shame.

One of the things that really interests me about Typst is that the compile process seems much more deterministic and modern.

What do you mean by tooling? I've used LaTeX for decades to write books and papers and the combination with Emacs was flawless. The only major change for me was the transition from Bibtex to Biblatex.