From https://www.latex-project.org/about/:

"LaTeX is not a word processor! Instead, LaTeX encourages authors not to worry too much about the appearance of their documents but to concentrate on getting the right content."

IMO, the only people who use LaTeX are people who are willing to trade the convenience and productivity of a sane document authoring format for the warm and fuzzy feeling you get from using an outdated piece of typesetting software that is a) hard to configure, b) hard to use and c) produces output for the least useful reading platform available (paged PDFs).

And the pronunciation is stupid.

I tried Typst a year ago, and I found it really nice to use compared to LaTeX. I even managed to make (or modify, I don't remember) a small module to customize boxes, something I would not have even thought of trying with LaTeX.

I don't use latex anymore and I don't have a use case for typst, so I'm not currently using it, but I follow the advancements from time to time, and I have to disagree with the advisor.

Typst is perfectly fine for replacing LaTeX in almost any place that doesn't require the LaTeX source. The other catch is that the ecosystem is much smaller, so if you need a specific extension that does not exist or is not trivial to implement, you'll be out of luck and stuck with LaTeX.

Typst looks really promising, especially because it has common templates (like the IEEE one) which produce output identical to LaTeX's.

My biggest gripe with latex is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn’t work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.

I still have no idea what was going on, and most makefiles out there seem to be obscenely complex and simply parse the output and run the same commands again if a certain set of errors occurred.

The definition of insanity is doing the same thing twice and expecting different results.

By coincidence, this is the basic way to compile latex.

TBF, Typst internally also recompiles a bunch of times until a fixpoint is reached; however, it is designed to limit which parts can depend on the previous iterations and to reuse previous results for parts that definitely didn't change.

Aren’t different results the norm in programming anyway? Developers usually don’t make the effort to ensure idempotency and make builds reproducible.

My makefiles ran it 4 times, I think. I still preferred it to Word.

Anything is preferable to Microsoft Word.

Last time I checked, flipping a coin twice gave different results.

If you flip it an infinite number of times, you will get the same results anyway. Call when you're finished :P

Did you flip the coin in the exact same way? Probably not.

> My biggest gripe with latex is the tooling. During my last paper, I ended up using a makefile which would usually work. When it didn’t work, running it twice would fix the issue. In the rarest cases, I had to run `git clean -xdf` and the next run would work.

I always feel like I’m doing something wrong when I have to deal with LaTeX and lose hours to fighting with the tooling. Even with a clean install on a new machine it feels like something fails to work.

The last time I had to change a document I had to go through what felt like 100 different search results of people with the same issue before I found one where there was a resolution and it was completely obscure. I tried to help out by reposting the answer to a couple other locations, but I was so exhausted that I swore off LaTeX for any future work unless absolutely unavoidable.

Absolutely not a perfect solution, and maybe you're already using it within your Makefiles, but for anyone who doesn't yet know about it, there's Latexmk[1], which is supposed to automate all of this hassle. I think at least on Debian it's included with texlive-full. In addition, it has some nice flags like `-outdir`, which lets you send all the crazy LaTeX intermediate build/aux files to a separate directory that's easy to gitignore.

https://mgeier.github.io/latexmk.html#running-latexmk
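
For anyone who wants to try it, a minimal invocation looks something like this (the file name and build directory are just placeholders); latexmk works out on its own how many pdflatex/bibtex runs are needed:

    # build into ./build, rerunning pdflatex/bibtex as often as required
    latexmk -pdf -interaction=nonstopmode -outdir=build main.tex

    # remove the intermediate .aux/.log/... files again, keeping the PDF
    latexmk -c -outdir=build main.tex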

LaTeX needs several passes to compile because it was designed with minicomputers of the 80s in mind, i.e. tiny memory constraints.

Latexmk is one way to address this problem. A good IDE like AUCTeX can also figure out how many times the compiler should be invoked.

Good IDEs will also provide other invaluable assistance, like SyncTeX (jumping from source to exact point at PDF, and back).

> LaTeX needs several passes to compile because it was designed with minicomputers of the 80s in mind, i.e. tiny memory constraints.

That's certainly part of it, but any typesetting program will need multiple passes to properly handle tables of contents—you can't know a section's page number until you've compiled everything before that section (including the table of contents), but adding a new section to the contents could push everything ahead by another page. The only unique thing about LaTeX here is that it directly exposes these multiple passes to the user.

I think I used to understand this, but it's been a long time since I had to write any serious LaTeX, so I don't anymore. I found this snippet in my personal _quick-build-latex_ script from over a decade ago:

    if [ -z "$(find . -name "*.bib" -print0)" ]; then
        # Just two runs, to cover TOC building, etc.
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    else
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        bibtex "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE" && \
        pdflatex -interaction=nonstopmode "$SOURCE_FILE"
    fi
So I guess if you're using bibtex, then you need to run it three times, but otherwise only twice?

This is to say... I'm glad those days are gone.

There can still be cases where a fourth run is necessary, theoretically a fifth run. There are even cases where you get into an infinite loop, for example if you use the vref package. It will "cleverly" replace references to things like "figure 3 on the next page" or "figure 3 on page 8". When the reference is expanded, it might cause the figure to move to the following page, which means the reference is then contracted to "on page 8", which means the figure moves back to the original place again, in which case the reference must be updated, and so on ...

LaTeX will usually tell you by including a warning in the output ("LaTeX Warning: Label(s) may have changed. Rerun to get cross-references right."), which no one reads because the output is so verbose. Not having that warning is not a guarantee that the result is now stable either, so our Makefile actually compares the PDF files, minus variable bytes like timestamps, to know whether the build converged.
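
A minimal sketch of that convergence idea (simplified, not our actual Makefile): pinning SOURCE_DATE_EPOCH makes modern TeX engines emit reproducible timestamps, so you can just compare checksums between runs instead of stripping variable bytes.

    export SOURCE_DATE_EPOCH=0          # reproducible timestamps in the PDF
    prev=""
    for i in 1 2 3 4 5; do              # cap the reruns to avoid vref-style loops
        pdflatex -interaction=nonstopmode thesis.tex
        cur=$(sha256sum thesis.pdf | cut -d' ' -f1)
        [ "$cur" = "$prev" ] && break   # no change since last run: converged
        prev="$cur"
    done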

Just use Tectonic nowadays for compiling LaTeX source. It automatically handles these cases of compiling multiple times.

Why didn't you use latexmk? It deals with the recompiling for you.

One of the things that really interests me about Typst is that the compile process seems much more deterministic and modern

I'm sticking with LaTeX, not as a fetish, but because journals/conferences still do not accept e.g. Typst. Will they ever? I don't know; it depends on their willingness to integrate it into their toolchains, I guess.

I sincerely doubt they will: most journals in pure math still do not accept LuaTeX; just think about that.

There are already at least two publishers which accept Typst. So that "ever" part is already covered. But most still don't accept Typst and LaTeX is usually mandatory if the sources are required.

which ones?

IJIMAI (https://typst.app/blog/2025/typst-at-ijimai) and JUTI (https://forum.typst.app/t/juti-call-for-papers-best-paper-aw...).

Admittedly, not the most renowned or most known journals but you have to start somewhere.

That is for sure my biggest concern with typst. I wrote a tool that can convert from typst to latex for final submissions, but it is a bit sketchy and at the moment won't handle math very well. https://gitlab.com/theZoq2/ttt

I'm not familiar with how journal submissions work, but don't you simply submit a pdf at the end? Does it matter what engine you used to render it?

Not only do you need to use LaTeX, but you need to use the journal's class file. Anything else will get rejected.

You normally submit a LaTeX or Word document, and the publisher does the final typesetting. Even in computer science, where people often spend a lot of time tweaking the typesetting, the pdf generated by the authors is essentially a preview. There are often visible differences between it and the publisher's version.

Yeah this is one of the craziest things about the scientific publishing industry.

Journals justify their fees by claiming it's for typesetting, but all they are really doing is adding extra work to nitpick bibliography formats and so on (see the comments in this article about sentence case). Nobody cares about that. I don't think anyone even reads "journals" any more (except maybe Nature/Science etc.). They mostly just read individual papers, and then there's no consistency to maintain.

In a sane world journals would accept PDFs. They would check that the format roughly matches what they expect but not insist on doing the typesetting themselves.

Oh well, maybe one day.

I would note arXiv requires the source as well, and having the source is what is enabling the HTML experiments they're doing.

On consistency, what the journals provide is some level of QA (how much is a function of field and journal, rather than what is charged), and the template is the journal's brand, so both the authors and journals benefit from the style (I can tell the difference between the different (all similar quality) journals in my field at a glance by the style).

It's also worth noting that there's a whole bunch of metadata that needs to be collected (whether you agree with it or not, funders require it), so a PDF isn't going to cut it here either.

In case anyone hasn't seen some typst source and renders, here's a few documents I whipped up:

First is based on Todd C. Miller's Latex Resume Template:

- https://typst.app/project/rDUHMUg5vxl4jQ5q2grGPY

Second is an Enduring Power of Attorney:

- https://typst.app/project/rs9ZgGLhgM7iPvFs7PQv5O

Third is a will:

- https://typst.app/project/r45dVk6MpLjsoXMvxkTxsE

I’m gradually moving my work over to Typst and it’s been a breath of fresh air. Compiles very quickly.

Perhaps the hardest part has been relearning the syntax for math notation; Typst has some interesting opinions in this space.

I hate a lot of things about LaTeX (also wrote several theses in it, as well as research articles), but the math syntax definitely wasn't one of them. Why on earth would they change it?

Typst looks good, but I'm actually going back to LaTeX but paired with Claude Code in VS Code.

I took a hiatus from LaTeX (got my PhD more than a decade ago). I used to know TikZ commands by heart, and I used to write sophisticated preambles (lots of \newcommand). I still remember LaTeX math notation (it's in my muscle memory, and it's used everywhere including in Markdown), but I'd forgotten all the other stuff.

Claude Code, amazingly, knows all that other stuff. I just tell it what I want and it gets 95% of the way there in 1-2 shots.

Not only that, it can figure out the error messages. The biggest pain in the neck with LaTeX is figuring out what went wrong. With Claude, that's not such a big issue.

I don't know about Claude, but when it comes to LaTeX IDE, I will always recommend TeXStudio over everything else. It handles all the annoying problems of LaTeX setup and compilation, and it provides a discoverable interface with classic-style menus (words, it has words instead of inexplicable little icons!) for various common tasks.

I say that as someone who uses a tricked-out Vim for my own LaTeX workflow, and VS Code for several programming languages.

Claude and the like are a huge problem for new languages that want to do new things. It was bad enough when a LaTeX replacement had to compete with forty-ish years of package development time. Now they also have to compete with the millions of lines of existing code LLMs have hoovered up.

I've done some simple Typst programming via Claude, and it worked fine. I expected it to be ignorant of Typst but that was not the case.

One of the best things about Typst is that most tasks are very simple. Compared to the reams of LaTeX BS I was replacing, building my book with Typst is monumentally simpler.

Which is good, because we don't want to deal with inferior solutions to typesetting that pop up every few years.

A slight bias in favor of the status quo might be acceptable or even desired. However current LLMs strongly favor traditional languages and are unable to comprehend even modern language features not part of their base training set.

Consider the counterfactual of LLMs being available in the 1990s, trained mainly on the world's C code. Perhaps we would still be exclusively writing C today, because new languages' code could not have been synthesized as easily or conveniently. It's not just about Typst or typesetting specifically but programming language design in general, and the fact that improvements are becoming much harder to push through.

> Perhaps we would still be exclusively writing C today, because new languages' code could not have been synthesized as easily or conveniently.

I'm not actually sure that would be a bad thing? All the reasons that immediately come to mind to move away from C have to do with ergonomics and safety, the latter largely being a product of the former IMO. If an LLM can ingest my entire codebase and do 90% of the work to get me to the changes I need doesn't that obviate the majority of the motivation to change languages in the first place?

Have you tried Typst at least once? Those are big words, but it is light-years better than LaTeX.

Great for code re-use but I agree, terrible for anything new.

One relatively optimistic prediction would be that a few will accept Typst, but LaTeX export from Typst will gradually get more mature, until we end up with a charade where more people use other frontends like Quarto or Typst that output to LaTeX, rather than LaTeX itself, for submission to journals - in certain fields. Somewhere after that time, Typst will break through and be generally accepted itself.

mitex is an option [1]. There's no way I could learn another notation, at this point.

[1] https://typst.app/universe/package/mitex/

I'll only say that learning typst is easier than learning LaTeX.

It also has first-class support for Unicode (as does LaTeX via some packages), which, if combined with a suitable keyboard layout, makes both writing and reading math source code infinitely more pleasant :)

[deleted]

I have only two peeves with typst.

1. They should have carried forward the LaTeX syntax as-is for math, instead of getting rid of the backslash escape sequence, etc.

2. There is no way to share a variable across a file's scope - so you can't have a setting that is shared across files - not even with state variables.

Other than this, typst is solid, and with the neovim editor and tinymist lsp, is great to write with.

Regarding point 1: I'm so glad they didn't keep the math syntax, there's finally progress in math text input! E.g. we can now write

  $
    ZZ &= { ..., -1, 0, 1, ... } \
    QQ &= { p/q : p, q in ZZ }
  $

  $
    a = cases(
      0 & quad x <= 0,
      mat(1, 2; 3, 4) vec(x, y) & quad x > 0
    )
  $
instead of

  \begin{align*}
    \mathbb{Z} &= \{ \dots, -1, 0, 1, \dots \}, \\
    \mathbb{Q} &= \left\{ \frac{p}{q} : p, q \in \mathbb{Z} \right\}
  \end{align*}

  \[
    a = \begin{cases}
      0 & \quad x \leq 0, \\
      \begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix}
      \begin{pmatrix}x\\y\end{pmatrix} & \quad x > 0
    \end{cases}
  \]
Regarding point 2: you can put your settings in a file `settings.typ` and import it from multiple files.
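
A minimal sketch of that (file and variable names are just illustrative):

    // settings.typ
    #let accent = rgb("#1f6feb")
    #let body-font = "New Computer Modern"

    // chapter1.typ
    #import "settings.typ": accent, body-font
    #set text(font: body-font)
    #text(fill: accent)[Chapter one]
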
[deleted]

You could use unicode-math?

> […] was a friend telling me his LaTeX thesis took 90 seconds to compile towards the end

Sure, but in order to iterate you don't have to compile the whole document; you can compile just the chapter you are working on by structuring the document with \include, as sketched below.
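
A minimal sketch of that setup (chapter file names are placeholders); \includeonly in the preamble limits compilation to the chapter you are iterating on, while page numbers and cross-references from the last full build are kept via the .aux files:

    \documentclass{book}
    \includeonly{chapters/results}   % comment out to build the whole thesis
    \begin{document}
    \include{chapters/intro}
    \include{chapters/methods}
    \include{chapters/results}
    \end{document}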

Glad to hear Typst has people doing serious work with it.

I’ve been able to avoid LaTeX. At uni, I went for org-mode -> LaTeX, which was OK except when my .emacs file was filling up with LaTeX stuff to make random stuff work. To be honest, that means I probably can’t even compile it again if I wanted to.

Typst has been awesome (I always ran into LaTeX being horribly inconsistent when doing layout stuff) when I’ve used it. Hope it continues.

Typst really does feel refreshing in that sense… way less fiddly and a lot more predictable, especially for layout tweaks

> But in the Bibtex file it is very common for the titles to appear in their original title case form

That is common because they are following the rules about how to steer capitalisation when using bib(la)tex (see the sketch after this list):

- If the entry is in English, and the style demands title case, output as is

- If the entry is in English, and the style demands sentence case, convert to sentence-case and output

- If the entry is not in English, output as is
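
For completeness, the usual way to steer this from the .bib side is brace protection; a minimal, made-up entry:

    @article{doe2024,
      author  = {Jane Doe},
      title   = {A Survey of {LaTeX} Tooling and {Markov}-Chain Models},
      journal = {Journal of Illustrative Examples},
      year    = {2024},
    }

Words wrapped in extra braces keep their capitalisation even when the style converts the title to sentence case; everything else (beyond the first word) gets lowercased.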

What deters me from Typst is that latex math syntax is nowadays ubiquitous. You write $x^2=1$ and it renders in many places. Learning a new syntax for math expressions is simply not in my interests.

The threat is literally the opposite: it is very freeing to be able to write Typst syntax because it's quicker and easier to write. But then you're cursed by the fact that every other place now uses LaTeX math syntax by convention.

To be fair $x^2=1$ literally works in typst.

It is very fast to learn the Typst math syntax. It is easy and intuitive and usually less verbose than LaTeX. It should not be a difficult thing to learn for most people.

Tangential, do LLMs pick up new languages that have less internet discussion and which develop rapidly after knowledge cutoff dates? To naysayers, AIs are supposed to generate hands with 6 fingers and ossify language and framework versions.

Maybe if it's completely distinct. Otherwise definitely no, unless the model is fine-tuned, maybe. I had a discussion about it with my dad, whose work is developing in a non-mainstream Smalltalk dialect, where it doesn't work at all.

I suppose it also depends on the specific LLM; the output of a free/low-cost model will likely be very different from a $200/month o1-pro.

I have so far not been able to get a major LLM to generate fully functional Typst code, no matter how much context I try to put into it. The models do not seem to currently understand Typst's concept of modes (code, markup and math), and especially in code mode they suffer from heavy hallucination of syntax and/or semantics.

In thirty years LaTeX will still be open source and probably will be maintained.

Typst appears to be a mix of open source and closed source; the general model here tends to be neglecting the open source part and implementing critical features in the closed source portion. Which is to say, it's unlikely to live beyond the company itself.

Typst is fully open source, licensed under the Apache-2.0 license. It is not a mix of any kind. Don't confuse the web app with the Typst engine. The web app is a service similar to Overleaf, and that is closed source. It is not mandatory; you can use Typst fully on your local machine. The team tries to make money and cover development costs with the web app. But the actual typesetting engine is fully open source and free.

Yep. I wrote an academic paper a few months ago in Typst. I used the VS Code extension for live previewing. All totally opensource and it works great.

https://github.com/overleaf/overleaf hm ?

Overleaf isn't fully open source either, since they have a paid tier with features which are not present in this repo. Inline commenting for example, is a Server Pro -only feature.

The Typst web app, which is similar to Overleaf, is closed source. Overleaf itself is open source, yes.

Read your own link before posting. While the parent was wrong about it being fully closed source, the Overleaf editor isn't fully open source either; it is open core under the AGPL.

> If you want help installing and maintaining Overleaf in your lab or workplace, we offer an officially supported version called Overleaf Server Pro. It also includes more features for security (SSO with LDAP or SAML), administration and collaboration (e.g. tracked changes). Find out more!

"That" in my sentence meant that Typst web app is closed source.

But that doesn’t make much sense - by your account LaTeX would also be a mix of closed and open source, since closed source web apps exist for writing LaTeX.

What does not make sense? Did you mean to answer to someone else? I only stated that Typst (the typesetting engine) is free to use and modify, and only the web app is closed source. Typst can be used without touching any web apps. I use Typst locally.

I made no claims about any mixes or claims about LaTeX.

[dead]

you are wrong. typst's lead dev has stated that an important goal is to have the CLI (which is open source) and web app behave identically, even refusing to implement such a basic feature as PDF embedding because, due to technical reasons, it is currently incompatible with this goal. [1]

typst, the project, is not by any means a "mix" of open and closed, even if typst, the company, is. indeed, the most thorough LSP implementation available (tinymist) is not only open source but a community project. for another funny example see typstify, a paid typst editor not affiliated with the company. [2]

[1]: https://github.com/typst/typst/issues/145#issuecomment-17531...

[2]: https://typstify.com/purchase/

I believe their intentions are good, and keeping functionality the same for different outputs to avoid fragmentation is good too. An alternative interpretation, however, directly in line with the fear expressed by GP, is that they're already crippling the open source CLI because they can't support the feature in the closed source web app.

I disagree. The web app editor is closed source, but much of what it provides is open source so editing is a similar (and imo better) experience locally. The typst compiler and LSP and everything you need to use it is open source.

Imo the situation is more like if overleaf were also the people who made the LaTeX project originally.

I think the only possible issue with the typst org dying (assuming after the full 1.0 version, so it's mostly maintenance) is that packages are automatically downloaded from the typst site, but an open repo can trivially be made, considering that the set of packages used comes from an open source git repo and the closed source site just hosts tar.gz files of the folders in the repo. Not a big deal I think.

They have a deep incentive to drive users to subscribe, and that's directly at odds with keeping all of the document rendering open source. It makes a lot of sense for them to provide document features that are only available to subscribers.

What you suggest seems plausible, but there is a very good counterexample. Overleaf is also managing well by relying on the open-source LaTeX. What drives people to subscribe is not the typesetting itself, but the ecosystem around it (collaborative editing, version management, easy sharing, etc.). You can make money with those and still have the rendering free/open-source. I believe a similar thing is/will be true for Typst as well.

That is a bad counterexample. There is a world of difference between the main devs offering a paid service and some unaffiliated company offering services.

In principle, having a reliable source of funding for typst is great. However, as a journal this would make me hesitant: what if down the road some essential features become subscription-only?

It helps that the LaTeX ecosystem is such a flaming dumpster fire that you all but need a tool like OverLeaf to use it effectively.

They have some incentive to drive users to subscribe, but they have other forms of income, and I think if they ever implemented even a single feature of actual rendering that was closed source their community would riot and we'd get a community managed fork (probably by the guy who does the language server...).

The only way they can continue to gain traction is if they never ever in any way lock people to the web app. Documents must be portable, it's part of why someone would want typst anyways.

I do not see a future where this happens, and if it does it will be because the typst org has changed hands and is also no longer particularly relevant to the future of typst the language.

Is there really a community of volunteer contributors that could fork it if that happened? Typically with a corporate-backed project like this, the corporate development tends to crowd out the formation of a volunteer community of contributors that would be able to take over development.

All the typesetting extensions and such are a community effort. There are so many specific use cases that can only be/will be done by very specialized academics that a non-networked product would die on the vine.

There is quite a clear distinction/border between an Input-Output rendering kind of program sitting beneath everything, and a web service providing stuff like collaborative editing, free hosting etc on top.

That is a real concern, but I wouldn't say there are any critical features in the closed source portion. I wrote the whole thesis locally with only open source tools. One of the included papers was written in the cloud platform for collaboration.

It is a concern that there is a single company doing most of the development, but there is quite a bit of community involvement so I don't think it is an immediate concern

>In thirty years LaTeX will still be open source and probably will be maintained.

The latter is a genuine concern. Will it be maintained? I like LaTeX a lot, but would I want to maintain its internals? No. Could I? If I were paid handsomely, yes. Emphasis on handsomely.

Which leads to another worry: LaTeX itself may be OSS, but down the line it is possible that maintained forks will be controlled by big publishers paying maintainers to deal with the insanity of its internals. And we all know how lovely those publishers are (凸ಠ益ಠ)凸

TeX Live started in 1996, its current release was last updated in March, and it has active conferences planned for next year. They're an open source group that has so far survived the test of time, and I'd suggest the motivations are there to keep that going into the distant future.

Unless academia collapses.

> implementing critical features in the closed source portion

Like which critical features, for example?

For now, that's the entire collaboration component. It would make sense to build a portion of document rendering in that context which won't be found in the open source portions. A value-add to convince users to subscribe.

>For now, that's the entire collaboration component.

And LaTeX has this for free? It's separated concerns; I think the analogy is Overleaf and LaTeX, which just happen to be made by the same group of folks here. It doesn't have to go down the monetization-at-the-cost-of-your-user route.

> And LaTeX has this for free?

Yes, Overleaf is both free-as-in-beer [0] and free-as-in-speech [1]. The OSS version is pretty easy to self-host, but it's missing quite a few features from the paid version. I still prefer compiling from the command-line for most of my documents, but I run the self-hosted version for collaboration.

[0] https://www.overleaf.com/user/subscription/plans

[1] https://github.com/overleaf/overleaf/

The free plan on overleaf only allows collaboration between 2 people. If you have 3 students in your report assignment then you can't use overleaf for free.

That sounds like a sign that overleaf is struggling, that they had to make that change.

And Typst is more generous there, you can collaborate with 3 people with no problem.

> The free plan on overleaf only allows collaboration between 2 people. If you have 3 students in your report assignment then you can't use overleaf for free.

Yup. You used to be able to share projects with unlimited people via link sharing, but they annoyingly got rid of that last year [0]. And Overleaf's cheapest plan is still more expensive than a basic VPS, so it's actually cheaper to self-host (which is what I'm doing [1]).

> That sounds like a sign that overleaf is struggling, that they had to make that change.

Either struggling or realized that they have a captive audience—if your professor requires assignments to be typeset with LaTeX and assigns group projects, there aren't really any other options.

[0] https://www.overleaf.com/blog/changes-to-project-sharing

[1] https://www.maxchernoff.ca/p/overleaf

Actually I've never understood the "free-as-in-beer" thing. Where is beer free?

The term was arguably coined by RMS and his full statement was:

> “Free software” means software that respects users' freedom and community. Roughly, it means that the users have the freedom to run, copy, distribute, study, change and improve the software. Thus, “free software” is a matter of liberty, not price. To understand the concept, you should think of “free” as in “free speech,” not as in “free beer.”

https://www.gnu.org/philosophy/free-sw.html

Sometimes beer happens to be free, in which case it is referred to as "free beer". It's just an example.

I've always understood "free as in beer" as: if someone hands you a beer and says it's free, you know that you don't have to pay to consume the beer, but that doesn't mean that you also get the recipe, brewing instructions, factory plans, glass making instructions etc. The only thing that is free is the liquid itself, nothing else.

Lots of occasions, mostly celebrations or campaigns. It even has its own Wikipedia article:

https://de.m.wikipedia.org/wiki/Freibier

often when you are with friends

i mean that's what overleaf does with latex too, so i don't see the difference

Overleaf is open source.

It is open core.

> neglecting the open source part

So it's no different than fully open sourced projects.

On the flip side, new tools like Typst are trying to push the UX forward in ways that the LaTeX ecosystem often struggles with. I think it comes down to what risks you're comfortable with

Does that matter? The article is in PDF, as other latex generated PDFs.

Yes, obviously. Do you delete all source code once you compiled a binary?

Any future corrections, additions or other modifications are made to the source, not the generated old pdf.

LaTeX is not as stable as people make it out to be.

I don't know how many packages there are for working with tables, but 20 years ago, `tabu` was the most recommended package, until the maintainer stopped responding. Now the package is incompatible with almost everything else, leading to headaches when trying to compile old documents:

https://github.com/tabu-issues-for-future-maintainer/tabu

https://tex.stackexchange.com/questions/470107/incompatibili...

Typst at least has dependency pinning out of the box. If you value reproducibility, you should invent a similar mechanism for your LaTeX documents.
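
For illustration, this is what that pinning looks like in a document (package name and version here are just an example): every import from the package index names an exact version, so the document keeps resolving the same code years later.

    #import "@preview/cetz:0.3.1": canvas, draw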

Also, I'm loosely following the activities around LaTeX on Github and Stackexchange and it seems that it's mostly maintained by three people or so (Carlisle, Mittelbach, Fischer), who - no offense - aren't getting any younger. I wonder how well LaTeX will be maintained if these long time contributors have to step down eventually.

Does Mendeley perform any better here than it does with overleaf?

I've used Typst to generate reports in multiple languages and it works pretty well for this! I just pass Typst a JSON file with the report data and use it from there.

Especially in combination with file watching. Your script writes to the JSON file and the entire document and everything that depends on it updates automatically, often in less than a second.
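
A minimal sketch of that pattern (file and field names are made up): the script dumps report.json, and the document reads it with Typst's built-in json() loader.

    // report.typ -- compile with `typst watch report.typ` for live rebuilds
    #let data = json("report.json")

    = Report for #data.customer

    #table(
      columns: 2,
      [*Item*], [*Amount*],
      ..data.items.map(it => ([#it.name], [#it.amount])).flatten(),
    )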

Very cool! I ran into the multiple bibliography issue when attempting to typeset my grandmother's PhD thesis, which I was able to rescue from the 5.25" floppies it was originally stored on. I was planning on waiting until they solved this officially to resume that side project, but might give Alexandria a shot!

That sounds like a fun project! Alexandria is the way to go for now but hopefully they will get proper support for it sooner rather than later.

[deleted]

Congrats OP on your PhD!

Is Typst’s typesetting quality on par with "bare" LaTeX? With LaTeX + microtype?

It may be stupid and vain but for me if it doesn’t at least match the former it’s a no-go

Until 0.13 it wasn't quite as good as LaTeX in my experience; it mainly inserted more hyphens than LaTeX.

As of this version, it would be very hard to tell a difference in my experience

Yes, it uses a very similar algorithm to LaTeX's. It also already incorporates some microtype features out of the box. So the typesetting quality is very good and easily comparable to LaTeX. Working with Typst is so much easier and faster than with LaTeX, so you will be more productive. Many things can be done without resorting to external packages, and scripting is a breeze compared to LaTeX.

Just try it out. It is free, open source and very easy to set up. Just install the Tinymist extension in VS Code; that is all you need.

The ecosystem issues and rough edges in bibliography handling don't surprise me, but the fact that you could script so much directly inside the doc is really appealing

Typst will probably be dead or acquihired in a few years.

Latex will be around for decades.

The Typst compiler is completely open-source. I prefer my local copy of the Typst compiler and CLI to whatever LaTeX provides right now already, and there seems to be a still growing community that could keep the project going even after a malicious acquisition of some kind.

Congratulations to the author.

I have to agree that Typst source generally looks a lot less ugly than LaTeX. I considered writing stuff in Typst many times, but I couldn't muster the courage to do so.

Nice debrief. I think though that some of the downsides the author mentions can be addressed relatively easily with Quarto, which has embraced Typst since its early days as I recall. Especially the bibliography issue.

I was on the typst train, particularly because its layout engine has some additional vertical control for long documents that latex lacks. However, just about when I was looking at moving over, LLM coding became good or at least good enough, and one area the current crop is bad at is doing layout in anything but latex. Not that they are good at latex, but they are terrible, terrible, terrible at typst. Really bad. Maybe in another year or six months!

I understand why people like using LLMs for coding, saves them having to think, but it is deeply frustrating to see it being such a crutch that some people cannot use new tools without it.

I suppose the issue is not new; many people didn't want to use new languages before because they couldn't copy snippets from the internet, but it was frustrating then too.

You’re going to be frustrated a long time into the future I would guess.

I’ve been coding since before the camel book was published: at that time it was basically ask Larry Wall on Usenet or a local bearded guru if you weren’t in a university setting and wanted to learn to code.

I can hand craft code in many a language; I can also do fine wooden joinery. When a project has value to me in the completion and hours to completion is my metric then a cnc machine or an llm is a great tool, and allows me to make things that aren’t “worth” hand coding.

When I want to work on a technical skill or just get in the flow I code by hand or use my wood tools. Upshot: different strokes for different folks.

> and one area the current crop is bad at is doing layout in anything but latex. Not that they are good at latex, but they are terrible, terrible, terrible at typst

I'm surprised to hear that—I've been using GitHub Copilot with ConTeXt [0] since 2021, and it mostly works fairly well. And ConTeXt is much more obscure than Typst (but also much older, so maybe that gives it an advantage?).

[0] https://wiki.contextgarden.net/Introduction/Quick_Start

I’ll tell you my failing prompt - hopefully you can help! I haven’t tried since 4.0 / o3 / 2.5 pro came out.

I want a flowed book layout (so we have a facing page with inner and outer margins.)

I am rendering chats in the main part of the page. Chats alternate left and right alignment so it looks a bit like a text conversation. For each chat I want to put metadata (reactions, sender, time) on the margin it is aligned to.

So for a left chat on a left page, I want to use the left (outside) margin; for a left chat on a right-hand page, the inside margin.

Two things I could not get sorted: first, perfect vertical alignment between the chat and metadata (I think this is possible but difficult), and second, a persnickety bug where the first chat on each page chooses the previous page's margin side.

Happy to pay for an answer - I did try to hire a typesetter for this as well.

Well, they are good in markdown and rust. Perhaps feeding some Typst documentation overview into the prompt could solve it?

An LLM-friendly language spec would help, I’d guess. FWIW it’s a common long-tail language issue — everybody’s really good at Python and TypeScript, things fall off from there

It can be hard to write macros with state in typst.

It is hard to write macros in LaTeX.

Why not LyX or TeXmacs? Both seem to be better options than yet another markup language.

In addition to making writing easy, TeXmacs is also based on a markup language. It demonstrates that a markup language and WYSIWYG writing can coexist efficiently.

Great work. Screenshots would be nice.

Why do CS doctoral candidates have such a fascination with typesetting? I mean, be into whatever you’re into, I guess.

But as soon as someone starts talking about LaTeX and how they spent months on their macros, I think “another hapless victim has fallen into LaTeX’s trap.” It’s like an ant lion that feeds on procrastinating students.

This is not limited to CS or Latex in any way. Plenty of students spend a lot of time fiddling with word, powerpoint, note taking systems, citation management (which is surprisingly horrible in MS word), Adobe software etc..

Obvious reasons:

- Your thesis is a major output of years of work. Of course you want it to look good.

- You might think it superficial, but if the presentation looks bad, many people (subconsciously) interpret this as a lack of care and attention. Just like an email with typos feels unprofessional even if the content is otherwise fine.

- Spending time on tooling feels productive even if it is not past a certain point.

- People that are into typesetting now have an excuse to spend time on it.

That said, in my experience people spent a few hours to learn "enough" latex several years ago and almost never write any macros. Simple reason: you work with other people and different journal templates, so the less custom code the better.

I was a math major in undergrad, we care about typesetting so much because you really do not want to be stuck handwriting everything, but it's not easy to be faster typing than you are with handwriting when you're writing out rows and rows of equations. (Actually physics was generally a lot harder for me to keep up with while typing than math was.)

And when your life is revolving around classes or your thesis, the #1 most important thing to you in the world is how easily you can transfer your ideas to paper/digital format. It makes a lot of sense that people care a lot about the quality of their typesetting engine and exchange macro tips with each other (I got a lot of helpful advice from friends, and my default latex header was about 50% my own stuff and 50% copied from friends in my same major)

On a total tangent, I found out that my grandfather's university digitized their entire library a few years ago, including his master's thesis from 1948. Back then it was written with a typewriter and by hand for everything else.

I bet he could have done something more advanced if he had modern computers, but looking at it 75 years later and seeing his handwriting on the page was moving more than the content itself.

Time spent on typesetting produces immediately visible results (however minor). Actual research doesn’t. It’s the classic feedback loop problem, so like you said, procrastinating students devote lots of time to largely pointless but seemingly productive activities like typesetting.

I was there once. In hindsight all the tweaks were a complete waste of time. All I needed was amsart, plus beamer for slides.

It's because LaTeX gives us a sense of legitimacy. (it's also why people go overboard with math notation in LaTeX documents, even when prose is more appropriate).

It produces documents that look like those produced by professors, and luminaries in the field. If you write equations in Word Equation Editor, your work just doesn't look very serious.

It's the same joy I felt when I laser-printed my first newsletter designed in Aldus PageMaker. I was only in my teens but I felt like a "professional".

I remember when I submitted a paper written in LaTeX to my math prof in college, the only one in the class to do so (nobody had even mentioned it to us, so it wasn't exactly surprising, but I was one of those guys running Gentoo as their desktop back then, so...).

She not only instantly recognized it, but, judging by the look and the platitudes she gave me on the spot, it probably earned me an extra point on the overall grade.

When in Rome...

> If you write equations in Word Equation Editor, your work just doesn't look very serious.

Haven't tried it in a while, but, last I checked, Word Equation Editor output didn't look serious because it looked janky, like it wasn't really done in a "professional" tool. Part of that is a self-fulfilling prophecy of course, LaTeX output looks right in part because it's what people have been reading for decades, but TeX's formulas just look plain good.

Last time I checked, Word was also basically untenable for math-heavy writing because there was too much procedure involved in setting a formula. This is fine if you need one here and there, but if you have lots of formulas (including many tiny ones, like just using the name of a variable), switching to a dedicated formula mode in the interface is just not pleasant. In LaTeX (or Typst), I just type $, and off I go.

Yet Word is leagues ahead of Google docs... (shudders)

There are add-ons for Gdocs. This is apparently pretty good. https://workspace.google.com/marketplace/app/autolatex_equat...

I don't know if this is still the case or not, but equations in Word can be upgraded to MathType. IIRC the Word equations were a basic version of MathType (i.e. developed by the same people). MathType included LaTeX syntax and much better layout and formatting. It was the only way to stay sane when working on journal articles with collaborators who had less than zero interest in LaTeX (i.e. physicians).

The equation editor in Word straight up supports LaTeX nowadays. It also supports UnicodeMath, which is an actual standard and a pretty cool one at that. Sadly it has almost no adoption outside of Word.

> If you write equations in Word Equation Editor

The experience is also awful. It's much better to write \in or \frac{}{} rather than to go to a dropdown menu and figure out which button to click.

You can use its own syntax for the Word equation editor. They have even added LaTeX syntax support now. When was the last time you used Word? LaTeX support in the equation editor has been there for ~5 years.

I did this once in undergrad. Used Word to make my term paper two columns and all formatted like a journal article. Felt cool. Felt legitimate. But I then felt kinda embarrassed and never really shared it with anyone.

Most universities don’t formally train their STEM students in technical writing. At the graduate level, one is basically at the mercy of one’s advisor’s taste, for better or (usually) for worse.

The first thing that my PhD advisor did, when I first met him as a foreign student, was to give me this book: https://archive.org/details/technicalwriting0000huck. And I am forever grateful for it.

For the record, at UIUC we had a bunch of seminar classes (and I think a regular class?) on LaTeX and technical document creation, run by A.J. Hildebrand; it was a fantastic course and I learned a lot of folklore "secrets" that the manuals will not tell you, as well as technical writing tips that were far from obvious.

Having tutored CS undergrads on writing, the lack of training (or care, or perceived relevance) was painfully obvious. Many were semi-literate with respect to English prose.

That may be true in US universities, but in Europe students have to write technical reports in almost every course.

That’s a pretty sweeping generalization. In the European university that I went to, CS students definitely didn’t have to write anything longer than long-form exam questions until the bachelor’s thesis.

But less sweeping than the parent who generalized to "most universities". I think it was a long time since you went to university and times have changed.

[dead]

Not really for me in Poland - thesis was the only thing we had to write in a technical way.

There’s always the WordTex template if you want to create documents that look like LaTeX output from within Word: https://youtu.be/jlX_pThh7z8

> If you write equations in Word Equation Editor, your work just doesn't look very serious.

Sez you. MS Word 4.0 for Mac was perfectly alright, putting in less elbow grease than fiddling with LaTeX.

And you could get a PDF out of it, via the PostScript print driver.

Never liked those spindly CM Tex fonts, anyway.

Given that LLMs can or soon will be able to turn markdown or word into LaTeX this filter won’t last long.

It’s a dumb filter anyway.

Markdown and Word don’t have the tools to express what LaTeX can. Not even your deity of choice will ever be able to turn the former into the latter, let alone an LLM.

A small, but important aspect of typesetting/WYSIWYM is the ability to break down a large document (like a thesis) into discrete sub-components. You could work on each section of your document in an individual .tex file and include it later in your top-level .tex file. This setup works well with VCS like git.

Another ergonomic benefit is scripting. For example, if I'm running a series of scripts to generate figures/plots, LaTeX will pick up on the new files (if the filename is unmodified) and update those figures after recompiling. This is preferable to scrolling through a large document in MS Word and attempting to update each figure individually.

As the size and figure count of your document increases, the ergonomics in MS Word degrade. The initial setup effort in LaTeX becomes minimal as this cost is "amortized" over the document.
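
A minimal sketch of that workflow (script and file names are placeholders): regenerate the figures in place, then recompile; \includegraphics picks up the new files because the names are unchanged.

    # rewrite all plots into figures/, overwriting the old PDFs
    python make_plots.py --out figures/

    # recompile the document against the fresh figures
    latexmk -pdf thesis.tex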

> The initial setup effort in LaTeX becomes minimal as this cost is "amortized" over the document.

I'm still sour about the 3 days it took me to have something usable for my thesis, and I was starting from an existing template. And it's still not exactly how I want it to be; I gave up on addressing a bug in the reference list.

I wrote mine in LaTeX, alongside a teammate writing in Word. Her onboarding was way faster, but she had to fight really hard in the end to keep Word from messing everything up on the smallest changes.

Meanwhile, when I had a decent setup I could move a whole section from the intro to the results and the overall layout didn't suffer (floating tables, figures and code still in place, references still pointing where they should). I had code snippets with colour highlights imported from the actual source code (good luck trying that in Word). I could insert the companion papers with a single line of code per document, and they looked great. I even had a compilation flag to output the ereader version.

My take was that Word enabled my team mate to kick a lot of cans down the road (but the cans eventually came back), while for me the reverse was true: build a decent foundation, and after that it was all pure write-cite-compile.

My school just had an official cls file, so my initial setup was just to download the template. So if that's where you're coming from (the journals I submitted to also had official templates), it's really minimal setup.

Another reason to use LaTeX for papers back in the day was that Microsoft Word would routinely corrupt large documents in terrifying ways. Sometimes the root of the corruption existed in the document somehow long before any of it was visible, so even recovering from an old backup would just lead to the problem repeating. I recall the only way to properly "recover" an old backup was to copy it all via plain text (e.g. Notepad), and then back into a brand new Word document.

This is all to say, if you're working on a thesis or even a moderately large assignment, working in Word was not good for the nerves.

Looking back, I probably should have just worked in plain text and then worried about formatting only at the very end, but ummm, yes, I guess another hapless victim did indeed fall into LaTeX's trap. :)

I’ve run into this exact issue several times with group projects at university in the 2010s, and each time recovery was copying chunks of plain text from backup copies into new documents as you say. Luckily by the time we got to the final year capstone project the whole group was happy to go with LaTeX. Not sure if these Word issues have even been fixed since.

I don't have a source for this, so take it with a huge grain of salt... but for some reason I have a memory of someone telling me that the older versions of Word saved and loaded documents by writing the bytes of in-memory data structures directly to files on disk, with not much in the way of marshalling or validation in the middle. Because it was fast, or something. You can imagine the kind of edge cases and oopsies that might result.

The new versions at least serialise to some kind of monstrous XML representation of Word's internal state, so while it's not going to win any awards for world's most elegant document format, it should be slightly harder to corrupt in subtle ways.

I give 0 fs about typesetting. But typical mainstream software just cannot freaking process a 500-page document with tables, figures, references, equations, etc. If Word/Pages/OpenOffice/GoogleDocs could do it, no sane person would sink hundreds of hours into debugging LaTeX out-of-memory errors.

But once you are in the latex world you start noticing how much prettier things can be. And then you end up sinking another thousand hours to perfectly aligning the summations in your multi-line equations.

From watching people write their thesis in both latex and word, I'd say if anything it is the other way around. The people who write their thesis in word (or another wysiwyg editor) spend more time on their layout than the people writing in latex. Worse, they spend the time while writing, while latex allows for separation of tasks, which allows people to get into the flow much more easily.

Sure, theoretically you can concentrate only on writing with Word and ignore layout. In practice it takes a lot of discipline, so instead you see people moving figures around, putting in spaces or returns to move a heading where they want it, etc. In particular as a way to procrastinate from actual writing.

> Worse, they spend the time while writing, while latex allows for separation of tasks,

In theory, yes. And that's also what I'm usually trying to do.

What I have observed though with Latex folks is that they type 3 words and then look at the preview or re-compile to see if it looks good.

I mean, as with code, the actual typing is not really the bottleneck.

I also basically read the right-pane rendered output, but mostly as a "reading out what I've written and evaluating whether it sounds good" pass, not really messing with layouting (especially since LaTeX and Typst do that very well; I can be reasonably sure that my paragraphs will have decent hyphens and such).

For me personally, I have yet to figure out how to get a word processor to have text be justified on both sides without inserting big gaps between words. I could use left justified but then the text ends up looking like a saw blade, which is still ugly.

LaTeX's handling of floating figures and tables is also much better.

And of course math notation is much nicer to work with in LaTeX (IMO).

In Word, set the paragraph alignment to justified and enable automatic hyphenation.

You can actually use LaTeX math notation in the equation editor in modern Word.

LaTeX typesetting is a solved problem. Memoir or Classic Thesis, paired with microtype, provide outstanding results and you need to spend zero time on tweaking stuff.

Typst is interesting, but it doesn't yet support all microtypography features provided by microtype. IMHO, those make a big difference.

I’m going to have to disagree with you there. The compile times are long, the error messages are worse than useless, and tikz diagrams are almost always unreadable messes.

Large swathes of mathematics, computer science, and physics involve notations and diagrams that are genuinely hard to typeset, and incredibly repetitive and hard to read if you don’t make heavy use of the macro system. Integrating some actual programming features could be a game changer.

> Integrating some actual programming features could be a game changer.

LuaTeX already lets you embed Lua code and it is really good.

However, I do agree some usability improvements are needed.
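
A minimal sketch of what that embedding looks like (compile with lualatex; the computation is deliberately trivial):

    \documentclass{article}
    \begin{document}
    The sum of the integers from 1 to 100 is
    \directlua{
      local s = 0
      for i = 1, 100 do s = s + i end
      tex.sprint(tostring(s))
    }.
    \end{document}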

What in microtype makes "a big difference"? I don't recall using it (my LaTeX years are long behind me), but all of the examples on https://www.khirevich.com/latex/microtype/ seem incredibly minor. I don't think I'd notice any of them as the reader.

It will tweak spacing, kerning, margin protrusion, and font expansion to improve readability and avoid big word gaps and excessive end-of-line hyphenation.

It is what sets professional typography apart. Only Adobe InDesign provides a comparable implementation, tweaking all those details.

See https://en.wikipedia.org/wiki/Hz-program for a better explanation and an example.

IMHO, the difference is obvious and not minor. Without microtypography texts look ugly: https://upload.wikimedia.org/wikipedia/commons/0/03/Hz_Progr...

> Only Adobe InDesign provides a comparable implementation, tweaking all those details.

TeXmacs claims to have implemented microtypography as well (https://www.texmacs.org/tmweb/home/news.en.html, as I am reading it, in the opening paragraph on version 2.1)

Sure, I don't like rivers like in your last example. But I absolutely prefer paragraphs where the final line would be considered 'too short'. It also makes an appreciable impact for me in how easy a text is to read.

Which is to say, half of these things are pretty subjective.

It depends what you're typesetting—if you're using letter/A4 paper with 1" margins, then you're unlikely to notice any difference; but if you're using narrow columns, then it will vastly reduce the number of paragraphs with ugly huge spaces between words. Margin kerning is the other big feature, but you probably won't notice that unless you're fairly picky.

Typst does already support some microtype features out of the box and more are coming: https://github.com/typst/typst/pull/6161

I'd also not overemphasize the significance of microtype features. They might help with narrow columns, but on wider columns the difference is very small and most people will never notice them at all.

I wrote my joint med-CS honours (a 1-year research thing we have in Aus) thesis in Word. My med supervisor was happy with it. My CS supervisor insisted I reformat it in LaTeX as he couldn't stand the typesetting.

Honestly I don't disagree with him, it looked far better in 'TeX. But that's probably a learnt preference.

In essence, it's culture.

Not all of us fell into that trap! My dissertation was written almost entirely using a default document class and a handful of packages, and only towards the end did I apply the university document style to come into compliance. I had more than enough to do on the subject of the PhD and didn’t have the patience to burn time on typesetting or fiddling with macros.

I’ve found in the decades since then that my most productive co-authors have been the ones who don’t think about typesetting and just use the basics. The ones who obsess over things like tikz or fancy macros for things like source layout and such: they get annoying fast.

Tikz is misplaced in this list; it is how you make any kind of vector drawings in LaTeX. It's not the only way, but perhaps the best documented and most expressive one. If you have any such drawings in your work, you won't be able to avoid putting some effort into it. Not comparable with boxed theorems or fancy headings.

Tikz is sometimes useful, but it can also be a massive time sucking pain in the butt.

I mean it is one of the few packages that can actually manage to annoy LaTeX fans, which is really saying something.

I think the annoyance with TikZ is twofold: (1) it tries to do a really hard thing (create a picture with text in a human writable way), (2) it is used infrequently enough that it’s hard to learn through occasional use.

That said, nobody makes you use TikZ; fire up Inkscape and do it WYSIWYG.

I don't know about now, but in the 2000s anything even remotely math-related was PURE PAIN in Word-likes.

In my master's there were like 30 pages of formulas, all interdependent. Typing/retyping these would take forever.

Also, something as simple as having per-chapter files or working with an acceptable editor also helps.

>> Why do CS doctoral candidates have such a fascination with typesetting?

Probably because Donald Knuth created TeX and Leslie Lamport created LaTeX.

Two of the greatest minds in Computer Science created the tools and used them to write papers and articles that are beautiful.

Elegant ideas presented beautifully make reading and writing papers a nicer experience.

It seems the grammar of the language was an afterthought... It amazes me that he spent so much time perfecting the layout algorithms but could not come up with a sane language.

Donald Knuth. Please.

Corrected. Thank you.

Autocorrect incorrected it for me.

>Autocorrect incorrected it for me.

I am saving this entire sequence for later use.

Because it's not particularly fun to edit a typo and have your layout completely messed up 10 pages later, which you have no chance of noticing without a full review.

And publishing is the primary way academics communicate at large - it's kinda important to be able to write your specific notation without resorting to drawing on paper.

First, you have to (or at least, you had to). I mean, it was the only way to sanely include a lot of formulae and manage a bibliography.

Then you discover that it is beautiful. Honestly, even the base styles set you above the typesetting of most books. With some extra tweaks, it is beautiful.

Did I spend a lot of time on LaTeX during my PhD? Sure! But (even counting all the masochism involved in dealing with LaTeX) I cherish both that time and the results.

Why does the premier word processing software (Microsoft Word) care so little about typesetting?

I am biased however, as my thesis was written in LaTeX with all the plots regenerated at compile time from the raw data.
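(In case anyone wants to replicate that workflow: one common approach, not necessarily what I used for everything, is pgfplots reading the data files at compile time, so the figures always track the raw data. A rough sketch with made-up file and column names:)

    \documentclass{article}
    \usepackage{pgfplots}
    \pgfplotsset{compat=1.18}
    \begin{document}
    \begin{tikzpicture}
      \begin{axis}[xlabel={trial}, ylabel={latency (ms)}]
        % results.csv is re-read on every compile
        \addplot table [x=trial, y=latency, col sep=comma] {results.csv};
      \end{axis}
    \end{tikzpicture}
    \end{document}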

> Why does the premier word processing software (Microsoft Word) care so little about typesetting?

Because its target userbase is people who don’t give a single shit about typography.

Yet every day, all across the world, a director hands the document to their team and tells them to work the weekend cleaning it up and making it look good.

People who are fascinated with LaTeX are gearhead types. Just the same as photographers who care more about their cameras or chefs who care more about their knives.

Here it's typical that a thesis will be printed as a book, and it's that book that will be evaluated. For PhDs, there's a doctoral defence in front of a committee, peers and other interested parties and they're all given the book.

Usually the process for ordering books is that you send them a PDF with embedded fonts inside it, and it's made at the university's printing house. They will handle distribution etc. So you really, really want it to look right at the first go.

There's been some progress the past few years now where you get to preview the book somewhat, but one surefire way to get it right is to use something like LaTeX. It used to be one of few WYSIWYG solutions out there. And it used to be really hard to do certain required things in e.g. Word. For instance skipping some page numbering and doing others in roman numerals etc.
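For example, the roman/arabic switch in LaTeX is just a couple of commands (a minimal sketch; real front matter is usually a bit more involved):

    \documentclass{report}
    \begin{document}
    \pagenumbering{roman}    % front matter: i, ii, iii, ...
    \tableofcontents
    \cleardoublepage
    \pagenumbering{arabic}   % main matter: counter restarts at 1
    \chapter{Introduction}
    \end{document}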

WYSIWYG means what you're editing looks like the end result; LaTeX and Typst are at the opposite end of the scale, being languages that compile into layout. No, a preview window does not count as WYSIWYG.

> Why do CS doctoral candidates have such a fascination with typesetting?

Same reason wantrepreneurs have a fascination with adding dark mode to their CSS. It feels productive while you avoid the real work.

> Same reason wantrepreneurs have a fascination with adding dark mode to their CSS. It feels productive while you avoid the real work.

Accessibility is just as important as “the real work”.

I guess monks were procrastinating likewise when they illuminated their manuscripts.

When I was in college I found out that I have a better reading experience with LaTeX-typeset books. That's why I prefer to read Springer-published reference books rather than the class-recommended ones.

The typesetting is finished whenever you want it to be. I spend most of my time thinking about the content.

Well, during 5 years of undergrad reports and papers, then 5 years of PhD thesis papers, you do tend to hoard some useful snippets. It is more of a byproduct than a fixation... at least for me.

It's like developers obsessing over their tools, so I get it.

> Why do CS doctoral candidates have such a fascination with typesetting?

Why does anyone care about typesetting? Probably because they spend a lot of time working with text and have therefore developed a level of taste.

Just because the bottom 80% of consumers have zero taste and will accept any slop you give them doesn't mean there isn't value in doing something only appreciated by the top 20%. In any field, not just typesetting. Most people have ~no refined endogenous preferences for food, art, music, etc.

I wonder if any doctoral defense has hinged on how refined the typesetting was. Probably. It’s the sort of ritual humiliation that academia specializes in.

Poor typesetting is like going to an interview in your underwear. While it may not directly reflect your skill, it says a lot about how much effort you like to put into things.

I'm not sure it is so much about ritual humiliation as that, well, you are supposed to be at some sort of summit, so you must have refined your process.

A mountain hiker can wear whatever, but above a certain altitude something must be true of them (fit, trained well, holding various gear, has supplies, or is in a plane/heli and probably even better trained/equipped/fit).

I would hope that typesetting is just a qualia of an ordered mind not a goal of it.

You can choose to feel "humiliated", but the truth is probably closer to this: you may simply be inadequate in that regard.

I.e. it is not that using LaTeX (or even Typst) makes you a better person, just that certain types of people will tend to use tools, like mountain climbers likely use carabiners.

> I wonder if any doctoral defense has hinged on how refined the typesetting was.

At least 1 [0], but that's obviously a rather special case.

[0] https://tug.org/TUGboat/tb21-4/tb69thanh.pdf

I find it odd too. The fascination with typesetting limits the paper’s usability on narrower devices, which seems a very strange position for engineers.

I have often thought that LaTeX's distinctive font and formatting is either a virtue signal or an in-group signal.

I switched all of our PDF generation to typst - fantastic software. Love how efficient it is; it makes previewing trivial and iteration very fast.

Interesting! Did you use a tool to do the conversion automatically? How did it pick up on custom packages and styling?

I'm quite glad some alternatives are popping up. Using LaTeX honestly feels like using a piece of 80s tech. It is obviously fine and super powerful, but fine in a vim-style way. There have got to be more contemporary alternatives than the status quo.

Not everyone is into nostalgia. I don't try to take away LaTeX or vim from anyone; it's just not for everyone.

These are some notes I wrote when I started out with Typst, comparing it with LaTeX:

1. It doesn't generate 5 bloody files when compiling.

2. Compiling is instant.

3. Diagnostics are way easier to understand (sort of like Rust compiler suggestion style).

4. List items can be either - item1 - item2, etc. or [item1], [item2]. The latter is way better because you can use anchoring to match on the braces (like "%" in vim), which means navigating long item entries is much easier.

5. In latex you have the \document{...} where you can't specify macros so they need to be at the top, in Typst you can specify the macros close to where you need them.

6. It's easier to version control and diff, especially if you use semantic line breaks.

7. Changing page layout, margins, spacing between things, etc., footers with page counters, etc. just seems way easier to do.

> 5. In latex you have the \document{...} where you can't specify macros so they need to be at the top, in Typst you can specify the macros close to where you need them.

You can define macros anywhere in a LaTeX document; it's packages that need to be loaded before \begin{document}.
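A minimal example of what I mean (\R is just an illustrative macro):

    \documentclass{article}
    \usepackage{amssymb}          % packages do belong in the preamble
    \begin{document}
    Some introductory text.

    \newcommand{\R}{\mathbb{R}}   % but a macro can be defined mid-document
    From here on, $\R$ works as expected.
    \end{document}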

> 6. It's easier to version control and diff, especially if you use semantic line breaks.

TeX mostly ignores whitespace, so semantic line breaks and version control should work equally well with both LaTeX and Typst.

(I agree with all your other points though)

I think what the GP means is that whitespace is often not ignored by LaTeX, so line breaks can cause extra wide spaces between words. It's common to comment out the line break in LaTeX for this reason. This is much less of an issue in Typst (if at all) due to the separation of code and content.

https://tex.stackexchange.com/questions/7453/what-is-the-use...
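A tiny illustration of the difference (just a sketch):

    \documentclass{article}
    \begin{document}
    \newcommand{\tool}{Typst}
    This sentence is split
    across two source lines and renders with an ordinary space at the break.

    % the trailing % below swallows the newline, so no space sneaks in before ``'s''
    \tool{}%
    's ecosystem is growing.
    \end{document}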

It's not even fine. It's old, and it shows in the functionality too, and I say this as a rather heavy LaTeX user. For example, Unicode support is atrocious. A few years ago I had to include some Hebrew and Russian words in a document that was otherwise in the Latin alphabet, and it was hell.

I'm not a vim user but my understanding is that it has native Unicode support. Software with old-school UI but adapted to current needs (or where needs just didn't change) is fine, but it's not the case of LaTeX.

XeTeX handles Unicode fine, but that's definitely one area where TeX shows its age and its extensibility didn't, I think, allow Lamport to make a real difference.
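E.g. with fontspec it's roughly this (the font name is just an example with Cyrillic coverage; proper Hebrew/RTL would additionally want something like polyglossia or bidi):

    % compile with xelatex (or lualatex)
    \documentclass{article}
    \usepackage{fontspec}
    \setmainfont{Linux Libertine O}
    \begin{document}
    Latin text mixed with кириллица, straight from the source file.
    \end{document}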

I have heard about it, but it isn't viable for me to switch to it because most academic journals and conferences have templates incompatible with XeTeX, or directly ask for the sources and compile them with pdflatex.

This is the same reason why it isn't viable for me to switch to typst either, by the way. I hope it gains popularity and ends up as a standard displacing (or along with) pdflatex.

I appreciate your postscript. I don't use TeX or vim out of nostalgia; I didn't discover TeX until I was a senior in undergrad, and I think I didn't discover vim until after I finished my Ph.D. I use vim because it seems best for its tasks, given the way I think (though maybe I think that way because I'm old). I use TeX because I write math for a living and have invested a huge amount of time using it, so that it's become intuitive to me even though I know it wouldn't be for a newcomer, and I can't be bothered to break long-established habits until I know an alternative will be established everywhere TeX is.

Well yeah, but the point is that unlike other pieces of 70s tech, LaTeX has no suitable alternative in its class (at least until now :p), meaning FOSS software that produces high-quality typesetting with an emphasis on maths.

Pro tip: type long content unformatted, or barely formatted, then ask an LLM to format it using your markup of choice, then clean up the things it got wrong.

They are very decent at inferring the context of stuff and will mark up code, maths, titles and so on fairly decently. This lets you focus on the work of making it look nice.

I'm always worried about LLMs unintentionally affecting the actual content, so the extra effort of carefully reviewing the diff just isn't worth it. Markdown + Pandoc is more sensible to me if your document is simple enough.

I successfully converted a typst report to md/mdx last week using this technique. For complex layout primitives I just told the LLM to write a comment with a warning TODO about the missing part it wasn't able to convert.

Well you can always diff the document. If it's not too large, then manually inspect it. If too long, then pipe the diff into the clipboard and send it to another LLM to summarise the changes.

Jesus

Why not use JavaScript, JSX and TypeScript to produce PDFs? You use the language you already know.

Since when is TypeScript a typesetting engine?

Why would you want to implement an entire typesetting engine yourself?

AI is the primary audience for our writing, and the primary reason to reconsider our choice of markup format. It's all about semantic compression: Typst source, markdown, and asciidoc are far more concise than LaTeX source.

I'm observing, not here to convince anyone. The last six months of my life have been turned upside down, trying to discover the right touch for working with AI on topological research and code. It's hard to find good advice. Like surfing, the hardest part is all these people on the beach whining how the waves are kind of rough.

AI can actually read SVG math diagrams better than most people. AI doesn't like reading LaTeX source any more than I do.

I get the journal argument, but really? Some thawed-out-of-a-glacier journal editors still insist on two column formats, as if anyone still prints to paper. I'm old enough to not care. I'm thinking of publishing my work as a silent animation, and only later reluctantly releasing my AI prompts in the form of Typst documentation for the code.

> AI is the primary audience for our writing, and the primary reason to reconsider our choice of markup format.

That's AI which must adapt, not humans. If AI can't adapt then it can't be considered intelligent.

> Some thawed-out-of-a-glacier journal editors still insist on two column formats, as if anyone still prints to paper.

Narrow text is easier to read because you don't have to travel kilometres with your eyes. I purposely shrink the width of the browser when reading longer texts.

> AI is the primary audience for our writing

I've been wondering about this a lot lately, specifically if there's a way to optimise my writing for AI to return specific elements when it's scraped/summarised (or whatever).

Printing is still not uncommon in professional scientific environments. When you actually have to read a paper, it turns out that actual paper is quite convenient.

The idea of publishing as an animation + Typst doc actually sounds pretty compelling… the old PDF format is starting to feel pretty stale

Personally, I write for humans.