There are probably good reasons for all of that, but it is just both bad DX and bad UX. It feels like you need to be a hardcore LaTeX expert, or consult one, to accomplish the most mundane things, especially in a reliable way that won't break when you make seemingly unrelated changes, and won't itself break other things.
I used Typst for a few weeks. It already feels much more understandable, consistent, hackable, and customizable. I guess that is the difference between an ad hoc macro system and a properly thought-through programming language.
The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
LaTeX is great, don't get me wrong. But its heritage and historical baggage are really dragging it down.
Posts/discussion I found interesting:
- http://www.goodmath.org/blog/2008/01/10/the-genius-of-donald...
- https://tex.stackexchange.com/q/24671
- https://news.ycombinator.com/item?id=15733381
In particular, it's interesting how people seem to think TeX itself is actually quite nice to use, but its popularity and the LaTeX package ecosystem created a huge mess of a system.
Added to that, academics specifically are more willing to suffer old, crufty tooling than software engineers tend to be. After all, their job is to absorb fields of material whether good or bad, and the technology tends to lag the bleeding edge in many subfields anyway, so TeX doesn't even necessarily stand out.
Bingo. Compared to troff and what preceded it, TeX was amazing just to use. But its real value was the quality of its typesetting. Knuth put a lot of effort into the beauty and historical correctness of the output, so much so that it solved optimization problems to calculate line breaks. MS Word still can't break a line properly in 2026.
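To make the line-breaking point concrete, here is a toy sketch of the core idea behind TeX's approach (the real Knuth-Plass algorithm also models stretchable/shrinkable glue, penalties, and hyphenation; this simplified version just minimizes total squared leftover space over the whole paragraph instead of greedily filling each line):

```python
def break_lines(words, width):
    """Choose break points minimizing the sum of squared trailing slack.

    Toy model of Knuth-Plass: a greedy breaker optimizes one line at a
    time, while this considers the whole paragraph. Assumes every word
    fits on a line by itself.
    """
    n = len(words)
    INF = float("inf")
    best = [INF] * n + [0.0]   # best[i] = min cost of typesetting words[i:]
    choice = [0] * n           # choice[i] = index just past the line starting at i
    for i in range(n - 1, -1, -1):
        line_len = -1          # offsets the leading space of the first word
        for j in range(i, n):
            line_len += len(words[j]) + 1
            if line_len > width:
                break
            slack = width - line_len
            # The last line is not penalized for being short.
            cost = (0.0 if j == n - 1 else slack ** 2) + best[j + 1]
            if cost < best[i]:
                best[i], choice[i] = cost, j + 1
    lines, i = [], 0
    while i < n:
        lines.append(" ".join(words[i:choice[i]]))
        i = choice[i]
    return lines
```

For example, with width 6 a greedy breaker would output `["aaa bb", "cc", "ddddd"]`, leaving line two almost empty, while the global optimum spreads the slack more evenly.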
Further, many docs from that era are plagued with abandonware.
TeX did one thing well for an era when often the only interface to the machine was over a Xyplex terminal server connecting to a tty at 9600 baud.
The backslash-based syntax allows for some really powerful typesetting, far above anything else that exists today. At the same time, using that backslash-based language all the way down, in the form of macros, is what causes the frustration.
Typst kind of solves that by keeping a terse markup syntax but implementing the underlying machinery in Rust rather than in macros.
Another part is that many people built their own solution to their own corner of this domain, and not all of them had a deep appreciation for how the rest of the TeX system works.
I hear similar complaints about "Make web page look good", which is popular but also a huge mess of a system.
I won't lie: it takes getting used to, and you need to learn a lot if you want to achieve fancy, complex typesetting effects. But it's not half as inconvenient as it once was.
So whether the resulting file looks right depends on whether the rendering engine chooses the correct font. Looks like it's supposed to be Nimbus Sans or something metric compatible with that, but the serif font chosen by Typst looks obviously wrong.
But I think the main things it has going for it are that it produces nice output and that all the journals accept it. Does a tool exist that renders Typst to LaTeX? That could play nicely with the existing ecosystem.
It's worth noting that TeX was developed in the same time period that the details of lexical scope were being nailed down by Guy Steele in the Rabbit compiler for Scheme. It's not that TeX is an ad hoc system; it's more the case that people didn't actually know how to implement a better system at the time.
What is Tony famous for? Well, lots of things, including his very important comparison sort algorithm Quicksort, but, in this context, how about the Billion Dollar Mistake? That's a pretty nasty booboo in many programming languages, for which Tony blames himself because it was his idea.
Like your parent said, TeX shipped a long time ago and we have learned a lot since then; it is no surprise that we know how to do better today. In fact, it would be a serious black mark for computer science if we couldn't.
I don't worry too much about HTML output still being WIP. Even if TeX had a massive head start, Typst has a good development speed, and a little bit of slope makes up for a lot of y-intercept.
doesn't appear indifferent or hostile
* Higher-priority work currently being done on arXiv (moving from Perl to Python/cloud)
* No "standard" Typst distro
* Support team needs to be re-trained for a new language
* Persistence: TeX has 30+ years of history; will Typst be around in 30 years? Will current code still compile? Will existing documents still be supported?
That's why people take the math subset of latex and use it in other contexts - exactly like this product.
I can actually just write my own functions when I need to. I don't think I have ever written a LaTeX macro without having to look up a lot of stuff.
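For comparison, defining a small helper in Typst is just an ordinary function definition. Here is a minimal sketch (the `note` helper and its styling are invented for illustration, not from any package):

```typst
// A hypothetical "note" box: a function taking content plus a named
// argument with a default, built from Typst's standard block().
#let note(body, title: "Note") = block(
  fill: luma(240),
  inset: 8pt,
  radius: 4pt,
  [*#title.* #body],
)

#note[Functions are plain values here, with no expansion rules to fight.]
```

There is no `\newcommand`, `\expandafter`, or catcode machinery involved; arguments are evaluated like in any scripting language.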
This seems like the _perfect_ use for an LLM: systematically porting over as much of the "ecosystem" to Typst as possible. Is anyone doing that?
It's like the JSX of LaTeX: markup embedded in a programming language, not a programming language pretending to be markup.
> The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
Most "matches KaTeX" claims I've seen in the wild rely on screenshot eyeballing, which collapses on edge cases like spacing primes, integral subscripts, and matrix delimiters that scale.
One thing I'd be curious about: how are font fallbacks handled when the same Rust core ships to platforms with different system font availability?
KaTeX bundles fonts and assumes they load cleanly; CoreGraphics and Skia bring their own glyph caches and metrics.
Does the display list carry metric snapshots from the host text shaper, or does the core compute layout from a bundled metric file independent of the backend?
The webpage also reads like it was at least heavily LLM-assisted, which makes it a bit hard to trust.
That all said, this is definitely something I'd be interested in using for Zulip, if it is indeed going to be a well-maintained open source project.
(We currently have a Node server component that the Zulip server runs only to render LaTeX.)
[1]: https://keenwrite.com/screenshots.html
Just thought I'd mention since it's related and I really like the project.
Is accessibility anywhere on the roadmap for RaTeX?
On a related note: is MathML more accessible than an AI-generated description of how a human would read the mathematical or chemical formula aloud?
Yes; screen readers typically let you navigate the formula in ways that are more sophisticated than plain text (not to mention the issues with translating to Braille, which I don't claim to understand at all). In fact, alternative text is a poor substitute for structured information about the formula, which is what you get with MathML.
Plus, the MathML + screen reader combo is deterministic and debuggable, as opposed to OCR'ing an image.
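As a concrete illustration (a minimal hand-written fragment, not the output of any particular tool), MathML keeps the expression as a tree a screen reader can walk:

```xml
<!-- (a + b) / 2: a reader can step into the fraction, then the
     numerator, then each term, instead of hearing one flat string. -->
<math xmlns="http://www.w3.org/1998/Math/MathML">
  <mfrac>
    <mrow><mi>a</mi><mo>+</mo><mi>b</mi></mrow>
    <mn>2</mn>
  </mfrac>
</math>
```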
> There is katex-rs[0] that outputs html and mathML. I'd assume the two could work together and the mathML would be fed to whatever the screen reader receives instead of the image?
Maybe! You are parsing the input twice, but it could be a pragmatic solution. I don't know myself how native apps are supposed to expose MathML to screen readers (or if it is even possible without an embedded browser!).
That one jumped out to me too. The phrasing is so wiggly but technically correct it feels intentional. When I saw it I didn't blame it on the LLM, which is worse.
Otherwise it's a super cool-looking project.
After a bit of tinkering and learning the idiosyncrasies of Typst, the joy of having reliable, consistent, beautiful, data-driven resumes and cover letters is immeasurable. It basically removed any barrier to applying, whereas whatever I had before had always felt like a burden.
On top of that, I can add hiring process data directly to the yaml file to run further analysis.
Can LaTeX do this? Most probably, but the learning curve is the difference.
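A sketch of that kind of data-driven setup, assuming Typst's built-in `yaml()` loader (the file name and field names are invented for illustration):

```typst
// applications.yaml might hold entries like:
//   - company: Acme
//     role: Engineer
//     applied: 2024-05-01
#let jobs = yaml("applications.yaml")
#for job in jobs [
  == #job.company
  _#job.role_, applied #job.applied
]
```

Because the data lives outside the document, the same YAML file can feed the resume, the cover letters, and any follow-up analysis script.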
I guess it shows how everyone both loves and hates LaTeX and is always trying to bolt on that one last thing that will finally make it good.
I guess you should mention how much of it is WASM, right?
Hello?
Trolling aside, I find this kind of Rust-powered typesetting modernization promising. I used Typst and liked it. This one could have its own niche too.