Mathematics is hard for mathematicians to understand too (science.org)
72 points | 5 days ago | 10 comments
pathikrit
29 minutes ago
I love math, but the symbology and notation get in my way. Two ideas:

1. Can we reinvent notation and symbology? No superscripts, subscripts, Greek letters, or other weird symbols; just functions with inputs and outputs, verifiable by type systems AND human-readable. (A sketch of what this might look like follows this list.)

2. Also, make the symbology hyperlinked, i.e. if a paper uses a theorem or axiom that isn't proved on the page, hyperlink to its proof, and so on.
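
A hypothetical sketch of idea 1 in Haskell (sumOver is a name I just made up; this is an illustration of the idea, not a proposal):

```haskell
-- Hypothetical sketch: the summation Σ_{k=lo}^{hi} f(k) written as an
-- ordinary, type-checked, human-readable function instead of a symbol.
sumOver :: Int -> Int -> (Int -> Double) -> Double
sumOver lo hi f = sum [f k | k <- [lo .. hi]]

-- Example: Σ_{k=1}^{10} k^2 becomes:
sumOfSquares :: Double
sumOfSquares = sumOver 1 10 (\k -> fromIntegral k ^ 2)  -- 385.0
```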

zwnow
26 minutes ago
I'd love to get rid of all the weird symbols in favor of clear-text functions or whatever. As someone who never learnt all the weird symbols, it's really preventing me from getting back into math... It's just not intuitive.

MrDrDr
1 hour ago
I think this would be extremely valuable: “We need to focus far more energy on understanding and explaining the basic mental infrastructure of mathematics—with consequently less energy on the most recent results.” I’ve long thought that more of us could devote time to serious maths problems if they were written in a language we all understood.

A little off topic perhaps, but out of curiosity - how many of us here have an interest in recreational mathematics? [https://en.wikipedia.org/wiki/Recreational_mathematics]

segfaultex
1 hour ago
Yeah, I don't want to be uncharitable, but I've noticed that a lot of STEM fields make heavy use of esoteric language and syntax, and I suspect they do so as a means of gatekeeping.

I understand that some degree of formalism is required to enable the sharing of knowledge amongst people across a variety of languages, but sometimes I'll read a white paper and think "wow, this could be written a LOT more simply".

Statistics is a major culprit of this.

locknitpicker
42 minutes ago
> Yeah, I don't want to be uncharitable, but I've noticed that a lot of STEM fields make heavy use of esoteric language and syntax, and I suspect they do so as a means of gatekeeping.

I think you're confusing "I don't understand this" with "the man is keeping me down".

All fields develop specialized language and syntax because (a) they handle specialized topics, and specialized words communicate those concepts concisely and clearly, and (b) syntax is problem-specific for the same reason.

See, for example, tensor notation, or how some cultures have many specialized terms for things like snow in order to communicate nuance.

> "wow, this could be written a LOT more simply"

That's fine. A big part of research is digesting findings. I mean, we still see things like novel proofs of the Pythagorean theorem. If you can express things more clearly, why aren't you?

zozbot234
18 minutes ago
Statistics is a weird special case where major subfields of applied statistics (including machine learning, but not only) sometimes retain wildly divergent terminology for the exact same concepts, for no good reason at all except the vagaries of historical development (e.g. the same inputs are variously called covariates, predictors, independent variables, or features).

gjulianm
58 minutes ago
> I suspect they do so as a means of gatekeeping

I'm surprised you could reach this conclusion. Formalisms, esoteric language, and syntax are hard for everyone. Why would people invest in them if their only usefulness was gatekeeping? Especially when it's the same people who publish their articles in the open for everyone to read.

A more reasonable interpretation is that those fields use the things you don't like because they're actually useful to them and to their main audience, and that if you want to actually understand the concepts they talk about, that syntax will end up being useful to you too. A lack of syntax would not make things easier to understand, just less precise.

aleph_minus_one
1 hour ago
> I understand that some degree of formalism is required to enable the sharing of knowledge amongst people across a variety of languages, but sometimes I'll read a white paper and think "wow, this could be written a LOT more simply".

OK, challenge accepted: find a way to write one of the following papers much more simply:

Fabian Hebestreit, Peter Scholze; A note on higher almost ring theory

https://arxiv.org/abs/2409.01940

Peter Scholze; Berkovich Motives

https://arxiv.org/abs/2412.03382

---

What I want to tell you with these examples (these are, of course, papers far above my mathematical level) is this: what you read in math papers is often insanely complicated, and simplifying even one such paper is often a huge academic achievement.

beng-nl
40 minutes ago
My opinion on this is that in mathematics the material can be presented in a very dry and formal way, often in service of rigor, and that this is unnecessarily unwelcoming.

But I don’t believe it to be used as gatekeeping at all. At worst, it’s hazing (“it was difficult for me as a newcomer, so it should be difficult for newcomers after me”) or intellectual status-seeking (“look at this textbook I wrote that takes great intellectual effort to penetrate”), neither of which should be lauded in modern times.

I’m not much of a mathematician, but I’ve read some new and old textbooks, and I get the impression there is a trend towards presenting the material in a more welcoming way, not necessarily to the detriment of rigor.

zozbot234
24 minutes ago
The upside of a "dry and formal" presentation is that it removes any ambiguity about what exactly you're discussing, and how a given argument is supposed to flow. Some steps may be skipped, but at least the overall structure will be clear enough. None of that is guaranteed when dealing with an "intuitive" presentation, especially when people tend to differ about what the "right" intuition of something ought to be. That can be even more frustrating, precisely when there's insufficient "dry and formal" rigor to pin everything down.

TimPC
29 minutes ago
If it's actually in the service of rigor, then it's not unnecessarily unwelcoming. If it's only nominally in the service of rigor, then maybe, but mathematics absolutely needs extreme rigor.

bncndn0956
1 hour ago
3blue1brown proves your point.

The saying, "What one fool can do, another can," is a motto from Silvanus P. Thompson's book Calculus Made Easy. It suggests that a task someone without great intelligence can accomplish must be relatively simple, implying that anyone can learn to do it if they put in the effort. The phrase is often used to encourage someone, demystify a complex subject, and downplay the difficulty of a task.

jules
24 minutes ago
3blue1brown actually shows the usefulness of formalism. The videos are great, but by avoiding formalism they are, at least for me, harder to understand than traditional sources. It is true that you need to get over the hump of understanding the formalism first, but that formalism is a very useful tool of thought. Consider algebraic notation with plus and times and so on: it makes things way easier to understand than writing out equations in words, as mathematicians used to do (compare "the cube of the unknown plus three times the unknown equals ten" with x^3 + 3x = 10). It is the same for more advanced formalisms.

gjulianm
1 hour ago
3blue1brown, while they create great content, do not go as deep into the mathematics; they avoid some of the harder-to-understand complexities and abstractions. Don't get me wrong, it's not a criticism of their content; it's just a different thing from what you'd study in a mathematics class.

Also, videos are great at making people think they understand something when they actually don't.

fragmede
8 minutes ago
In this modern era of easily accessible knowledge, how gatekeep-y is it, though? It's inscrutable at first glance, but ChatGPT is more than happy to explain what the hell ℵ₀, ℵ₁, ♯, ♭, or Σ mean, and you can ask it to read the arXiv PDF and have it explain it to you.

MangoToupe
50 minutes ago
> I suspect they do so as a means of gatekeeping.

What, as opposed to using ambiguous language and getting absolutely nothing done?

bell-cot
1 hour ago
Gatekeeping, or self-promotion? You don't get investors/patents/promotions/tenure by making your knowledge or results sound simple and understandable.

master-lincoln
1 hour ago
Is that really the case or are you just assuming so? Seems counter-intuitive to me.

segfaultex
1 hour ago
Why not both? And that's a good point: there are a LOT of incentives to make things arbitrarily complex in a variety of fields.

dr_dshiv
1 hour ago
See Bret Victor’s Kill Math: https://worrydream.com/KillMath/

He separates conceptual understanding from notational understanding, pointing out that the interface we use for math has a major impact on utility and understanding. For instance, Roman numerals inhibit understanding and use of multiplication: try computing XXVII × XIV without first translating it to 27 × 14 = 378.

Better notational systems can be designed, he claims.

Someone
1 hour ago
> I’ve long thought that more of us could devote time to serious maths problems if they were written in a language we all understood.

That assumes it’s the language that makes it hard to understand serious math problems. That’s partially true (and the reason why mathematicians keep inventing new language), but IMO the complexity of truly understanding large parts of mathematics is intrinsic, not dependent on terminology.

Yes, you could restate “a monad is just a monoid in the category of endofunctors” in terms that more people know, but it would take many pages, and that would make it hard to understand, too.

zerofor_conduct
6 minutes ago
"The unknown thing to be known appeared to me as some stretch of earth or hard marl, resisting penetration... the sea advances insensibly in silence, nothing seems to happen, nothing moves, the water is so far off you hardly hear it... yet finally it surrounds the resistant substance."

A. Grothendieck

Understanding mathematical ideas often requires simply getting used to them.

borracciaBlu
2 hours ago
I was writing a small article about [Set, Set Builder Notation, and Set Comprehension](https://adropincalm.com/blog/set-set-builder-natatio-set-com...) and, while investigating, it surprised me how many different ways there are to describe the same thing. E.g., see all the notations for a set or a tuple.

One last rant: you don't have "the manual" of math in the way you can go to your programming language's man page, so there is no single source of truth.

Everybody assumes...

BlackFingolfin
2 hours ago
I find it strange to compare "math" with one programming language. Mathematics is a huge and diverse field, with many subcommunities and hence also differing notation.

Your rant, with the sides reversed, would be akin to: "It's surprising how many different ways there are to describe the same thing. E.g., see all the notations for dictionaries (hash tables? associative arrays? maps?) or lists (vectors? arrays?).

You don't have "the manual" of programming languages."

segfaultex
1 hour ago
Not the original commenter, but I 100% agree that it's weird we have so many ways to describe dictionaries/hash tables/maps/etc. and lists.

worthless-trash
1 hour ago
> You don't have "the manual" of programming languages.

Well, we kinda do: once you can say "this Python program", you know which manual to look at. The problem with a lot of math is that you can't even tell which manual to look up.

nkrisc
51 minutes ago
Someone not educated in programming would not know that a given text is Python source code.

mzl
1 hour ago
I wrote about overlapping intervals a while ago and used what I thought was the standard math notation for closed and half-open intervals. From the comments, I learned that half-open intervals are written differently in French mathematics (e.g. [0, 1[ rather than [0, 1)): https://lobste.rs/s/cireck/how_check_for_overlapping_interva...
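
(For reference, the overlap test itself is the usual one; a tiny Haskell sketch, with names of my own choosing:)

```haskell
-- Two half-open intervals [a1, b1) and [a2, b2) overlap exactly when
-- each interval starts before the other one ends.
overlaps :: Ord a => (a, a) -> (a, a) -> Bool
overlaps (a1, b1) (a2, b2) = a1 < b2 && a2 < b1

-- overlaps (1, 4) (3, 5) == True; overlaps (1, 3) (3, 5) == False
```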

voidhorse
3 minutes ago
As someone who has always struggled with mathematics at the calculational level, but who really enjoys theorems and proofs (abstract mathematics), here are some things that help me.

1. Study predicate logic, then study it again, and again, and again. The better and more ingrained predicate logic becomes in your brain, the easier mathematics becomes.

2. Once you become comfortable with predicate logic, look into set theory and model theory and understand both of these well. Understand the precise definition of "theory" with respect to model theory. If you do this, you'll have learned the rules that unify nearly all of mathematics, and you'll also understand how to "plug" models into theories to try to understand them better.

3. Close reading. If you've ever played Magic: The Gathering, mathematics is the same thing: words are defined and used exactly as they are in games. You need to suspend any temptation to read in meanings that aren't there. You need to read slowly. I've often come upon a key insight about a particular object, and an accurate understanding of it, only after rereading a passage like 50 times. If the author didn't make a certain statement, they didn't make that statement; even if something seems "obvious", you need to follow the logical chain of reasoning to make sure.

4. Translate into natural English. A lot of math books have whole sections of proofs and/or exercises with little to no natural-language "explainer" of the symbolic statements. One thing that helps me tremendously is to frame any proof or theorem in terms of the linguistic names for the various definitions, and to summarize a body of proofs into helpful statements. For example: "groups are all about inverses and how they allow us to 'reverse' compositions of (associative) operations; this is the essence of 'solvability'." This summary statement about groups sets up a framing for me whenever I go to read a proof involving groups. The framing helps tremendously because it can serve as a foil too: if some surprising theorem contravenes the summary ("oh, maybe groups aren't just about inversions"), that allows for an intellectual development and expansion that I find more intuitive. I sometimes think of myself as a scientist examining a world of abstract creatures: the various models (individuals) of a particular theory (species).

5. Contextualize. Nearly all of mathematics grew out of certain lines of investigation, and often out of concrete technical needs. Understanding this history is a surprisingly effective way to make many initially mysterious aspects of a theory more obvious, more concrete, and more related to other bits of knowledge about the world, which really helps bolster understanding.

ikyr9999
2 hours ago
Just the other day I was listening to EconTalk on this: https://www.econtalk.org/a-mind-blowing-way-of-looking-at-ma...

MrDrDr
1 hour ago
Thank you for posting! I was not aware of this.

johngossman
2 hours ago
Mathematics is such an old field, older than anything except arguably philosophy, that it's too broad and deep for anyone to really understand everything. Even in graduate school I often took classes in things discovered by Gauss or Euler centuries before. A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old. So you end up having to spend years specializing, and then struggle to find others with the same background.

All of which is compounded by the desire to provide minimal "proofs from the book" and leave out the intuitions behind them.

ekjhgkejhgk
2 hours ago
> A lot of the mathematical topics the HN crowd seems to like--things like the Collatz conjecture or Busy Beavers--are 60, 80 years old.

Do you know the reason for that? The reason is that those problems are open and easy to understand. For the rest of open problems, you need an expert to even understand the problem statement.

adornKey
46 minutes ago
The desire to hide all traces of where a proof comes from is a real problem; having more context would often be very helpful. I think some modern authors and teachers are getting good at giving more context. But mostly you have to be thankful that the people from the minimalist era (Bourbaki, ...) at least gave precise, consistent definitions for basic terminology.

Mathematics is old, but a lot of basic terminology is surprisingly young. Nowadays everyone agrees on what an abelian group is. But if you look into old books from around 1900, you can find authors who used the word "abelian" for something completely different (e.g. orthogonal groups).

Reading a book that uses "abelian" to mean "orthogonal" is confusing, at least until you finally understand what is going on.

otoburb
37 minutes ago
>>[...] at least gave precise, consistent definitions for basic terminology.

Hopefully interactive proof assistants like Lean or Rocq will help to mitigate at least this issue for anybody trying to learn a new (sub)field of mathematics.
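
(To illustrate, here is a toy Lean 4 sketch of a machine-checked definition. It is deliberately simplified; Mathlib's real CommGroup is richer, so treat this only as a sketch of how a proof assistant pins terminology down:)

```lean
-- A toy, self-contained notion of a commutative ("abelian") group.
-- Every axiom is spelled out explicitly; the checker accepts no ambiguity.
structure ToyCommGroup (α : Type) where
  mul       : α → α → α
  one       : α
  inv       : α → α
  mul_assoc : ∀ a b c, mul (mul a b) c = mul a (mul b c)
  one_mul   : ∀ a, mul one a = a
  inv_mul   : ∀ a, mul (inv a) a = one
  mul_comm  : ∀ a b, mul a b = mul b a
```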

Davidzheng
2 hours ago
Actually, a lot of minimal proofs expose more intuition than the older proofs people find at first. Counterintuitively, I find that reading the first proofs of results is usually not extremely enlightening.

bell-cot
1 hour ago
I'll argue for astronomy being the oldest. Even minimal astronomical knowledge would help pre-humans navigate and keep track of the seasons. Birds are known to navigate by the stars.

nkrisc
55 minutes ago
I would argue that some form of mathematics is necessary for astronomy, with "astronomy" defined as anything more than simply recognizing and following stars.

scotty79
2 hours ago
> Mathematics is such an old field, older than anything except arguably philosophy

If we are already venturing outside the scientific realm with philosophy, I'm sure the fields of literature or politics are older. Especially since philosophy is just a subset of literature.

saithound
2 hours ago
> I'm sure fields of literature or politics are older.

As far as anybody can tell, mathematics is way older than literature.

The oldest known accounting tokens are from around 7000 BCE, and they show a proper understanding of addition and multiplication.

The people who made the Ishango bone 25k years ago were probably aware of at least rudimentary addition.

The earliest writings are from the 3000s BCE, and are purely administrative. Literature, by definition, appeared later than writing.

thaumasiotes
1 hour ago
> As far as anybody can tell, mathematics is way older than literature.

That depends what you mean by "literature". If you want it to be written down, then it's very recent because writing is very recent.

But it would be normal to consider cultural products to be literature regardless of whether they're written down. Writing is a medium of transmission. You wouldn't study the Epic of Gilgamesh because it's written down. You study it to see what the Sumerians thought about the topics it covers, or to see which god some iconography you found represents, or... anything it might plausibly tell you. But the fact that it was written down is only the reason you can study it, not the reason you want to.

mkl
1 hour ago
> That depends what you mean by "literature". If you want it to be written down

That is what literature means: https://en.wiktionary.org/wiki/literature#Noun

pfortuny
1 hour ago
Well, then poetry is not literature.

nkrisc
53 minutes ago
If it’s not written down, then that’s true.

Once someone writes it down, it is.

threatofrain
56 minutes ago
Sure, in the context you mean, it's an oral tradition.

thaumasiotes
1 hour ago
No, the argument is even dumber than that. The person who writes a poem hasn't created any literature.

The person who hears that poem in circulation and records it in his notes has created literature; an anthology is literature but an original work isn't.

geomark
2 hours ago
I thought we were well past trying to understand mathematics. After all, John von Neumann long ago said "In mathematics we don't understand things. We just get used to them."

ekidd
2 hours ago
Many ideas in math are extremely simple at heart: some very precise definitions, maybe a clever theorem. The hard part is often: Why is this result important? How does this result generalize things I already knew? What are some concrete examples of this idea? Why are the definitions the way they are, and not something slightly different?

To use an example from functional programming, I could say:

- "A monad is basically a generalization of a parameterized container type that supports flatMap and newFromSingleValue."

- "A monad is a generalized list comprehension."

- Or, famously, "A monad is just a monoid in the category of endofunctors, what's the problem?"
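
For the first description, here is a minimal Haskell sketch with Maybe as the container (halve and quarter are toy names of my own): return plays the role of newFromSingleValue, and >>= plays the role of flatMap.

```haskell
-- halve is a "container-producing" step: it may fail, so it returns Maybe.
halve :: Int -> Maybe Int
halve n = if even n then Just (n `div` 2) else Nothing

-- flatMap (>>=) chains the steps, threading the "may fail" context through:
-- quarter 12 == Just 3, quarter 6 == Nothing.
quarter :: Int -> Maybe Int
quarter n = return n >>= halve >>= halve
```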

The basic idea, once you get it, is trivial. But the context, the familiarity, the basic examples, and the relationships to other ideas take a while to sink in. And once they do, you ask "That's it?"

So the process of understanding monads usually isn't some sudden flash of insight, because there's barely anything there. It's more a situation where you work with the idea long enough and you see it in a few contexts, and all the connections become familiar.

(I have a long-term project to understand one of the basic things in category theory, "adjoint functors." I can read the definition just fine. But I need to find more examples that relate to things I already care about, and I need to learn why that particular abstraction is a particularly useful one. Someday, I presume I'll look at it and think, "Oh, yeah. That thing. It's why interesting things X, Y and Z are all the same thing under the hood." Everything else in category theory has been useful up until this point, so maybe this will be useful, too?)

agumonkey
1 hour ago
It's probably a neurological artefact. When the brain has spent enough time looking at a pattern, it can suddenly become obvious. You can go from blind to enlightened without the usual conscious logical effort. It's very odd.

ekjhgkejhgk
2 hours ago
Just because someone said it doesn't mean we all agree with it, fortunately.

You know the meme with the normal distribution where the far right and the far left reach the same conclusion for different reasons, and the ones in the middle have a completely different opinion?

So on the far right you have people like von Neumann, who say "In mathematics we don't understand things". On the far left you have people like you, who say "me no mats". Then in the middle you have people like me, who say "maths is interesting, let me do something I enjoy".

geomark
2 hours ago
Of course. I just find it hilarious that someone like von Neumann would say that.

ekjhgkejhgk
2 hours ago
von Neumann liked saying things that he knew would have an effect, like "so deep" and "he's so smart". Like when asked how he knew the answer, claiming that he did the sum in his head, when undoubtedly he knew the closed-form expression.

srean
2 hours ago
I have a tingling suspicion that you might have missed the joke.

To date I have not met anyone who thought he summed the terms of the geometric series one by one. That would take infinite time. Of course he used the closed-form expression for the sum of a geometric series.

The joke is that he missed the clever solution, which does not require setting up the series, recognising it as a geometric progression, and then using the closed form.

The clever solution just finds the time needed for the trains to collide, then multiplies that by the bird's speed. No series needed.
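
(In one common telling of the puzzle: the trains start 20 miles apart, each moving at 10 mph, and the bird flies at 15 mph. The trains collide after 20 / (10 + 10) = 1 hour, so the bird covers 15 × 1 = 15 miles.)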

ekjhgkejhgk
1 hour ago
Ah. I was going by memory, and I had those two as separate stories. I didn't remember that he said "I did the sum" on the trains problem.

Davidzheng
2 hours ago
sorry but that is a dumb quote.

nyeah
47 minutes ago
Yeah, I wonder how exactly he meant that. I doubt that von Neumann believed in random plug-and-chug, which is what I'd probably mean if I said I had given up on understanding something. Possibly von Neumann was being very careful and cautious about what "understanding" means.

For example, there's a story that von Neumann told Shannon to call his information metric "entropy", telling Shannon that "nobody really understands entropy anyway." But if you've engaged with Shannon to the point of telling him that his quantity seems to be the entropy, you really do understand something about entropy.

So maybe von Neumann's worry was about really understanding math concepts fully and extremely clearly, going way beyond the point where I'd say "oh, I get it!"

isolli
1 hour ago
I recently came to realize the same thing about physics. Even physicists find it hard to develop an intuitive mental picture of how space-time folds or what a photon is.

fithisux
20 minutes ago
Mathematics is hard when not much time is invested in processing the core idea.

For example, the Dvoretzky-Rogers theorem is hard to understand in isolation.

As more applications of it appear, more generalizations of it appear, and more alternative proofs of it appear, it gets clearer. So it takes time for something to become digestible, but the effort spent gives the real insights.

Last but not least, there is the presentation of the theorem. Some authors are cryptic; others refactor the proof into discrete steps or find similarities with other proofs.

Yes, it is hard, but part of the mathematician's job is to make it easier for others.

Exactly as in code: there is a lower bound on hardness, but that is no excuse to keep things harder than necessary.
