> mathematics only exists in a living community of mathematicians that spreads understanding and breathes life into ideas both old and new. The real satisfaction from mathematics is in learning from others and sharing with others. All of us have clear understanding of a few things and murky concepts of many more. There is no way to run out of ideas in need of clarification.
Yes! And this applies to all human culture, not just math. Everything people have figured out needs to be in living form to be carried on. The more people the better. If math, or any product of human skill, is only recorded in papers or videos, that isn't the same as having millions of people understanding it in their own ways.
Modern culture often emphasizes innovation and fails to value mere maintenance, tradition, and upkeep. This can lead to people like the OP feeling that they have nothing to contribute, when actually, just learning math, being able to do it, being able to help others learn it - all of these are contributions.
We are all needed to keep civilization afloat, in ways we cannot anticipate. We all need to pursue some kind of excellence just to keep human culture alive.
This is why I think Brady Haran is one of the coolest living mathematicians. Numberphile is educating a new generation of young mathematicians for anyone with access to youtube. Accessible math communication is so important. So many cool things are buried in textbooks and papers the average person would never read.
I simply don't care to gatekeep what counts as education. It has taught me things from videos I can still recall a decade later and pushed me to explore different areas of math I wouldn't have done otherwise.
If topics are presented engagingly these efforts will no doubt inspire the next generation.
Young people are curious, sometimes all they need is a spark and to be introduced to a new topic in an engaging way. These forms of content deliver that spark.
Thank you for highlighting that answer. It is one of my favourite pieces of writing about the culture of mathematics. I just want to add that that particular answer is now affectionately known as Thurston's Paean.
> Everything people have figured out needs to be in living form to be carried on.
It would appear that LLMs are invalidating this claim. Things can live in synthetic form and carry on just fine. Instead of cultivating a population of learned minds we are just feeding a few dozen egregores of models and training corpuses.
They are not invalidating this claim, and cannot, unless we'd actually try it out for a few generations. Which we shouldn't and won't.
LLMs are quite good at simulating life and living intelligence (in the short term), but they aren't any of that. That's why we call it artificial intelligence. It's true that we can't put our finger on what exactly the difference is, but it's not like reality has ever felt encumbered by our limited understanding.
Living culture is a concept that I think is quite unintuitive to modern minds. Examples of it are all around us... but it's usually blatantly missing from our "big picture" thinking.
For example. Take a modern country with a modern economy. Flatten it. Destroy all the factories. Bankrupt all the companies. You can get back to a fully modern economy again quite quickly. WWII demonstrates it.
Taking an unindustrialized country through the development process... that's very tricky. It can't really be rushed.
For a long time, economic development was seen as mostly capital and technology. You need time to develop all the capital needed. Roads, factories, etc. But... development efforts underperformed. Then the idea of "human capital" got popular as a way of explaining the deficit. Education, mostly. Development efforts still underperformed.
I think the "living community" thing is the answer to this. It's ecology. You can't make a rainforest by just dumping all the necessary organisms into the right climate. It's the endlessly complicated relationships between all those organisms that make the rainforest.
This is one of the things that worries me about the pace of modern change. When writing and literacy resurged in classical antiquity... we totally lost all the ways of (for example) doing scholarship orally. Socrates (through Plato) wrote about some of the downsides to this.
...and we did completely lose oral scholarship. We have no idea how to do it. Once the living culture died... it stayed dead. All the knowledge contained within it went away.
In theory, sure. In practice, our society is a) not set up to value things which don’t have an immediate financial ROI, b) is valuing them less as time progresses, not more, and c) is experiencing some very serious transitions that may destroy the financial viability of devoting a lot of your time and energy to some very important things.
So I've got a gut feeling that math (like human languages (like programming languages)) is best learned in service of some greater end.
I look at some truly impressive projects like CLASP, which sprang into existence not because of someone noodling around, but because the team had a bigger goal that required building it.
So my advice to any mathematician who feels lost, like they don't know what to work on, would be to go collaborate with someone who has an actual goal, to look for inspiration in the kinds of math they need.
Today, there are a lot of opportunities to jump forward that only get capitalized on through coincidence (e.g. two people bump into each other at a conference, or a researcher happens to have a colleague working on a related problem through the lens of a different discipline). If AI does nothing but guarantee that everyone will have such a coincidence, by serving as that expert from a different discipline, that will still be a massive driving force for progress.
The question of "what's a mathematician to do" is still clear: you need to find and curate and clearly express interesting and valuable problems.
Far from being motivated by applications, the most useful ideas in mathematics are usually discovered "for their own sake", and their applications are only found later. Sometimes centuries later!
If so that seems like an opportunity for people who want to work on applied math? There’s a big backlog of techniques that so far have not been useful.
Parent reads as a comment on the usefulness of applying mathematics to problems in the world (applied mathematics) and discovering mathematical problems that push mathematics forward (pure mathematics) in the process. Pure mathematics is incredibly important, but I’d hardly count it as useful if we need to wait centuries.
Lots of fun counterexamples to this. Complex numbers were introduced in the 1500s with no practical application for over 300 years, until they were used in electromagnetism and quantum mechanics.
> One can rewrite their books in modern language and notation or guide others to learn it too but I never believed this was the significant part of a mathematician's work
There's yer problem right there. Good pedagogy is hard and highly undervalued. IMHO Grant Sanderson (a.k.a. 3blue1brown) is making some of the most significant contributions to math in all of human history by making very complex topics accessible to ordinary mortals. In so doing he addresses one of the most significant problems facing humankind: the growing gap between the technologically savvy and everyone else. That gap is the underlying cause of some very serious problems.
Big fan of him - but I also want to throw out the most obvious name in this space: Sal Khan
Hard to imagine now, but back when he started out, there were really no (or very few!) accessible math tutoring vids on the video platforms. Most of the time you had some universities, like MIT, putting out long-form vids of lectures - but actually having easily digestible 5 min vids like those Khan put out just wasn't a thing.
I like to watch 3blue1brown too, but I think it's a bit of an exaggeration to say his topics are accessible to normal folks. From my perspective it's more realistic to say he makes videos that show you the beauty in math without your really having to understand it. Which is valuable, since most people get turned off math because of the tiresome drills and tests hammered into them at school by people with zero interest in it.
Quite true. Real math needs practice and calculation to build the intuition and motivation for the abstraction we want to construct. The videos are more of a complement to the boring lectures (my prof uses 3b1b videos sometimes).
Indeed, pedagogy is important to staving off the end of mathematics.
That sounds dramatic, but it’s really obvious if you think about it. Right now, a person has to study for about 20 years (on average) to make novel contributions in mathematics. They have to learn what’s come before, the techniques, the results, etc. If mathematics continues, eventually it could take 25 years, or 30 years, or even a whole lifetime. At some point, most people will not be able to understand the work that’s been done in any subfield (or the work required to understand a subfield) in a human’s life. I claim this is the logical end of mathematics, at least as a human endeavor.
Now, there will be some results which refine other work and simplify results, but being able to teach a rapidly growing body of literature efficiently will be important to stave off the end of mathematics.
There's a Scott Alexander story that plays with this exact topic: Ars Longa, Vita Brevis [1]
To your point, I think you're right. I'm not in mathematics, but the value of good pedagogy in shrinking the time it takes to get people to the forefront of any field feels heavily overlooked.
Good pedagogy is a problem even for graduate-level mathematics students and professional mathematicians. The proofs in many graduate-level mathematics textbooks are, in my humble opinion, not really proofs at all. They are closer to high-level outlines of proofs. The authors simply do not show their work. The student then has to put in an extraordinary amount of effort to understand and justify each line. Sometimes a 10-line argument in a textbook might expand into a 10-page proof if the student really wants to convince themselves that the argument works.
I am not a mathematician, but out of personal interest, I have worked with professional mathematicians in the past to help refine notes that explain certain intermediate steps in textbooks (for example, Galois Theory, by Stewart, in a specific case). I was surprised to find that it was not just me who found the intermediate steps of certain proofs obscure. Even professional mathematicians who had studied the subject for much of their lives found them obscure. It took us two days of working together to untangle a complicated argument and present it in a way that satisfied three properties: (1) correctness, (2) completeness, and (3) accessibility to a reasonably motivated student.
And I don't mean that the books merely omit basic results from elementary topics like group theory or field theory, which students typically learn in their undergraduate courses. Even if we take all the elementary results from undergraduate courses for granted, the proofs presented in graduate-level textbooks are often nowhere near a complete explanation of why the arguments work. They are high-level outlines at best. I find this hugely problematic, especially because students often learn a topic under difficult deadlines. If the exposition does not include sufficient detail, some students might never learn exactly why the proof works, because not everyone has the time to work out a 10-page proof for every 10 lines in the book.
Many good universities provide accompanying notes that expand the difficult arguments by giving rigorous proofs and adding commentary to aid intuition. I think that is a great practice. I have studied several graduate-level textbooks in the last few years and while these textbooks are a boon to the world, because textbooks that expose the subject are better than no textbooks at all, I am also disappointed by how inaccessible such material often is. If I had unlimited time, I would write accompaniments to those textbooks that provide a detailed exposition of all the arguments. But of course, I don't have unlimited time. Even so, I am thinking of at least making a start by writing accompaniment notes for some topics whose exposition quality I feel strongly about, such as s-arc transitivity of graphs, field extensions and so on.
These days it's easy to look up the details of many proofs in mathlib. Of course, a computer-checked proof is not always super intuitive for a human, but most of the time it works quite well.
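To make the contrast concrete, here is a toy illustration (a sketch, not anything at the research level): a statement that textbooks dispatch in one line, written out in Lean 4 with Mathlib so that every intermediate step is explicit and machine-checked.

```lean
import Mathlib

-- "The composite of surjections is surjective" - a textbook one-liner,
-- here with every step spelled out and verified by the proof checker.
example {α β γ : Type*} {f : α → β} {g : β → γ}
    (hf : Function.Surjective f) (hg : Function.Surjective g) :
    Function.Surjective (g ∘ f) := by
  intro c
  obtain ⟨b, hb⟩ := hg c   -- surjectivity of g gives b with g b = c
  obtain ⟨a, ha⟩ := hf b   -- surjectivity of f gives a with f a = b
  exact ⟨a, by rw [Function.comp_apply, ha, hb]⟩
```

The point of the parent comment is exactly this trade-off: the formal version leaves no gaps a student must reconstruct, even if reading proof terms is less pleasant than reading well-written prose.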
I think it is like a programmer asking "How can one contribute to computer science?" while thinking about people like Dijkstra, Knuth, or maybe even Carmack.
There are some geniuses who do groundbreaking work, but it wouldn't be of much use if it weren't for the millions of people who do actual work with these theories (applied math), and the teachers who train the next generation. In academia, small discoveries exist too; these can be the stepping stones for the big things to come, even if they don't have a direct application now.
>I think it is like a programmer asking "How can one contribute to computer science?" while thinking about people like Dijkstra, Knuth, or maybe even Carmack.
I had a conversation a day ago with a couple of high-school students who were obviously smart (and a bit on the spectrum), but also lacking in broad knowledge, as one would expect from high-school students.
I think it revealed something about that sense of 'magic' or inaccessible talent. One of them mentioned the fast inverse square root function and marvelled at how anyone could even come up with an idea like that - it seemed to him some transcendent feat to be able to realise that casting binary floats to ints would be useful. They had looked at the function and couldn't fathom how something like that worked.
We had no computer to hand, but I asked him what 10^100 divided by 10^10 was. Smart kid that he was, answered instantly, correctly. I then asked him what operation he performed in his head to do that, and noted floating point exponents are just using 2 instead of 10. On the spot he figured out how the fast inverse square root worked. The magic vanished and it became accessible.
Some people lament the loss of 'magic' in this way, but I think the thing that makes it special, in this instance and in the universe in general, is that it still works, and it isn't magic. It's real, and the fact that it can be that without invoking some inaccessible property makes it even more special.
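For anyone who hasn't seen it, the function being discussed is only a few lines. Here is a sketch in C: the magic constant is the one from the published Quake III source, and `memcpy` stands in for the original pointer cast so the bit reinterpretation is well-defined.

```c
#include <stdint.h>
#include <string.h>

/* Fast inverse square root (Quake III style). Reading the float's bits
 * as an integer makes the exponent field directly manipulable: shifting
 * right by one roughly halves the exponent (square root), and subtracting
 * from the magic constant negates it (reciprocal) while correcting the
 * exponent bias. One Newton-Raphson step then refines the estimate. */
static float fast_rsqrt(float x) {
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, sizeof i);      /* read the float's bit pattern */
    i = 0x5f3759df - (i >> 1);     /* the "what the" line: exponent trick */
    float y;
    memcpy(&y, &i, sizeof y);      /* write the bits back as a float */
    y = y * (1.5f - half * y * y); /* one Newton-Raphson iteration */
    return y;
}
```

This is exactly the exponent arithmetic from the 10^100 / 10^10 question: the bit pattern of a float stores its exponent, so integer operations on the pattern become crude arithmetic on logarithms.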
I do think this has been less pressing of a question for programmers. For the longest time there was infinite work to do, no matter your depth—implementing business logic, making frameworks more general, making sure things fit into cache lines—that was accessible to us non-geniuses. Maybe in the LLM era this sort of t-crossing and i-dotting will go away.
Do the math because you enjoy doing the math and if you do it long enough you may well do something of value to someone else. Same goes for most intellectual and artistic pursuits I think.
I’ve learned for myself that as soon as enjoyment is based on some future achievement or ranking my work against others the day to day satisfaction dries up.
By having a job. If that job is the same as your intellectual/artistic pursuit then you have to balance the needs of satisfying your employer and what keeps your motivation going over the long term. All I’m saying is that worrying too much about future achievement or “great contributions” is a recipe for burnout and disappointment.
To add onto these answers, there were some notable "outsider art" additions to math around this time.
A few months before this post, Futurama writer Ken Keeler contributed a new proof to the mathematical canon (for "The Prisoner of Benda"), resolving the conflict of the episode.
Almost a year after posting this, a 4chan user solved a previously-unsolved superpermutation (combinatorics) problem in a discussion about anime.
I think everyone who has thought about math seriously has felt similarly to the OP. It was impressed upon me early on that there are combinatorically (hah) many combinatorics problems to be solved and that these were just a few.
After reading another post about the most recent advances LLMs have made in finding and writing up novel, correct proofs, it sounds like the frontier models are now at the point of PhD student level. I wonder how a math student could contribute today, if they're just starting on the PhD track? Maybe by using LLMs as a mighty tool and providing skilled usage and oversight?
It must feel similar to those who wanted to become chess or go masters after computers surpassed humanity in those games.
> After reading another post about the most recent advances LLMs have made in finding and writing up novel, correct proofs, it sounds like the frontier models are now at the point of PhD student level.
This is somewhat misleading: the LLMs' contributions are in a limited niche of highly technical problem solving. They're neat, but this isn't the first time a mathematical theorem was proved automatically by a computer; that was already done in the 1990s.
> Maybe by using LLMs as a mighty tool and providing skilled usage and oversight?
Yes, even in the areas where LLMs are at their best, we'll still need a lot of human effort to make the results cleanly understandable. LLMs cannot do this well, even their generated papers have to be rewritten by human experts to surface the important bits.
> done already in the 1990s
by human-written programs that iterated through the finite casework that human thought had reduced the theorems to (the four-colour theorem, the Kepler conjecture, etc.), which recent developments (e.g. LLMs autonomously resolving Erdős problems) seem meaningfully distinct from.
> human effort to make the results cleanly understandable
well, perhaps loops of "derive proof through reasoning in English, formalise in Lean, use AST size of formal proof as a metric to optimise (via an LLM-guided search), translate back into English" could improve this? a lot of resources are being spent to make frontier LLMs more resistant to hallucinations via Lean, perhaps cogency will increase as a byproduct.
They can't predict the consequences of an action by predicting one token after another. They can't solve a Rubik's Cube, unlike a 7-year-old human who can learn to do it in a weekend. They can't imagine the perspective of being a human being, unlike a 7-year-old asked to imagine they were in the position of another human.
I wonder if AI is one means to overcome the natural limits of human knowledge aggregation [0].
On the other hand, in the very long run, what does it mean if a talented human being does not have enough years of life to fully analyze and understand an extremely advanced proof created by AI?
Yes, but you (as a human) can still understand the cathedral (the building). This is not guaranteed for advanced AI work in mathematics in the future. If so, are we/they really still adding to human knowledge at this stage?
If your motivation is being recognized as the best of the best, winning the competition, then yes, it’s probably a bleak world. But if your motivation is improving your own capabilities, with the metric being whether you’re better now than you were last month, then it’s not a bleak world; there are many more tools available to help you learn and improve now than there were in the past.
The MathOverflow question was asked 15 years ago. The top answer says that the human community part is very important and that spreading knowledge and critical thinking is valuable.
The most recent advances are stunts by a handful of famous prompters who are funded in various ways by the LLM industrial complex.
How many theorems are proven by mathematicians each year? Let's guess 10000. Then the Erdős toy proofs, with unknown token and resource usage, are less than 1%.
a) Individualized teaching methodology. We come with different backgrounds, therefore different types of analogies/examples, different levels of background material, and different (but systematized) levels of presentation should be used. The same should apply to kids learning, starting in preschool.
b) Readable mathematics papers, where compact notations are abandoned and narrative and visualizations are introduced, while precision is maintained. It is possible that the same paper (or chapter or topic) should be renderable in multiple ways: for professional mathematicians in the field, for a casual reader, for a student, or for an individual reader (as in (a)).
c) Mathematical logic / tooling for differentiable data/event computing: mathematical tools, as well as CS implementations of those tools, that allow acting on a difference in state, data, or actions.
Typical mathematics (with the exception of maybe time series) does not treat time as a 'first-class citizen', so to speak, be it abstract algebra, category theory, or something else. But I think, when we go to the 'applied world', we must introduce the time dimension as a first-class citizen. Having mathematical machinery that deals with this dimension in an organic way across many areas of mathematics would be beneficial to the application of this most valuable of human tools.
this guy is resigned to feeling worthless compared to other mathematicians (suggesting he become cannon fodder in some type of mathematical sacrifice. i wonder if that analogy even makes sense in the field xD).
but, he desperately wants to become a great mathematician who creates completely original work.
from my experience, people tend to, or even want to, limit themselves. they think they know the ceiling of their capabilities and it becomes a self-fulfilling prophecy.
if you really care about doing something great like this guy does, don't limit yourself. push until you achieve the greatness you want to achieve.
it's like that one saying, aim for the stars and you might land on a cloud. you will be surprised at how capable you actually are
I think the answer is to do multi-disciplinary work.
Venture outside of pure theoretical math. Learn some other domain knowledge and combine it with your mathematical oomph. That's the easiest way to make an impact now rather than potentially decades later.
If we see our contributions as Brownian motion rather than preconceived trajectories, then, rather than focusing on the Gausses, Einsteins, Patons as providing singular progress, they become the dominant least-energy paths to what we recognize as truth. Without negating the individual’s contribution, the ones we see as truly important are the ones that, supported by everyone else’s attempts, find the path forward.
This should provide hope: if we can leave aside our egos and focus on humanity, we can, and do, all contribute, even though a few seem to get all the credit.
This also goes for AI, it may be an accelerant in research, but the probability distribution of reality is large, large enough for humans to wonder, ask questions and stumble upon a new path forward, that computers alone don’t find.
Today, even understanding what new mathematics is being done in a particular zone of the mathematical universe appears to require a four-year graduate program of constant study just to be able to follow some mathematician’s original work - and that’s only going to give you a window into a rather narrow subsection of mathematics. The days when people of great talent like Euler and Gauss could contribute to many areas of mathematics are long gone.
But mere mortals can still derive great satisfaction from following along in the footsteps of past pioneers, possibly adapting their work to new problems in a minor way, or just creating educational visualizations and tools that help other people understand things like Galois theory, Poincaré phase space, or Markov chains, which can be applied to quantum mechanics, orbital dynamics, or protein sequence analysis. That’s valuable, even if no Fields Medals will be coming your way.
For the core discipline, though, I’d mostly worry about lack of opportunities for serious mathematicians to practice their craft in the USA due to the trends of academic budget cuts, anti-intellectual rhetoric, insistence on profit generation as the only rationale for doing anything, etc. Looks a bit 1930s Germany to me, at least here in the USA.
> Numberphile is educating a new generation of young mathematicians for anyone with access to youtube.
Numberphile doesn't do any education. That's like saying the Discovery Channel is educating a new generation of zoologists.
It's education for whoever finds it educational
> It would appear that LLMs are invalidating this claim. Things can live in synthetic form and carry on just fine.
All LLMs do is launder other people's IP. So I don't think you invalidated any other claim.
> So I've got a gut feeling that math (like human languages (like programming languages)) is best learned in service of some greater end.
It's a delightful counterintuition that your gut feeling is mostly wrong: https://webhomes.maths.ed.ac.uk/~v1ranick/papers/wigner.pdf
>are usually discovered "for their own sake"
Like prime numbers? (used in cryptography)
Lots of fun counter examples to this. Complex numbers were introduced in the 1600s with no practical application for almost 300 years until they were used in electromagnetism and quantum mechanics.
> One can rewrite their books in modern language and notation or guide others to learn it too but I never believed this was the significant part of a mathematician work
There's yer problem right there. Good pedagogy is hard and highly undervalued. IMHO Grant Sanderson (a.k.a. 3blue1brown) is making some of the most significant contributions to math in all of human history by making very complex topics accessible to ordinary mortals. In so doing he addresses one of the most significant problems facing humankind: the growing gap between the technologically savvy and everyone else. That gap is the underlying cause of some very serious problems.
Big fan of him - but I also want to throw out the most obvious name in this space: Sal Khan
Hard to imagine now, but back when he started out, there were really no (or very few!) accessible math tutoring vids on the video platforms. Most of the time you had some universities, like MIT, putting out long-form vids of lectures - but easily digestible 5-min vids like those Khan put out just weren't a thing.
I like to watch 3blue1brown too, but I think it's a bit of an exaggeration to say his topics are accessible to normal folks. From my perspective, it's more realistic to say he makes videos that show you the beauty in math without your really having to understand it. Which is valuable, since most people get turned off math by the tiresome drills and tests hammered into them at school by people with zero interest in the subject.
Quite true. Real math needs practice and calculation to build the intuition and motivation for the abstraction we want to construct. The videos are more of a complement to the boring lectures (my prof uses 3b1b videos sometimes).
Indeed, pedagogy is important to staving off the end of mathematics.
That sounds dramatic, but it’s really obvious if you think about it. Right now, a person has to study for about 20 years (on average) to make novel contributions in mathematics. They have to learn what’s come before, the techniques, the results, etc. If mathematics continues, eventually it could take 25 years, or 30 years, or even a whole lifetime. At some point, most people will not be able to understand the work that’s been done in any subfield (or the work required to understand a subfield) in a human’s life. I claim this is the logical end of mathematics, at least as a human endeavor.
Now, there will be some results which refine other work and simplify results, but being able to teach a rapidly growing body of literature efficiently will be important to stave off the end of mathematics.
There's a Scott Alexander story that plays with this exact topic: Ars Longa, Vita Brevis [1]
To your point, I think you're right. I'm not in mathematics, but the value of good pedagogy in shrinking the time it takes to get people to the forefront of any field feels heavily overlooked.
https://slatestarcodex.com/2017/11/09/ars-longa-vita-brevis/
Good pedagogy is a problem even for graduate-level mathematics students and professional mathematicians. The proofs in many graduate-level mathematics textbooks are, in my humble opinion, not really proofs at all. They are closer to high-level outlines of proofs. The authors simply do not show their work. The student then has to put in an extraordinary amount of effort to understand and justify each line. Sometimes a 10-line argument in a textbook might expand into a 10-page proof if the student really wants to convince themselves that the argument works.
I am not a mathematician, but out of personal interest, I have worked with professional mathematicians in the past to help refine notes that explain certain intermediate steps in textbooks (for example, Galois Theory, by Stewart, in a specific case). I was surprised to find that it was not just me who found the intermediate steps of certain proofs obscure. Even professional mathematicians who had studied the subject for much of their lives found them obscure. It took us two days of working together to untangle a complicated argument and present it in a way that satisfied three properties: (1) correctness, (2) completeness, and (3) accessibility to a reasonably motivated student.
And I don't mean that the books merely omit basic results from elementary topics like group theory or field theory, which students typically learn in their undergraduate courses. Even if we take all the elementary results from undergraduate courses for granted, the proofs presented in graduate-level textbooks are often nowhere near a complete explanation of why the arguments work. They are high-level outlines at best. I find this hugely problematic, especially because students often learn a topic under difficult deadlines. If the exposition does not include sufficient detail, some students might never learn exactly why the proof works, because not everyone has the time to work out a 10-page proof for every 10 lines in the book.
Many good universities provide accompanying notes that expand the difficult arguments by giving rigorous proofs and adding commentary to aid intuition. I think that is a great practice. I have studied several graduate-level textbooks in the last few years and while these textbooks are a boon to the world, because textbooks that expose the subject are better than no textbooks at all, I am also disappointed by how inaccessible such material often is. If I had unlimited time, I would write accompaniments to those textbooks that provide a detailed exposition of all the arguments. But of course, I don't have unlimited time. Even so, I am thinking of at least making a start by writing accompaniment notes for some topics whose exposition quality I feel strongly about, such as s-arc transitivity of graphs, field extensions and so on.
These days it's easy to just look for the details to any proof on mathlib. Of course a computer checked proof is not always super intuitive for a human, but most of the time it does work quite well.
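For instance, a step a textbook would wave through as "clearly" can often be discharged in a line or two of Lean against mathlib, and then unpacked lemma-by-lemma if you want the details. A minimal sketch:

```lean
import Mathlib.Tactic

-- `ring` closes this in one step; tactics like `exact?` or
-- `show_term ring` will surface the underlying mathlib lemmas
-- if you want every step of the argument spelled out.
example (a b : ℝ) : (a + b)^2 = a^2 + 2*a*b + b^2 := by ring
```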
Seconded. Good pedagogy is like fertilizing the soil: it creates conditions conducive to learning, which is what makes good research possible.
I think it's like a programmer asking "How can one contribute to computer science?" while thinking about people like Dijkstra, Knuth, or maybe even Carmack.
There are some geniuses who do groundbreaking work, but it wouldn't be of much use if it weren't for the millions of people who do actual work with these theories (applied math), and the teachers who train the next generation. In academia, small discoveries exist too; these can be the stepping stones for the big things to come, even if they don't have a direct application now.
>I think it's like a programmer asking "How can one contribute to computer science?" while thinking about people like Dijkstra, Knuth, or maybe even Carmack.
I had a conversation a day ago with a couple of high-school students who were obviously smart (and a bit on the spectrum), but also lacking in broad knowledge, as one would expect from high-school students.
I think it revealed something about that sense of 'magic' or inaccessible talent. One of them mentioned the fast inverse square root function and marvelled at how anyone could even come up with an idea like that; it seemed to him some transcendent feat to realise that casting binary floats to ints would be useful. They had looked at the function and couldn't fathom how something like that worked.
We had no computer to hand, but I asked him what 10^100 divided by 10^10 was. Smart kid that he was, he answered instantly, correctly. I then asked him what operation he had performed in his head to do that, and noted that floating point exponents are the same thing, just using 2 instead of 10. On the spot he figured out how the fast inverse square root worked. The magic vanished and it became accessible. Some people lament the loss of 'magic' in this way, but I think the thing that makes it special, in this instance and in the universe in general, is that it still works and it isn't magic. It's real, and the fact that it works without invoking some inaccessible property makes it even more special.
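For anyone curious, here's a rough Python sketch of that bit trick (Quake's famous fast inverse square root), using `struct` to mimic the C pointer cast. The magic constant and single Newton step are from the well-known original; the comments are my own reading of it:

```python
import struct

def fast_inv_sqrt(x: float) -> float:
    """Approximate 1/sqrt(x) via the classic bit-level trick."""
    # Reinterpret the float's bits as a 32-bit integer: the exponent
    # field sits in the high bits, so integer arithmetic on the bits
    # is (roughly) arithmetic on log2(x).
    i = struct.unpack('<I', struct.pack('<f', x))[0]
    # Halve and negate the exponent (log2(1/sqrt(x)) = -log2(x)/2);
    # the magic constant corrects the bias in the exponent encoding.
    i = 0x5F3759DF - (i >> 1)
    y = struct.unpack('<f', struct.pack('<I', i))[0]
    # One Newton-Raphson step sharpens the estimate to ~0.2% error.
    return y * (1.5 - 0.5 * x * y * y)
```

For example, `fast_inv_sqrt(4.0)` lands within a fraction of a percent of 0.5, with no division or square root in sight.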
I do think this has been less pressing of a question for programmers. For the longest time there was infinite work to do, no matter your depth—implementing business logic, making frameworks more general, making sure things fit into cache lines—that was accessible to us non-geniuses. Maybe in the LLM era this sort of t-crossing and i-dotting will go away.
> Maybe in the LLM era this sort of t-crossing and i-dotting will go away.
On the contrary, prompting LLMs creates a whole lot of newly accessible basic work.
“Comparison is the thief of joy.”
Do the math because you enjoy doing the math and if you do it long enough you may well do something of value to someone else. Same goes for most intellectual and artistic pursuits I think.
I’ve learned for myself that as soon as enjoyment is based on some future achievement or ranking my work against others the day to day satisfaction dries up.
and so how exactly are you supposed to provide for your family in this scenario?
By having a job. If that job is the same as your intellectual/artistic pursuit, then you have to balance satisfying your employer against what keeps your motivation going over the long term. All I'm saying is that worrying too much about future achievement or "great contributions" is a recipe for burnout and disappointment.
Could work in a patent office or something.
Related. Others?
Bill Thurston's answer to “What's a mathematician to do?” (2010) - https://news.ycombinator.com/item?id=23461983 - June 2020 (21 comments)
Bill Thurston answers: What's a mathematician to do? - https://news.ycombinator.com/item?id=15578866 - Oct 2017 (25 comments)
What's a Mathematician to do? - https://news.ycombinator.com/item?id=8265509 - Sept 2014 (44 comments)
Bill Thurston's answer to "What's a mathematician to do?" - https://news.ycombinator.com/item?id=4419859 - Aug 2012 (1 comment)
Edit: bonus relateds:
https://news.ycombinator.com/item?id=43345503 (March 2025)
It's not mathematics that you need to contribute to (2010) - https://news.ycombinator.com/item?id=36744690 - July 2023 (65 comments)
Knots to Narnia – Bill Thurston (1992) [video] - https://news.ycombinator.com/item?id=34426275 - Jan 2023 (8 comments)
On Proof and Progress in Mathematics (1994) - https://news.ycombinator.com/item?id=31960487 - July 2022 (1 comment)
On Proof and Progress in Mathematics (1994) [pdf] - https://news.ycombinator.com/item?id=12280139 - Aug 2016 (8 comments)
Bill Thurston has died - https://news.ycombinator.com/item?id=4419566 - Aug 2012 (18 comments)
On Proof And Progress In Mathematics (1994) [pdf] - https://news.ycombinator.com/item?id=2582730 - May 2011 (1 comment)
On proof and progress in mathematics (1994) - https://news.ycombinator.com/item?id=982335 - Dec 2009 (5 comments)
It's not mathematics that you need to contribute to (2010) - https://news.ycombinator.com/item?id=36744690 - July 2023 (65 comments)
Inserted. Thanks!
To add onto these answers, there were some notable "outsider art" additions to math around this time.
A few months before this post, Futurama contributed a new proof to the mathematical canon (for "The Prisoner of Benda"), resolving the conflict of the episode.
Almost a year after posting this, a 4chan user solved a previously-unsolved superpermutation (combinatorics) problem in a discussion about anime.
I think everyone who has thought about math seriously has felt similarly to the OP. It was impressed upon me early on that there are combinatorically (hah) many combinatorics problems to be solved and that these were just a few.
After reading another post about the most recent advances LLMs have made in finding and writing up novel, correct proofs, it sounds like the frontier models are now at the point of PhD student level. I wonder how a math student could contribute today, if they're just starting on the PhD track? Maybe by using LLMs as a mighty tool and providing skilled usage and oversight?
It must feel similar to those who wanted to become chess or go masters after computers surpassed humanity in those games.
> After reading another post about the most recent advances LLMs have made in finding and writing up novel, correct proofs, it sounds like the frontier models are now at the point of PhD student level.
This is somewhat misleading: the LLMs' contributions are in a limited niche of highly technical problem solving. They're neat, but they're not the first time a mathematical theorem has been proved automatically by a computer; that was already done in the 1990s.
> Maybe by using LLMs as a mighty tool and providing skilled usage and oversight?
Yes, even in the areas where LLMs are at their best, we'll still need a lot of human effort to make the results cleanly understandable. LLMs cannot do this well, even their generated papers have to be rewritten by human experts to surface the important bits.
> done already in the 1990s

...by human-written programs that iterated through the finite casework that human thought had reduced the theorems to (four-colour theorem, FLT, etc.), which recent developments (e.g. LLMs autonomously resolving Erdős problems) seem meaningfully distinct from.

> human effort to make the results cleanly understandable

Well, perhaps loops of "derive a proof through reasoning in English, formalise it in Lean, use the AST size of the formal proof as a metric to optimise (via an LLM-guided search), translate back into English" could improve this? A lot of resources are being spent making frontier LLMs more resistant to hallucinations via Lean; perhaps cogency will increase as a byproduct.
LLMs can only predict the next token.
They can't predict the consequences of an action by predicting one token after another. They can't solve a Rubik's Cube, unlike a 7-year-old human who can learn to do it in a weekend. They can't imagine the perspective of being a human being, unlike a 7-year-old asked to imagine they were in another human's position.
Those are very strong claims, do you really believe an LLM can't be trained to solve Rubik's Cubes?
Can you imagine what it feels like to be an LLM?
Can one LLM have a better sense of what it feels like to be a different LLM (say, one that scores a little better)?
You design circularly defined criteria...
honestly I'm pretty sure Opus could solve a Rubik's cube if you just gave it the layout of the sides and looped until it solved it
or even just take a picture of the thing, since they can digest visual input now
I wonder if AI is one means to overcome the natural limits of human knowledge aggregation [0].
On the other hand, in the very long run, what does it mean if a talented human being does not have enough years of life to fully analyze and understand an extremely advanced proof created by AI?
[0]: https://slatestarcodex.com/2017/11/09/ars-longa-vita-brevis/
Perhaps it will become like those cathedrals that took centuries and many generations of humans to build.
Yes, but you (as a human) can still understand the cathedral (the building). This is not guaranteed for advanced AI work in mathematics in the future. If so, are we/they are really still adding to human knowledge, at this stage?
Mathematics as an aggregate already is that cathedral. It is grander and more beautiful than any earthly cathedral.
If your motivation is being recognized as the best of the best, winning the competition, then yes, it's probably a bleak world. But if your motivation is improving your own capabilities, with the metric being whether you're better now than you were last month, then it's not a bleak world; there are many more tools available to help you learn and improve now than there were in the past.
The MathOverflow question was asked 15 years ago. The top answer says that the human community part is very important, and that spreading knowledge and critical thinking is valuable.
The most recent advances are stunts by a handful of famous prompters who are funded in various ways by the LLM industrial complex.
How many theorems are proven by mathematicians each year? Let's guess 10,000. Then the Erdős toy proofs, with their unknown token and resource usage, are less than 1%.
...And in 1900, how many carriages were horseless?
In 2026, how many people X-ray their feet at the shoe store or have watches with radium paint?
Ironically, there is a shoe company pivoting to AI. My taxi driver told me buy the stock:
https://www.bbc.com/news/articles/c98mrepzgj7o
a) Individualized teaching methodology. We come with different backgrounds, so different types of analogies/examples, different levels of background material, and different (but systematized) levels of presentation should be used. The same should apply to kids' learning, starting in preschool.
b) Readable mathematics papers, where compact notations are abandoned and narrative and visualizations are introduced, while precision is maintained. It is possible that the same paper (or chapter, or topic) should be renderable in multiple ways: for professional mathematicians in the field, for a casual reader, for a student, or for an individual reader (as in (a)).
c) Mathematical logic/tooling for differentiable data/event computing: mathematical tools, as well as CS implementations of those tools, that allow acting on a difference in state, data, or actions.
Typical mathematics (with the possible exception of time series) does not treat time as a 'first-class citizen', so to speak, whether in abstract algebra, category theory, or elsewhere. But I think that when we go to the 'applied world' we must introduce the time dimension as a first-class citizen. Having mathematical machinery that deals with this dimension organically, across many areas of mathematics, would benefit the application of one of humanity's most valuable tools.
this guy is resigned to feeling worthless compared to other mathematicians (suggesting he become cannon fodder in some type of mathematical sacrifice. i wonder if that analogy even makes sense in the field xD).
but, he desperately wants to become a great mathematician who creates completely original work.
from my experience, people tend to, or even want to, limit themselves. they think they know the ceiling of their capabilities and it becomes a self-fulfilling prophecy.
if you really care about doing something great like this guy does, don't limit yourself. push until you achieve the greatness you want to achieve.
it's like that one saying, aim for the stars and you might land on a cloud. you will be surprised at how capable you actually are
For context: the top answer was written by Bill Thurston, who was awarded a Fields Medal (kind of like a Nobel Prize for mathematics).
I think the answer is to do multi-disciplinary work.
Venture outside of pure theoretical math. Learn some other domain knowledge and combine it with your mathematical oomph. That's the easiest way to make an impact now rather than potentially decades later.
There are many practical jobs left for mathematicians. Time to discover what you like to do with your hands.
If we see our contributions as Brownian motion rather than preconceived trajectories, then, rather than viewing the Gausses, Einsteins, and Patons as providing singular progress, they become the dominant least-energy paths to what we recognize as truth. Without negating the individual's contribution, the ones we see as truly important are the ones that, supported by everyone else's attempts, find the path forward. This should provide hope: if we can set aside our egos and focus on humanity, we can, and do, all contribute, even though a few seem to get all the credit.
This also goes for AI, it may be an accelerant in research, but the probability distribution of reality is large, large enough for humans to wonder, ask questions and stumble upon a new path forward, that computers alone don’t find.
Yeah I don’t think invention or technological development is inevitable or random. It’s path dependent and colored by individuals and culture.
Mathematician here. Writing software because it pays better.
Fortunately, doing something novel is one of the main things LLMs can't do.
But unfortunately, human knowledge accumulation and advancement over the last many thousand years has been pretty large, deep, and varied.
Finding something novel, for PhDs or profits or crime or whatever the fk, is harder every day.
Maybe they can:
https://news.ycombinator.com/item?id=48071262
At the very foundation, chaining sentences together is what we call logic.
Chaining unrelated sentences is retarded. Chaining sentences like most people is common sense. Chaining sentences airtight is math.
You ask what a true mathematician does. He chains sentences like everyone else but with an effort to make them airtight.
Today, even understanding what new mathematics is being done in a particular zone of the mathematical universe appears to require a four-year graduate program of constant study just to be able to follow some mathematician’s original work - and that’s only going to give you a window into a rather narrow subsection of mathematics. The days when people of great talent like Euler and Gauss could contribute to many areas of mathematics are long gone.
But mere mortals can still derive great satisfaction from following in the footsteps of past pioneers, possibly adapting their work to new problems in a minor way, or just creating educational visualizations and tools that help other people understand things like Galois theory, Poincaré phase space, or Markov chains, which can be applied to quantum mechanics, orbital dynamics, or protein sequence analysis. That's valuable, even if no Fields Medals will be coming your way.
For the core discipline, though, I’d mostly worry about lack of opportunities for serious mathematicians to practice their craft in the USA due to the trends of academic budget cuts, anti-intellectual rhetoric, insistence on profit generation as the only rationale for doing anything, etc. Looks a bit 1930s Germany to me, at least here in the USA.