Mathematical Notation, Past and Future (2000)
stephenwolfram.com

For a deeper exposition on language and notation, Guy Steele[0] is the definitive authority :-) - with some discussion here[1].
I love math but prefer programming-type syntax to express math ideas rather than much of traditional math notation. Much of it feels alien and non-intuitive: irregular, random, and unpredictable, with various styles duct-taped together over centuries. The names of the various symbols are hard to remember, and don't get me started on the pain of symbols that aren't on standard PC/ASCII/Western computer keyboards.
In short, a similar critique to that of Perl and PHP, except math is even worse.
Show me a Python-like notation for a math construct and I'm good to go. Even a Lisp!
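For what it's worth, here's the kind of thing I mean (my own toy comparison, nothing from the article): the sum of squares from 1 to n, which in traditional notation is a capital sigma with bounds, reads to me more naturally as

  # sum of squares from 1 to n, i.e. the sigma over i = 1..n of i**2
  def sum_of_squares(n):
      return sum(i**2 for i in range(1, n + 1))

  print(sum_of_squares(10))  # 385

Everything is spelled out, typeable on a normal keyboard, and executable.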
You might like Structure and Interpretation of Classical Mechanics by Sussman and Wisdom. It does classical mechanics using an MIT-Scheme library.
Lean/mathlib is a mix of functional programming and classical math notation. The community likes unicode quite a lot, with editor support (quasi-LaTeX input methods in both VS Code and emacs). Here's an example from graph theory (I removed some conceptually irrelevant parts):
  theorem simple_graph.sum_degrees_eq_twice_card_edges (G : simple_graph V) [fintype V] :
    ∑ (v : V), G.degree v = 2 * G.edge_finset.card

Speaking of which, there's a comment today about SICM in Clojure: https://news.ycombinator.com/item?id=29714267
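Going back to the mathlib example above: that handshake-lemma statement can also be checked computationally. Here's a rough Python sketch for comparison (the dict-of-neighbour-sets graph representation and the names are mine, not mathlib's):

  # Handshake lemma: the sum of vertex degrees equals twice the number of edges.
  # Undirected graph as a dict mapping each vertex to its set of neighbours.
  graph = {
      'a': {'b', 'c'},
      'b': {'a', 'c'},
      'c': {'a', 'b', 'd'},
      'd': {'c'},
  }

  degree_sum = sum(len(nbrs) for nbrs in graph.values())
  edge_count = len({frozenset((v, w)) for v, nbrs in graph.items() for w in nbrs})

  assert degree_sum == 2 * edge_count  # 8 == 2 * 4

Of course the Lean statement is a proof for all graphs, while this only checks one example.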
Do you have a sample comparison between an expression/statement in math syntax versus in pseudocode syntax?
I think having 15 different ways to express multiplication is charming.
agreed. heh. thankfully though at least 3 of them can be expressed with a normal keyboard:
ab
a * b
a X b
I feel like we haven't even standardized a lot of the notation already in common use... Talking about the future of notation is strange when people seem to just invent it as they go, and there's no committee steering the discussion.
This is a huge struggle for me as a computational biologist with a computer science background. When I use a published method I try to have at least a cursory understanding of how it works. However, many of the papers in my field are now statistics- and machine-learning-heavy. The lack of a standardized notation makes the method descriptions very hard for me to follow. Occasionally the source code published along with the papers is readable and the implementation is much simpler than I expect. But it's not uncommon for the published code to be a pile of spaghetti with no comments or sensible variable names.
Alright here goes a dumb question.
How much is mathematical notation contributing to the idea that "math is hard", and preventing students from thinking of math as Lego bricks, where you can use a set of functions to build newer functions?
> preventing students from thinking of math as Lego bricks, where you can use a set of functions to build newer functions
Isn't this something that mathematical notation lets you see in the first place? (I'm also not sure what you mean here exactly.)
By the way, have you heard what Euclid allegedly said to Ptolemy I when the king found the Elements difficult? "There is no royal road to geometry." Math is inherently challenging, and it's doubtful that removing powerful tools would make it any less so. I think of notation as something that makes more complicated thoughts easier to have -- misquoting Alan Kay, good notation is worth 80 IQ points.
But surely notations vary a lot in quality. Think, for example, of Roman numerals vs. positional number systems as alternative notations for doing arithmetic. One was surely a great advance over the other. So I wonder: which elements of our modern notation are akin to Roman numerals, a suboptimal solution waiting to be replaced with something vastly better? Is all perceived complexity in math inherent to it, or is at least some of it due to subpar notation?
> Is all perceived complexity in math inherent to it
No, of course not, but I'd argue that every notation that's in use gives practitioners that use it strictly more mathematical capability. Notations make things that previously only geniuses could comprehend become things with wider accessibility.
Consider Roman numerals. Like anything, they are suboptimal, but without them, large numbers are essentially impossible to manipulate. They gave the business class the ability to record their finances and inventory, for example, which is a remarkable achievement. It's great when better things come along like positional number systems -- people can learn the art of division in grade school because of it, rather than needing to leave that to the experts -- but we shouldn't dismiss what was replaced as merely holding us back.
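To make the division point concrete, here's a minimal sketch (mine, in Python) of why positional notation helps: grade-school long division is just a per-digit loop, a simple procedure that has no counterpart on Roman numerals.

  # Grade-school long division on a base-10 digit list.
  # Returns (quotient_digits, remainder).
  def long_division(digits, divisor):
      quotient, remainder = [], 0
      for d in digits:
          remainder = remainder * 10 + d        # "bring down" the next digit
          quotient.append(remainder // divisor)
          remainder = remainder % divisor
      while len(quotient) > 1 and quotient[0] == 0:  # strip leading zeros
          quotient.pop(0)
      return quotient, remainder

  print(long_division([1, 2, 3, 4], 7))  # ([1, 7, 6], 2), i.e. 1234 / 7 = 176 remainder 2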
This same pattern comes up a lot, whether it be mathematical notation, programming language syntax, or scientific jargon. Some people crusade against them, while others (typically practitioners of the respective field) insist that these constructs are useful.
My hypothesis is that it's a bias from limited information. You have something with inherent complexity to it: math, music theory, physics, code, you name it. In order to represent those complexities, complex jargon gets formed. An outsider attempting to understand a given field is likely to have an (in these cases correct) assumption that the field is complex: if it wasn't, they wouldn't be putting effort into understanding it.
In their effort to understand it, they eventually run into something they don't understand immediately or easily. They then assign the majority of their perceived complexity of the system to that first thing they got stuck on.
So instead of thinking "programming is hard" they think "reading/writing source code is hard". The practitioners of the field, with the benefit of hindsight, know that picking up the syntax/jargon/notation was in fact far easier than understanding the concepts it represents. But the outsider can only see the immediate challenge, which is the symbology.
The constructs are always useful, but rarely do they form a coherent whole, because they're formed by accretion. The "outsiders" are usually going to be right: throwing everything out and designing the language from scratch would allow for substantial improvements, regularization, etc. The insiders will however have adapted to the existing system and found its complexities easy enough to navigate. And that is why the USA does not use metric.
Also I just want to add... mathematicians have ⋅, ×, and ∗ to express multiplication, but decided that they also needed ADJACENCY OF CHARACTERS to express multiplication, and therefore mathematics is not allowed to use variable names longer than one letter.
Come the fuck on, mathematics.
It's fine, you just add a space. It's also used for function application, like "sin x".
I'd say single-letter variable names are mostly due to ease of handwriting when doing calculations (or for giving a chalkboard talk), and authors tend to just use what they've already come up with when writing things up. Mathematicians seem to be very tolerant of ambiguous parsing, so that can't explain all of it.
When reading a math proof, it helps to spell out what the symbols stand for instead of saying the symbol names: for example, instead of "E equals m c squared", say "energy equals mass times speed of light squared". This greatly helps understanding. If you can't keep track because there are too many symbols, make a little lookup table.
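In code terms (my own illustration, not anyone's published style guide), it's the difference between single-letter names and descriptive ones, plus that little lookup table:

  # "E = m c^2" spelled out with descriptive names
  SPEED_OF_LIGHT = 299_792_458.0  # metres per second

  def energy(mass_kg):
      # energy = mass times the speed of light squared
      return mass_kg * SPEED_OF_LIGHT ** 2

  # the lookup table for the symbols in the original formula
  symbols = {"E": "energy (J)", "m": "mass (kg)", "c": "speed of light (m/s)"}

  print(energy(1.0))  # about 8.99e16 joules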
You can point to "sin", but in practice math just doesn't use multi-char variable names. The trig functions are special cases. Doesn't mean you're allowed to name arbitrary vars like that.
What about ker, coker, im, hom, end, aut, lim, colim, Set, Top, Man, Mat, Vect, GL, SL, SO, Lie, Gr, Tor, Ext, tr, Rep, char, rank, Isom, ann, hull, Diff, ...? I'm just copying multiletter names out of my preamble.tex file here.
I get that these aren't variables, but it's really not so uncommon outside of trigonometry to have multi-letter names for things that appear in equations.
This has been discussed before a couple of times - Bret Victor keyed me into the discussion of how symbolic maths makes the discipline seem alien by pointing at two things (can't remember in which lecture though): "A Mathematician's Lament", and a letter to leading mathematicians and physicists including Einstein asking how they work with maths (can't remember any keywords to Google this). The idea being that mathematical symbols are a tool of communication, not the way you actually work with maths (if you're exploring new ground, at least).
Edit: it was in here: http://worrydream.com/#!/KillMath
When someone comes up with a way to explain the same concepts using a simpler and easier to understand way, over time it becomes the new notation. But it rarely is world changing, mostly it’s small, incremental changes.
One example of a major notation change is Grothendieck’s effect on algebraic geometry. It used to be about solutions of sets of polynomial equations. Now it’s all about sheaves and schemes and moduli spaces and representable functors and stacks and… This new notation changed the way mathematicians think. Did it simplify things and make them more accessible to lay people? I’d argue not at all; in fact you probably need a year or two of university math education to fully understand what a scheme even is.
I guess category theory is an example in the other direction: it greatly simplified and unified thinking in algebraic topology and geometry. However, it’s not that big of a help to a complete newbie: if you are at a complete loss when you encounter a cohomology theory for the first time, framing it in terms of graded functors won’t be of huge incremental help.
The point here is that notation is there to help mathematicians, not regular people; regular people neither understand all of it nor care to, so it’s not much of a loss.
I think it certainly has an effect. My rationale is comparing the earliest writings on algebra and calculus to present-day notation. Al-Khwarizmi used no symbols or numerals in his text. I can't judge Newton because the text is in Latin, but the notation looks pretty difficult to me. At the time, you practically had to be a philosopher to understand that writing, much less do anything with it. Likewise, I think Clerk Maxwell's electromagnetic theory was impenetrable until Heaviside created a new notation for it. And we just had an HN thread talking about Mary Somerville's translation of Laplace.
Today we teach algebra and calculus to schoolchildren. I believe as bad as the notation probably still is, the improvement from the original texts is nothing short of qualifying as its own revolution in mathematics. So it's interesting to imagine it being even better.
But I also believe that we only teach one side of math in school, basically manipulation of expressions. We barely touch on other ways that people do math, through the exploration of data, computation, and theory (proofs). So we're painting a distorted picture of what math even is, making it all the harder to justify why it's important beyond just being a gatekeeper.
An interesting and often overlooked exposition by Dijkstra on notation: https://www.cs.utexas.edu/users/EWD/transcriptions/EWD13xx/E...