A minor fix to calculus notation
April 11, 2019 6:55 AM   Subscribe

"The problem is well-known but it has been generally assumed that there is no way to express the second derivative in fraction form. It has been thought that differentials (the fundamental “dy” and “dx” that calculus works with) were not actual values and therefore they aren’t actually in ratio with each other. Because of these underlying assumptions, the fact that you could not treat the second derivative as a fraction was not thought to be an anomaly. However, it turns out that, with minor modifications to the notation, the terms of the second derivative (and higher derivatives) can indeed be manipulated as an algebraic fraction." Paper (pdf).
posted by clawsoon (32 comments total) 22 users marked this as a favorite
 
This is interesting theoretically, but I cannot imagine it being adopted universally.
posted by wittgenstein at 7:02 AM on April 11, 2019


Thanks, I hate this
posted by RolandOfEld at 7:17 AM on April 11, 2019 [8 favorites]


*intense Calc3 flashbacks, stress, panic*
posted by lydhre at 7:21 AM on April 11, 2019


Can someone explain the implications of this beyond what gets written down in classrooms? Will it help us unify some set of theories, or simply some costly process?
posted by mantecol at 7:25 AM on April 11, 2019


Good paper; the new notation is a little (or a lot...) more unwieldy, but it'll make higher-order manipulations easier for those who dislike memorizing formulas.
posted by subdee at 7:29 AM on April 11, 2019 [3 favorites]


The implications are at once mind-blowing and trivial.

The mind-blowing part is: we've been doing this wrong for a long time. We tell our students, "These things that look like fractions aren't actually fractions; don't forget that. Now, to solve this next problem, pretend it's a fraction..." One of the reasons we say that is exactly what this paper addresses: the fraction idea breaks down at higher derivatives. The funny thing is that the conclusion is basically: you can treat differentials as fractions *except* when you're working directly with them.
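
To make that concrete, the standard example (not from the paper, just the usual one we show students) is that the chain rule works fraction-style at first order but not at second order:

```latex
\frac{dy}{dt} = \frac{dy}{dx}\,\frac{dx}{dt}
\qquad \text{but} \qquad
\frac{d^2y}{dt^2} = \frac{d^2y}{dx^2}\left(\frac{dx}{dt}\right)^{2} + \frac{dy}{dx}\,\frac{d^2x}{dt^2}
\;\neq\; \frac{d^2y}{dx^2}\left(\frac{dx}{dt}\right)^{2}
```

That extra (dy/dx)(d²x/dt²) term is exactly the kind of bookkeeping the paper's notation is meant to make automatic.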

It's trivial in the sense that nothing all that new will come directly out of it. It's just a slight reorganization of the way to think about using Leibniz notation, which is something we've already had to do anyway.
posted by dbx at 7:34 AM on April 11, 2019 [11 favorites]


The paper they submitted is fairly readable for anyone who remembers enough calculus to recall second derivatives. It takes a conversational tone, which is welcome in a math paper.
posted by Nelson at 7:43 AM on April 11, 2019 [7 favorites]


This makes me happy because it makes total sense! It makes me sad because it won't actually be adopted and I can't teach it to my kids lest they get super confused when they learn calc with everyone else.
posted by a snickering nuthatch at 7:53 AM on April 11, 2019 [3 favorites]


Can someone explain the implications of this beyond what gets written down in classrooms? Will it help us unify some set of theories, or simply some costly process?

Mathematical notation is a tool to assist with and simplify reasoning about mathematical objects. While classroom usage is one valuable case, good notation is most useful for mathematicians and people who apply mathematics in their work. In this case, the proposed notation helps with reasoning correctly about the behavior of higher-order differentials as first-class mathematical objects, rather than only using them as stand-ins for limits. For some people, treating differentials as "real" fits their mathematical intuitions well, and so better tools for reasoning about them can be helpful. The paper gives an example of a convenient inversion formula for second-order derivatives which was previously described but not widely known, which the authors rediscovered easily using their new notation.
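
(For the curious: the inversion formula in question is, if I'm reading the paper correctly, the standard identity for the second derivative of an inverse function,

```latex
\frac{d^2x}{dy^2} \;=\; -\,\frac{d^2y}{dx^2} \bigg/ \left(\frac{dy}{dx}\right)^{3}
```

which drops out almost mechanically once the differentials can be pushed around as algebraic quantities.)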

An analogy to this might be something like spelling reform. Everyone knows English spelling is overly complex and irrational. It doesn't stop us from communicating effectively, but it's harder to learn than it needs to be and even experienced people sometimes make mistakes. Mostly we're stuck with it, because there's just too much cultural inertia to change. But sometimes reform is possible and arguably does make things easier, as with Webster's reforms of American English spelling. I'd predict this will be similar. Mostly no one will change their notation for higher-order derivatives, but some smaller and/or newer disciplines may find it useful and adopt it.
posted by biogeo at 8:44 AM on April 11, 2019 [11 favorites]


One of the authors, Jonathan Bartlett, seems to be a bit of a polymath. He has papers in a number of disciplines, going by a cursory googling. The institute he belongs to, The Blythe Institute, also seems to have a somewhat fringe quality to it.
posted by Dr. Twist at 8:56 AM on April 11, 2019 [1 favorite]


A minor fix to 40 years of my nightmares.
posted by hwestiii at 9:06 AM on April 11, 2019


At a math class I recently attended, the professor described his view of the question of whether mathematical works are discovered or invented. He preferred to think of math as being designed, as a user interface for looking at mathematical entities; these in turn aren't quite discovered, in that you're never the first to be interacting with them, but you might be the first to design a better way to explain them to others.

In that sense, mathematics is not really a universal language. Algebra started as Muhammad Al-Kwarizmi's writings in plain Arabic prose. It took the development of notational systems to make his work accessible to 7th graders, and mathematical notation comes from a particular cultural context: the development of moveable type in nations using the Latin and Greek alphabets. 300 years of refinement made math as we know it seem universal, but it wasn't, and isn't.

One context makes this especially blatant: the notations for trigonometry and logarithms have their quirks compared to the rest of math, specifically because the topic is quasi-vocational. Not that long ago, a teenager who knew his sines and logs could use them to get a job as a draftsman or a surveyor's assistant. With people other than mathematicians using the notation, they evolved it in different directions. In another context, there's calculus, with Newton and Leibniz developing different notations for it. And now this.

It did not take computer programmers long to see that standard mathematical notation is not the only way to do it, nor is it the best. They had to work around the limits of the computer keyboard, RAM and hard drive space, and agree on variants.

I should stop now and read this paper.
posted by ocschwar at 10:16 AM on April 11, 2019 [3 favorites]


ocschwar: It did not take computer programmers long to see that standard mathematical notation is not the only way to do it, nor is it the best.

I've been idly wondering for a while how many people would do better at understanding math if well-named variables were used instead of single letters. With single letters, your ability to understand a mathematical idea is limited by how many single letter->idea mappings you can juggle in your head at one time.

Single letters in math are like single-letter commands in Vim: They can make you more productive once you learn them, but they put up a high barrier to beginners.
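
As a toy illustration (my own example, nothing to do with the paper), here's the same quadratic-root computation written both ways:

```python
import math

# Single-letter version, as it might be transcribed straight from a textbook:
def f(a, b, c):
    d = b * b - 4 * a * c  # what was "d" again?
    return (-b + math.sqrt(d)) / (2 * a), (-b - math.sqrt(d)) / (2 * a)

# Same computation with descriptive names; each symbol carries its own meaning:
def quadratic_roots(quadratic_coeff, linear_coeff, constant):
    discriminant = linear_coeff ** 2 - 4 * quadratic_coeff * constant
    root = math.sqrt(discriminant)
    return ((-linear_coeff + root) / (2 * quadratic_coeff),
            (-linear_coeff - root) / (2 * quadratic_coeff))
```

The first version is what you get on a blackboard: compact, fast to manipulate, opaque to a newcomer. The second is what a code reviewer would ask for.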
posted by clawsoon at 10:32 AM on April 11, 2019 [4 favorites]


Re single-letter variables, you don't want to obscure the structure of an expression with too-long names, but if the variable has a particular role to play, you might want to illuminate that with a longer name. I'm sure that eventually it will be mathematically determined that the optimum compromise is that, in expectation, each variable uses e letters.
posted by a snickering nuthatch at 10:54 AM on April 11, 2019 [8 favorites]


I've never gotten around to reading Al-Kwarizmi, but I do remember realizing partway through my college math degree that all equations are telling stories; they have a narrative structure. (I'm not the first to realize that, I'm sure; like, what's a word problem, duh?) Maybe I'd like Al-Kwarizmi's original approach better than the way I actually learned it.
posted by clawsoon at 11:00 AM on April 11, 2019 [1 favorite]


The paper reads as a solution in search of a problem (and it doesn't help that the linked article claims this is a discovery of a "longstanding flaw" in calculus, which is wildly untrue). The thrust of the article is that notation in mathematics is often overloaded. While this is true, I am very unconvinced that it represents a flaw or something that needs fixing. In particular, it is unreasonable to claim that fractional notation must ALWAYS mean division and a superscript must ALWAYS mean multiplicative power. This is definitely not something one should argue!

There are many areas of mathematics where notation in one field means a different thing in another; this is just a result of the fact that there are a finite number of reasonable symbols that can be created and written by human hands. This does not need fixing. Further, the "fix" proposed here replaces something conceptually simple (apply the differential operator d/dx or d/dy some number of times) by something unwieldy and harder to understand - even the authors admit that "it is not very pretty or compact, but it works algebraically." (!!!!!)
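
To make the "apply the operator some number of times" view concrete, here's a quick numerical sketch (my own illustration, not from the paper): a finite-difference stand-in for d/dx, applied twice by plain function composition:

```python
def diff_op(f, h=1e-5):
    """A central-difference approximation to the operator d/dx."""
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

cube = lambda x: x ** 3

# "d^2/dx^2" is just the same operator composed with itself:
second_derivative = diff_op(diff_op(cube))

# (x^3)'' = 6x, so second_derivative(2.0) is close to 12
```

The point being that d²/dx² is conceptually just "apply the same operator twice" - no new notation is required for that.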
posted by Frobenius Twist at 11:02 AM on April 11, 2019 [4 favorites]


A good notation has a subtlety and suggestiveness which at times make it almost seem like a live teacher. — Bertrand Russell
posted by jamjam at 11:07 AM on April 11, 2019 [4 favorites]


I found the paper hard to follow; all the listed equations look similar, and it would be nice to be able to compare the standard vs. proposed notation at a glance in a chart on a single page. The way the paper is organized, I actually have to read it.
posted by polymodus at 11:13 AM on April 11, 2019


Divide it all by Tau over zed. (Π * 2 / e + 1)
posted by sammyo at 11:19 AM on April 11, 2019


They get into their general "fixed" notation starting in section 5. Hilariously, their "fixed" notation is so gnarly that they suggest collapsing it into a simplified "D" notation. This means that they have taken the straightforward current notation, expanded it into something nasty, and then collapsed it again into notation which looks quite similar to the original notation... except that now one needs to memorize the meaning of the "D" symbol!

Having taught calculus many times, I very much want to see the authors get up in front of a class full of undergrads and attempt to explain that no, really, this new notation is a good idea.
posted by Frobenius Twist at 11:20 AM on April 11, 2019 [2 favorites]


We tell our students "these things that look like fractions aren't actually fractions, don't forget that. Now to solve this next problem, pretend it's a fraction..."

It's been a good long while since I gave any serious thought to this kind of thing, but for me the "pretend it's a fraction" part is the problem. I don't see much to gain from it; it just muddles the concepts and promotes handwaving through the hard parts, which is where the interesting stuff actually happens.
posted by each day we work at 12:11 PM on April 11, 2019 [1 favorite]


(But then again I was the kind of person who annoyed my analytical mechanics professor by using differential-free notation for integrals after the little Spivak, so take with a grain of salt)
posted by each day we work at 12:13 PM on April 11, 2019


Maybe I'd like Al-Kwarizmi's original approach better than the way I actually learned it.

Plain prose in Arabic, a language that's notorious for being ambiguous about mapping pronouns to antecedents, to explain a body of mathematics that's all about pronouns and antecedents. Written by a guy whose native language wasn't Arabic.

It takes intelligence and effort to read Al-Kwarizmi. It took utter genius to BE Al-Kwarizmi. But translate it into ordinary algebraic notation, and it's just a series of 7th grade algebra exercises.
posted by ocschwar at 12:37 PM on April 11, 2019 [6 favorites]


The new notation trades one thing off for another; the old notation is logical if you think of it as the result of applying the operator d/dx to y (once, or twice, or however many times). The new notation obscures this idea in exchange for clarifying a different one.
posted by Wolfdog at 12:43 PM on April 11, 2019 [2 favorites]


He has come pretty close to reinventing calculus using differential forms.

I will give him that it's helpful for students to think about differentiation as not always being with respect to some variable, because when you get to multivar calc and PDEs and differential forms you have to significantly tweak your calc know-how.

I still think the idea of teaching calculus using nonstandard analysis (using infinitesimal members of the hyperreals instead of differentials) is an easier way to get your feet wet, however.
posted by adoarns at 2:07 PM on April 11, 2019 [1 favorite]


I'm not sure which is the bigger signifier that I am a nerd: that I read and understood the entire paper (math nerd), or that I found a typo in footnote 7 (grammar nerd/pedant — the seventh line should start with "Its" not "It's").
posted by judgement day at 2:49 PM on April 11, 2019 [2 favorites]


The paper reads as a solution in search of a problem

That was my first take as I started to read it, too, but by the end I changed my mind. I'm not convinced that this is a notation I'd use, but I think there could be a valid use case for it.

(and it doesn't help that the linked article claims this is a discovery of a "longstanding flaw" in calculus, which is wildly untrue)

Strongly agree. To continue my spelling analogy from earlier, this is like saying they've discovered that the generally accepted spelling of "enough" in English is a "longstanding flaw." Because a) everyone already knew it was a problem, and b) it still doesn't actually stop people from communicating in written English.

Further, the "fix" proposed here replaces something conceptually simple (apply the differential operator d/dx or d/dy some number of times) by something unwieldy and harder to understand

Well, I think it's worth noting that d/dx is the derivative operator, not the differential, and basically this paper is about abandoning this operator in favor of d·, the differential operator, such that the ratio of differentials is the derivative. If you think primarily in terms of derivatives according to the classical limit definition, their notation is complex and unwieldy. If you think primarily in terms of differentials, as physicists often do, I can see a certain amount of clarity added by this notation.

Hilariously, their "fixed" notation is so gnarly that they suggest collapsing it into a simplified "D" notation.

The "D" notation is already standard in some fields. They don't make this argument explicit, but it seems to me that the thrust of the notation is to reserve d· as the differential operator and D·· as the derivative operator, and to abandon d/dx as a synonym for the Dx operator. That way, Dx² ≠ d²/dx², and you don't get the awkward effects from applying the chain rule.
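
To make the chain-rule point concrete (this is the standard second-order chain rule, my gloss rather than the paper's wording): for y(u(x)),

```latex
\frac{d^2y}{dx^2} \;=\; \frac{d^2y}{du^2}\left(\frac{du}{dx}\right)^{2} + \frac{dy}{du}\,\frac{d^2u}{dx^2}
```

so the superscript 2 in d²/dx² can't be read as an honest algebraic power; that's the asymmetry their notation is trying to isolate.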
posted by biogeo at 2:57 PM on April 11, 2019 [4 favorites]


(and it doesn't help that the linked article claims this is a discovery of a "longstanding flaw" in calculus, which is wildly untrue)

That's why I put "overblownsciencewriting" in the tags, and toned down my title. Hope that helped a bit.
posted by clawsoon at 3:29 PM on April 11, 2019


One of the authors, Jonathan Bartlett seems to be a bit of a polymath.

That's a polite way to phrase it! I would say less tactfully that he is engaging in mathematician cosplay.

The paper is poorly organized and devoid of original math content, the journal is sketchy, the institute is super sketchy, and the author has no apparent math background, though he has published papers on biology, programming, and intelligent design. For all I know he is qualified in those fields (well, inasmuch as one can be qualified in creation science), but there's no evidence that he understands mathematics.

This is not a serious academic paper. It's only a few rungs above the division by zero guy.
posted by YoloMortemPeccatoris at 7:43 PM on April 11, 2019 [1 favorite]


Ha ha, so much for my charitable reading. I still see a potential use case for the notation, but you're right, the guy's clearly a crank.
posted by biogeo at 10:26 PM on April 11, 2019 [4 favorites]


Sure, I wouldn't have a problem with a paper that argued for different calculus notation in, say, a math education journal. It's certainly true that Leibniz notation for second derivatives has unintuitive aspects, and an argument on pedagogical grounds could be constructed.

What set off my crank-dar here (and led to the google-stalking) is that the author seems a bit fuzzy as to whether he's doing original research or merely recommending slightly different notation for well-known concepts. See, for example, the footnote at the bottom of page 226. This whole thing reads like he set out to write an original research paper, found out in peer review that none of the results were in fact original, and published it anyway in whatever journal would take it.
posted by YoloMortemPeccatoris at 1:03 AM on April 12, 2019 [4 favorites]


he set out to write an original research paper, found out in peer review that none of the results were in fact original, and published it anyway

I had the same impression, but I don't see why that's a bad thing. Are journals only supposed to publish new research? That seems unnecessarily restrictive; surely much can be gained by revisiting a topic. Or do you think this paper is too basic for a mathematics journal? That's probably true if I'm able to read it.

I like the notation he presents; it makes it clearer what the difference is between the top and bottom halves of dy/dx. The top is a function call being passed a variable, and the bottom half is a single token representing an infinitesimal. The standard notation makes them look like the same thing.
posted by foobaz at 7:48 PM on April 12, 2019




This thread has been archived and is closed to new comments