Learning math on the internet is hard. Different people use different notations and assume different backgrounds.
I encourage the poster to read and learn more about exterior algebras! Once you've seen Stokes' theorem, the Hodge decomposition, Maxwell's equations, etc. written using them you'll never go back, and those old school t-shirts with "and god said..."[1] will make you cringe.
I hope/expect to see them taught in basic calculus and physics, but it takes generations for things to trickle down.
I think it was done that way once, but somewhere around the seventies people dropped it or moved it to advanced courses. You can't immediately solve a transformer in differential forms the way you can by mindless symbolic manipulation in the standard formalism.
I was chagrined to find myself in this very same position. Sadly, I have not yet had the time to track down the resources necessary to "re-educate" myself. Computer science has given me a new love for math that I didn't even possess during my education, but I know of very few ways to express that enthusiasm.
Perhaps OP should have studied number theory. No number theorist has ever complained about the quality of his teacher. It's just you and all that ever was and all that ever will be. Now go discover.
Sussman's critique of mathematical notation is hollow. The point of good notation is to hide the details while conveying the high level compactly and succinctly.
Lagrange's equations written the way he derides are a perfect example. His need to think about what spaces the symbols live in based on their context in the expression is typical when translating math to code. He should be talking about what kind of type inference engine one needs to allow the original notation and end up with his positional notation internally.
Actually, the notation is confusing even to professional mathematicians. The notation used in differential geometry is the archetype of this phenomenon; it's sometimes incredibly hard to work out what the simple-looking notation means.
In fact, differential geometry has been described as the study of objects "invariant under a change of notation" playing on the lack of clarity in this field, along with the standard description of a tensor as an object "invariant under a change of coordinates".
"At school Feynman approached mathematics in a highly unconventional way. Basically he enjoyed recreational mathematics from which he derived a large amount of pleasure. He studied a lot of mathematics in his own time including trigonometry, differential and integral calculus, and complex numbers long before he met these topics in his formal education. Realizing the importance of mathematical notation, he invented his own notation for sin, cos, tan, f (x) etc. which he thought was much better than the standard notation. However, when a friend asked him to explain a piece of mathematics, he suddenly realized that notation could not be a personal matter since one needed it to communicate."
The story is either in "Surely You're Joking, Mr. Feynman" or "What Do You Care What Other People Think?". I forget which stories are in which book. Both are great though.
I thought my symbols were just as good, if not better, than the regular
symbols -- it doesn't make any difference what symbols you use -- but I
discovered later that it does make a difference. Once when I was explaining
something to another kid in high school, without thinking I started to make
these symbols, and he said, "What the hell are those?" I realized then that
if I'm going to talk to anybody else, I'll have to use the standard symbols,
so I eventually gave up my own symbols.
Good teachers do try to use consistent notation, but they also understand the need to communicate with the rest of the world and so end up having to teach historical accidents.
I thought someone might try that tack. Math is completely decoupled from its representation, so you can use whatever notation you want and not lose anything. I'm not gonna guess to what extent linguistic communication is bound up with its representation, but it's certainly more than math is, which means meaning can be lost. So your analogy doesn't work. Also, since mathematical notation is primarily a tool to facilitate mechanical transformations, and only secondarily one for communication, you should use whatever lets you think biggest.
An excellent point as long as you do not need to actually communicate your results to anyone. As soon as you do, it becomes precisely as bound as language is.
Sussman's critique of mathematical notation is hollow.
I don't do enough math to have a useful opinion, but it strikes me that something as simple as f(1) = 1 + 1 = 2 is ambiguous (does f = x+x or x+1?), and using unambiguous notation would be tedious and useless.
Unless you're trying to turn it into a program. I sometimes think of programming as the opposite of math.
Why does he say that the arity is different for derivatives and integrals? Isn't there an arity 1 symbolic integration? If you were numerically computing an integral over a range, shouldn't you be evaluating the derivative at a point? These don't have the same arity, but at least the ideas are parallel.
I'm a math idiot, too, but I know that he's confusing indefinite and definite integration. Definite integration returns a number; indefinite integration returns a function. Differentiation does undo indefinite integration.
> ... I know that he's confusing indefinite and definite integration.
Yes he is. The real solution is to banish the term "indefinite integral" from the language, and substitute "antiderivative". We should also not use the integral sign for antiderivatives, but unfortunately there is no well known alternative. :-(
Using the same terminology and symbols for antiderivatives as for (definite) integrals is confusing, as here. It also makes the Fundamental Theorem of Calculus -- a very profound fact if there ever was one -- appear to be something trivial about getting rid of limits on an integral sign. But consider: one can compute a definite integral using an antiderivative ... whodathunkit?
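To make the type distinction concrete, here's a rough Python sketch (every function name here is invented for illustration, and the numerics are crude): definite integration is (function, number, number) -> number, the antiderivative operator is function -> function, and differentiating the antiderivative really does recover the original function.

```python
def definite_integral(f, a, b, n=100_000):
    """Definite integral: (function, number, number) -> number."""
    h = (b - a) / n
    # composite midpoint rule
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

def antiderivative(f, a=0.0):
    """'Indefinite integral' (really: antiderivative): function -> function."""
    return lambda x: definite_integral(f, a, x)

def derivative(f, h=1e-6):
    """Differentiation: function -> function."""
    return lambda x: (f(x + h) - f(x - h)) / (2 * h)

f = lambda x: 3 * x ** 2
area = definite_integral(f, 0.0, 2.0)   # a plain number, about 8.0
F = antiderivative(f)                   # a function, F(x) is about x**3
recovered = derivative(F)               # d/dx of the antiderivative is about f
print(round(area, 3), round(F(2.0), 3), round(recovered(1.0), 3))
```

Note that "undoing" the definite integral makes no sense at all: you can't differentiate the number 8.0. The Fundamental Theorem is exactly the claim in the third line.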
Isn't it weird how awful math terminology is? Even the word "derivative" is an awful choice. The special and wonderful thing about a derivative is that it's the rate at which another function changes, not that it's derived somehow.
And why is the verb for "take the derivative" "differentiate"? Both words make sense individually to describe things -- df/dx is derived from f by taking differences -- but it doesn't make sense that we use both words.
I had an abstract algebra professor, in my first year of grad school in math, who referred to operators D that satisfy the product rule D(fg) = D(f) g + f D(g) , which are not necessarily defined in terms of a limit, as "derivations". This is a natural generalization of the derivative, and apparently "derivation" is the correct technical term, but for a week or so I thought he was just not speaking English properly.
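A concrete example of a derivation that isn't defined by a limit: dual numbers a + b·eps with eps² = 0, where the product rule D(fg) = D(f)g + f D(g) is baked directly into the multiplication. A toy Python sketch (class and names invented for illustration):

```python
class Dual:
    """A dual number a + b*eps, with eps**2 = 0."""
    def __init__(self, a, b):
        self.a, self.b = a, b   # value part and "derivative" part

    def __add__(self, other):
        return Dual(self.a + other.a, self.b + other.b)

    def __mul__(self, other):
        # (a1 + b1 eps)(a2 + b2 eps) = a1 a2 + (a1 b2 + b1 a2) eps
        # -- exactly the product rule, with no limits anywhere.
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)

x = Dual(3.0, 1.0)   # the variable x at the point 3, with D(x) = 1
f = x * x            # x^2
g = x * x * x        # x^3
fg = f * g           # x^5
print(fg.a, fg.b)    # value 3**5 = 243.0, derivative 5 * 3**4 = 405.0
```

This is the idea behind forward-mode automatic differentiation, and the multiplication rule above is precisely the "derivation" axiom the professor was using.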
Sort of, for well-defined functions (for some definition of well-defined). I remember seeing an example in analysis which showed that it was possible to get some nonintuitive behavior in functions.
You might enjoy "Counterexamples in analysis" [1], it shows, for example, a two argument function for which the order of integration matters, while the geometric interpretation would suggest otherwise. A must have!
Yeah, that brings up another of those peculiar things about math, and especially, dealing with math culture. It's hard to talk about a main idea without getting mired in weird exceptions.
There are indeed precise conditions that guarantee that a function can (a) be integrated and (b) the integral can be differentiated, and there are subtly different definitions of "integral" that lead to different sets of extremely weird functions being integrable or not. I can't remember any of the details despite having come across them many times.
In software culture, it seems much easier to talk in broad strokes when appropriate and in precise details when appropriate. You might get an argument, but it's usually about something relevant.
Duh. It just hit me. Math is mostly edge cases. Consequently, the urge to have perfect logical validity (same as the urge to write a bugless program) requires you to spend most of your attention on the edge cases.
That urge is stronger in mathematicians than in software engineers.
I have a similar, but slightly different problem. While I know that CS and programming knowledge can in general be translated into useful math knowledge, I always mistrust my math. I suspect that I don't understand the math correctly precisely because I do understand it, if that makes any sense. The weird terminology and the reputation for being way hard always erode my confidence in my understanding.
I've found that mostly it's just that mathematics is usually taught extremely poorly. Mathematicians and mathematics educators typically gloss over definitions and use terrible notation. Further compounding the problem, the mechanical aspects of mathematical computation are rarely formally defined -- talking about elementary operators and objects of a mathematical subfield is rare.
I've recently discovered abstract algebra and I love it. We need to find and celebrate good explanations of mathematics both to do better work and to give future generations the mathematical education we didn't get.
I'm asking because I found myself deeply disappointed with it when I took it in school. It seemed to me just a haphazard collection of definitions and puzzles. Why, for example, does anyone care how many non-abelian groups of order 18 there are? I found myself unable to retain much of anything, due to the lack of any coherent explanation of "What is the point?"
In computers, by contrast, no one hides the point. Searching, sorting, bits and bytes, languages that make it easier to express things without making mistakes—there's no mystery about the purpose or importance of anything.
I wonder, if someone showed me what is attractive about abstract algebra, maybe I would find myself just as addicted to math as to programming.
I love that it provides a systematic way of thinking about the structures I am working with in code and proving properties of said structures. It's amazingly powerful to say "type t defines a monoid with operation k: t -> t -> t and identity Nil" because it comes with a whole slew of properties and theorems. It meshes very well with strongly typed functional languages that support algebraic data types (e.g. Haskell and OCaml).
I don't much care for theoretical abstract algebra (e.g. non-abelian groups of order 18) and I am by no means a master (or even an amateur!) but I find abstract algebra's concepts useful and I hope to learn more!
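The payoff of the monoid claim is that one generic fold works for every instance. A hedged Python sketch (the helper name is made up; Haskell's real mconcat is the same idea): any associative binary operation with an identity lets you combine a list of values without caring about grouping or empty input.

```python
from functools import reduce

def mconcat(op, identity, xs):
    """Fold any monoid (op must be associative, identity its unit)."""
    return reduce(op, xs, identity)

# The same one-liner handles sums, string concatenation, dict merging, ...
print(mconcat(lambda a, b: a + b, 0, [1, 2, 3]))                  # 6
print(mconcat(lambda a, b: a + b, "", ["fo", "o"]))               # foo
print(mconcat(lambda a, b: {**a, **b}, {}, [{"x": 1}, {"y": 2}])) # {'x': 1, 'y': 2}
```

The identity element is what makes the empty-list case safe, and associativity is what lets parallel or incremental implementations regroup the work freely.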
I know nothing of abstract algebra. In fact I don't even know what it is :). I do however view mathematicians the same way I see the programming languages crowd: they like to play with these ideas and extend them for their own sake. They solve these puzzles and show that this principle is true, or that that group of formulae can be generalized like so. It's up to the rest of us to notice when a particular problem fits some solution they have provided. Another part of this is: math is like alpha, and computer programming is Blub, at least according to the math guys.
My high school math teacher refused to teach calculus. We spent year after year on algebra, trig, geometry, and did proofs. Lots and lots of proofs. He told me not to worry about when I went to college and studied calculus as I would be much better at it than the students that studied calc in high school.
Smart man, he was right. The first two weeks in my first college calc class I felt a little uneasy and understood that I was the only person in class that had not studied this stuff in high school. After those first two weeks though, it was not a problem and I swept through 6 calc classes without a sweat.
20 years later of course, I'm a math idiot again...use it or lose it ;)
I enjoyed this post a lot because I traveled the opposite path. I started out in computer science with little interest whatsoever in math all through high school.
Then when I got to college they introduced the notion of "program correctness" where you tried to prove, mathematically, that your computer programs were correct. This convinced me that computer science was simply a knock-off approximation of mathematics and I drifted away from it.
From your derivative example above I’d say it is this inexact nature of computer science that you especially like. Different strokes for different folks, I suppose.
It's not "different strokes," it's a change of perspective causing massive confusion all 'round. Computer science isn't a knock-off approximation of mathematics, it's a subfield of mathematics. Programming, which is what the post is talking about, is a third thing. The derivative is the inverse of the indefinite integral (and has the same arity), but the function he refers to computes a definite integral, which is not the same thing. Etc.
Confusion about this stuff is a sign that you didn't really understand it the first time. Believe me, I've been (am still) there...
There is even more confusion out there.
Think of abstract algebraic structures.
Who needs groups?
No one. Until one realizes that certain simple equation-transformation rules are not based on the natural numbers or the real numbers, but on groups. Once something like that clicks in your brain, you can suddenly solve weird equations over sets and symmetric differences, or bitstrings and XORs, or other pretty awkward things that looked really scary earlier.
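For instance, bitstrings under XOR form a group in which every element is its own inverse, so the equation a ^ x == b is solved by ordinary group-theoretic "division": apply a's inverse, which is a itself. A tiny sketch (the values are arbitrary):

```python
# 8-bit strings under XOR: identity is 0, and a ^ a == 0 for every a,
# so each element is its own inverse.
a, b = 0b1011_0010, 0b0110_1101
x = a ^ b            # "divide" by a, i.e. apply a's inverse (a itself)
assert a ^ x == b    # the equation a ^ x == b is solved
print(bin(x))
```

The scary-looking "solve for x" step is exactly the same move as solving a + x = b over the integers; only the group changed.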
Who needs rings and semirings?
No one, until one realizes that a certain algorithm requires a structure... and this structure is a semiring! Thus, if one can prove that two operations and a set form a semiring, one can apply this algorithm without any effort, because it will just work! :)
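A minimal sketch of that "it will just work" effect: one generic matrix product, parameterized by a semiring's (add, mul, zero). Over the tropical semiring (min, +) it computes shortest paths; over the ordinary semiring (+, *) the identical routine counts paths. (All names and the example graph here are invented for illustration.)

```python
import math
import operator
from functools import reduce

def mat_mul(A, B, add, mul, zero):
    """Matrix product over an arbitrary semiring (add, mul, zero)."""
    n = len(A)
    return [[reduce(add, (mul(A[i][k], B[k][j]) for k in range(n)), zero)
             for j in range(n)] for i in range(n)]

# Tropical semiring (min, +) on a weight matrix: squaring it gives
# shortest paths using at most two edges.
INF = math.inf
W = [[0, 1, 10],
     [INF, 0, 2],
     [INF, INF, 0]]
W2 = mat_mul(W, W, min, operator.add, INF)
print(W2[0][2])   # 3, via the path 0 -> 1 -> 2, beating the direct edge of 10

# Ordinary semiring (+, *) on an adjacency matrix: the same routine
# now counts two-edge paths instead.
A = [[0, 1, 1],
     [0, 0, 1],
     [0, 0, 0]]
A2 = mat_mul(A, A, operator.add, operator.mul, 0)
print(A2[0][2])   # 1 two-edge path: 0 -> 1 -> 2
```

Nothing in mat_mul knows about graphs; proving the semiring axioms is the entire price of admission.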
Or, even more: who needs the theory of catamorphisms, anamorphisms, and such? No one. Until one realizes how beautiful recursive data structures are and how easy it is to program with them once you understand the idea behind them (check http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.41.1... for a nice paper about this).
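Roughly, a catamorphism is that paper's generic fold ("bananas"): one recursion scheme over a data structure, specialized by the functions you plug in. A loose Python approximation (the tuple encoding of trees is ad hoc, for illustration only):

```python
def cata(leaf, node, tree):
    """Catamorphism over a binary tree encoded as ("n", left, right) or a leaf value."""
    if isinstance(tree, tuple):
        _, l, r = tree
        return node(cata(leaf, node, l), cata(leaf, node, r))
    return leaf(tree)

t = ("n", ("n", 1, 2), 3)
print(cata(lambda x: x, lambda a, b: a + b, t))   # sum of leaves: 6
print(cata(lambda x: 1, lambda a, b: a + b, t))   # leaf count: 3
```

The recursion is written exactly once; summing, counting, pretty-printing, etc. are just different (leaf, node) algebras handed to the same fold.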
Who needs integration (besides those numerical people)?
No one, until one realizes that any simulation, and most iterations, are discrete integrations in very, very awkward and convoluted algebraic structures.
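For example, a bare-bones free-fall simulation loop is just forward-Euler integration: acceleration accumulates into velocity, velocity accumulates into position (constants picked arbitrarily for illustration):

```python
dt = 0.01
pos, vel, g = 0.0, 0.0, -9.8
for _ in range(100):      # simulate 1 second of free fall
    pos += vel * dt       # discretely integrate velocity -> position
    vel += g * dt         # discretely integrate acceleration -> velocity
print(round(pos, 2))      # about -4.85; the exact -g*t**2/2 = -4.9
```

The gap between -4.85 and the exact -4.9 is the discretization error, which is exactly what the "numerical people" spend their time taming.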
Ach, by now I somehow wish I had studied math before studying computer science.
Yeah, I tend to say "computer science" when I mean "theoretical computer science." To me there is a clear split between the stuff that's math/logic and the rest of it which I'm happy to lump under the catch-all of engineering (and don't feel comfortable labeling as any sort of science).
I'm not comfortable labeling the concepts behind operating systems, networks, computer architecture and programming languages as either math or engineering.
I think what he's getting at is that the math taught in high school and applied college courses isn't fully formalized, often isn't fully explained, and even (as taught) contains ambiguities. I was a math major (and took graduate level analysis, topology, and set theory as an undergrad,) but the applied math classes I took felt really rushed and confusing. I actually felt kind of lost and dumb in sophomore DiffyQ, and what's worse, I was surrounded by engineering students who were having no problems. I sometimes felt dumb studying Lebesgue integration and transfinite induction, but I never felt lost like I did in DiffyQ. It didn't feel like math at all; I loved math but hated sophomore DiffyQ.
My experience with math in school is that it was taught as if it were modern art: a bunch of meaningless procedures (up through calculus) and meaningless puzzles (beyond calculus).
If it were taught as providing some insight or serving some purpose, it might be a lot easier to learn and retain.
[1] http://images6.cafepress.com/product/96599546v11_350x350_Fro...