If we had a GOF-style MathPatternLanguage, maybe, one day, not impossibly far in the future, a little girl will pull a doll off a toy store shelf, press its tummy, and hear, "MathIsEasy!".
Is this achievable? If not, why not?
Has the GOF made programming "easy"? No, so why should some similar idea somehow applied to mathematics make mathematics easy?
Patterns have made understanding programming much easier. I remember describing an observer pattern to a bemused gaggle of junior programmers back in the day. It took a week for some of them to catch on. Now, of course, you can toss the thing off with a single word. Programming isn't easy, but it's a lot easier than math.
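For anyone who never had the week-long version of that explanation, here is a minimal, illustrative sketch of the observer pattern in Python. The class and function names are mine, not from any particular library:

```python
# Minimal sketch of the observer pattern: a subject keeps a list of
# observers and notifies each of them when something happens.
class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)   # each observer is just a callable here

log = []
subject = Subject()
subject.attach(log.append)                        # observer 1: record events
subject.attach(lambda e: log.append(e.upper()))   # observer 2: shout them
subject.notify("state changed")                   # both observers see it
```

The whole idea, which once took a week to convey, fits in a dozen lines once you have the word for it.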
Looking at the two activities side by side, it's not at all obvious why math should be so much harder. Its history isn't much longer. Its proofs aren't much more complex than programs. We know code is easily obfuscated; who's to say math isn't just obfuscated?
Look, here's a concrete example:
AlexanderAbian's "TheTheoryOfSetsAndTransfiniteArithmetic" is a very, very easy book. It's still real math. A lot of the reason it's easy is Abian's superb English description of the contents. You can actually read the English without the symbology and get about 80% of the book. And then the meaning of the symbols is just obvious. If it can be done once, why can't it be done all the time?
This is all quite silly. Some programming is easy, some is hard. Some math is easy, some is hard. Each can be very easy, and each can be very difficult. "Math is just obfuscated"? Please. Of course, a great deal of mathematics is written badly, just as a lot of programs are badly written. But much of the hardest mathematics is indeed very, very hard. That is, hard for normal people such as you and me to understand. You cannot blame mathematics (and mathematicians) for your own failings and hope for some "silver bullet". In the future, should we develop substantially better tools for explaining mathematics, more of us will be able more easily to understand more and more mathematics, should we want to, and be willing to put some effort into it. But much of it will remain very difficult, if only because of a kind of Parkinson's Law.
I wonder if someone who apparently cannot understand the argument that the halting problem is undecidable, and apparently thinks he has found flaws in what is a pretty simple proof, has really "gotten" 80% of a book such as Abian's.
Now now, that's not a fair direction you're going in. I agree with him that math is harder than it has to be, and could be improved. Of course, I'd also agree with you and others who say much of it is inherently difficult, but the two things can be true simultaneously. -- Doug
Please. He didn't say just "math is harder than it has to be". He suggested some silver bullet panacea, which isn't even a silver bullet panacea for programming to begin with. And I'd like to see how a "pattern language" for mathematics could prevent the perverse misunderstanding of the proof that the Halting Problem is unsolvable that we find on GeneralHaltingProblemProblem.
We can only pray that something could prevent such things. :-)
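For reference, the proof being discussed really is short. Here is a hedged sketch of the diagonal argument in Python; `halts` and `refute` are illustrative names, and the real theorem concerns Turing machines, not Python functions:

```python
# Sketch of the diagonal argument: suppose halts(p) could decide whether
# the zero-argument program p halts. Build a program that does the opposite.
def refute(halts):
    """Given any claimed halting decider, return a program g that the
    decider must get wrong, plus the decider's (wrong) prediction."""
    def g():
        if halts(g):
            while True:      # loop forever exactly when halts says "halts"
                pass
        # ...and halt immediately exactly when halts says "loops"

    return g, halts(g)       # inspect the prediction without running g

# Two naive "deciders"; each is refuted by its own diagonal program.
g_loops, p1 = refute(lambda prog: True)    # predicts "halts", but g_loops loops
g_halts, p2 = refute(lambda prog: False)   # predicts "loops", but g_halts halts
```

No possible `halts`, however clever, escapes this construction, which is the whole content of the undecidability proof.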
Um, thanks for that, I think. I understand Turing's proof just fine but see no reason the thing can't be poked at, nor had fun with. Since I am no mathematician, and don't pretend any such thing, I won't be shamed by accusations of asking stupid questions. I'd be ashamed not to ask stupid questions. Anyway the links to Jacquette/Meinong that fell out of GHPP were unknown and useful to me, thanks much to mr squarebrackets, whoever he is.
[[If you are saying your egregious misunderstandings of the proof of the unsolvability of the HaltingProblem are all feigned, that's okay. I am certain there are people who really do have these egregious misunderstandings, so it doesn't hurt to have a version of them up there, with explanation of why they are wrong, even if the author is just joking about it. Anyway, thank God you don't really believe the baloney you wrote! As for Jacquette, he is a useful example of a crackpot who really does think he has produced great, radical, solutions to something he cannot understand. He for one is not joking, I am pretty sure.]]
I haven't had time to read him yet, but find intuitionism attractive for obvious reasons. So I'm grateful for the pointers, substanceless or not. As to feigning, no, that page started by stating that the argument was semi-humorous, and I expected holes to be poked without much trouble. It is ridiculous to expect that Turing's A-machine proofs are flawed when a great many excellent minds have had generations to find any such flaws. The argument was put up for my own education and for the entertainment value. I presumed that claiming Nobels in mathematics for ranty wiki pages was obviously self-deprecatory. Plainly I presumed wrong, mea culpa.
[[No no! Don't read Jacquette unless you want to do research on cranks and error. If you can specify what it is you are after, perhaps someone can provide a good reference.]]
That's pretty much what I'm trying to do, but I need to lay down a lot of context before I can get at the core of my interest.
ThereIsNoInfinity, ThereAreNoPoints, etc., are obviously cranky, or at least so left-field as to seem so, but I know of no straight math that goes there.
[[A-machines? We are just talking about recursive functions, Turing Machines, etc., not "A-machines". In terms of computational power, there is no other kind of machine, save these machines augmented by oracles. And "great minds" have not been examining the transparently correct proofs of the unsolvability of the HaltingProblem for generations. The issue was closed virtually from the start.]]
You misunderstand me. I did not say "great minds". I said "excellent minds". Every undergrad that's exposed to the proof wrestles with it. Anyway I continue to assert that the proof is affected by FrameProblems, but this goes to its relevance to physical computers, not its validity in original context.
As to C-Machines, please refer to the original Turing paper and let me know if you've heard that someone subsequently proved that either all C-Machines are O-Machines, or that all C-Machines are A-Machines, or that other computability proofs exist for C-Machines.
As for panaceas, silver bullets, and so on, it's no crime to use a compelling WikiName, first, to see whether someone has already done a math patterns book, and second, to draw folk out to try to explain why they think math should be hard; the parts I understand always seem easy to me, and the parts I don't always seem hard. From this it seems to me that it's not math that is hard, but reading math that is hard. If a MathPatternLanguage existed, I think it would help a lot. At least I think it would help me a lot.
[["Math", really, is doing math, and this generally is going to be harder than just reading math. So comparatively speaking, you have it backwards: math is hard, reading math is easy. However, again, reading math could definitely be made easier than it is, possibly with something sort of like a pattern language. But that is not going to make reading math easy. There is also "writing math", which is going to be hard if you want "reading math" to be easier.]]
I like this distinction, but consider that far fewer people write than read anything - proofs, architectures, stories, you name it. All creative work is hard, but I believe MathIsHard, at least in the Barbie sense, because of the difficulty of reading it, not writing it.
[[I doubt this. Maybe in each genre fewer people read than write, but "far fewer"? A lot of code is read only by its author. A lot of academic papers are probably only read by the reviewer (and then only cursorily), the author, and a few of his friends (if he is lucky). Poetry? All sorts of people write stuff they call "poetry" which, at best, they can probably force a friend or two to read...]]
I can trivially prove a special case of a known general proof. But it's not creative for me to do this. I can write twaddle on wiki, but that's not creative either. So for the various items of code that are really just cut-and-paste parameterizations of well-known patterns, it takes nothing creative to write them. And most stuff called poetry too. Consider, to bring an example close to home, Ward's original 300-line wiki engine. Almost anyone can read it. And many can trivially adorn it with different feature sets. But no one else could create it. Stallman's GPL is another good example. And so on.
Another example: the semi-excellent hippo school books WhoIsFourier and so on. These explain quite a lot of hard concepts by addressing them almost exclusively to junior high students. And also having them do the writing. The same bunch of brilliant pedagogues attempt to teach 16 human languages all at the same time. Apparently with success. The reason they started writing math/physics books is that one day it occurred to them that math is itself a human language.
- Very interesting! Never heard of this before. The organization is "transnational college of lex" http://www.lexlrf.org/college/ . Book "Who is Fourier": ISBN 0-9643504-0-8; "Anyone can speak 7 languages!": (ISBN 0-9643504-0-8) -- link doesn't work because Amazon never heard of it (the link works when the hyphens are there), but the ISBN seems to be correct; Barnes & Noble heard of it but says it's out of print; bookfinder says no one has a copy; is there a way to get it? Ah, found it at www.reiters.com, only ten bucks new, btw. They wrote other books as well, see web site.
- Cool. For years I've thought that all subjects should be taught as history and these books seem to apply that idea. I've found that it's much easier for me to remember facts about math, physics, politics, etc., if I learn about the people who discovered/invented them. My favorite kinds of math hang off of stories about Cantor, Hilbert, Turing, Russell, Whitehead and Gödel.
- Subjects taught as history lost me faster than anything else. I didn't want to know about the people, or their historical context. I wanted to see the ideas and to catch a glimpse of the genius. Telling me who did what turned me off. Each to his own. I have changed with age and now find it useful, but if I had been taught maths as history I would not now be doing a subject I still love.
- Hmmm. I find I learn a word better when I know its etymology (word history) as well as its meaning and usage. I wonder if that is related?
I think it depends on what you want done, or what you mean by easy. We have supercomputers that can calculate several complex things at once. But what is the goal, before the wanting? I think "easy" could be one of those relative terms... Precisely what you need to do? ;)
Math is nothing but patterns. See AbstractAlgebra and Calculus for two examples.
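To make the "patterns" claim concrete: the group axioms of AbstractAlgebra are one such recurring pattern. A small, illustrative brute-force check in Python (the function name and the examples are mine, chosen for simplicity):

```python
# The group axioms treated as one reusable "pattern": closure,
# associativity, identity, inverses -- checked by brute force
# over a finite carrier set.
def is_group(elements, op, identity):
    elems = list(elements)
    closed = all(op(a, b) in elems for a in elems for b in elems)
    assoc = all(op(op(a, b), c) == op(a, op(b, c))
                for a in elems for b in elems for c in elems)
    ident = all(op(identity, a) == a == op(a, identity) for a in elems)
    inverses = all(any(op(a, b) == identity for b in elems) for a in elems)
    return closed and assoc and ident and inverses

# The same pattern recognizes (and rejects) very different structures:
addition_mod_5 = is_group(range(5), lambda a, b: (a + b) % 5, 0)
mult_mod_4 = is_group(range(4), lambda a, b: (a * b) % 4, 1)  # 0 has no inverse
```

One definition, recognized over and over in new settings, is exactly what "pattern" means here.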
- I think you are using "math" in the sense that mathematicians would call "arithmetic". There is a huge gulf between doing high school algebra (which mathematicians would call "arithmetic") and proving FermatsLastTheorem or the RiemannHypothesis. These are not "nothing but patterns".
See also:
MathIsHard