Roman Werpachowski says
Now, what’s wrong with FORTRAN, eh?
Dayv says
Is there a more clear version of this chart available? This one looks like it’s been resized without much respect for the type.
PZ Myers says
I struggled with that — the original looks like something that was done with an ancient dot-matrix printer. Maybe Rich can send me a clearer image?
Bronze Dog says
Think I might try making my own version tonight or something.
CCP says
ooooh…FORTRAN….those stacks of IBM cards…that greasy paper tape…oooooooooooh
JoeB says
I feel sort of gypped regarding my FORTRAN experience. I’m young enough that there were no punch cards involved, so all I got was a silly language with too many words.
That being said, flowcharting sucks. It is terrible, and I hate it.
Kagehi says
You know.. The sad thing is, I wish I had some experience with Fortran.. It might help me figure out why the heck this:
http://www.ngs.noaa.gov/PC_PROD/UTMS/
Doesn’t work when I try to convert it to VB (heh, it’s the one I have a compiler for). Of course, the other question is why they’re still using a Fortran version of it at all, when there are millions of handheld GPS units around that can do it in something more sane, like C++ or even Java. Then again, I recently read an article somewhere in which a programmer went back to visit researchers in various places to see what they were using to code, and discovered that they had “never” heard of CVS, IDEs, etc. They tended to code in plain text editors with no version tracking, with an absent-mindedness that left code from ten versions prior in the latest compile because no one could find the most recent changes, and with a general feeling that if these are the people developing the newest sciences, the only reason they haven’t caused a catastrophic planetary implosion is that they can’t get the code to run in the first place. lol
quork says
I don’t see any connection to FORTRAN, which is quite good at FORmula TRANslation. Flowcharting outlines the process of an algorithm, it does not rely on the specific syntax of any particular programming language. You could use flowcharting with newer languages like C++ or Perl, and I know some people who should.
Dustin says
If the DI wants me to accept intelligent design, they need to harness, in a way analogous to the way we can harness physics to build things, the powers of intelligent design to build me a God-powered Final Exam Grading Engine.
Now that would be cool. Since I don’t have one, though… *picks up red pen again*
PaulC says
The one good thing about Fortran is that it has complex numbers as an intrinsic type. You don’t need them for a lot of applications, but when you do (e.g. for FFT) it’s a pain to have to reinvent the wheel. In C++ you can at least create overloaded operators for them, but it’s just weird that they get second-class status. Real numbers aren’t even algebraically closed for crying out loud, and complex numbers, far from being esoteric, give you a natural implementation of planar transformations such as scaling, rotation, and translation.
Though my background is CS and I have a big preference for Java these days, I can understand why scientific programmers have held onto Fortran so tenaciously. It’s the only programming language that seems to have been designed by someone who studied any continuous math.
Another thing I’d like to see is an intrinsic “int mod n” type that would work with +, -, *, and / but restrict values to the range 0 through n-1. Again, you don’t need it that often, but when you do, it beats pulling your hair out over the bogus not-really-a-mod-operator % that can return negative numbers.
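For concreteness, here is a minimal C++ sketch of both wishes (purely illustrative; std::complex is standard, but the ModInt wrapper is hypothetical). std::complex gives ordinary infix arithmetic on complex values, and ModInt keeps every result in the range 0 to n-1. Division is omitted, since a genuine modular / needs a modular inverse.

#include <complex>
#include <iostream>

// Hypothetical wrapper: results are always normalized into 0..N-1,
// unlike the built-in %, which can hand back negative values.
template <int N>
struct ModInt {
    int v;
    ModInt(long long x = 0) : v(static_cast<int>(((x % N) + N) % N)) {}
    friend ModInt operator+(ModInt a, ModInt b) { return ModInt(a.v + b.v); }
    friend ModInt operator-(ModInt a, ModInt b) { return ModInt(a.v - b.v); }
    friend ModInt operator*(ModInt a, ModInt b) { return ModInt(1LL * a.v * b.v); }
};

int main() {
    std::complex<double> z(3.0, 4.0), i(0.0, 1.0);
    std::cout << z * i + z << '\n';                 // infix complex arithmetic: (-1,7)

    ModInt<12> ten(10);
    std::cout << (ten - ModInt<12>(25)).v << '\n';  // 9: wraps instead of going negative
    std::cout << (-7 % 3) << '\n';                  // -1: the built-in % complaint above
}

In Fortran, the complex half of this comes for free as the intrinsic COMPLEX type.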
idlemind says
However, Fortran would be more useful for spatial transformations if it had a quaternion type.
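For the record, a rough sketch of such a type, in C++ rather than Fortran (a made-up Quat struct, not anyone's library): the Hamilton product, plus rotation of a 3-vector v by a unit quaternion q as q · (0, v) · conj(q).

#include <array>
#include <iostream>

struct Quat {
    double w, x, y, z;
};

// Hamilton product of two quaternions.
Quat operator*(const Quat& a, const Quat& b) {
    return {a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
            a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
            a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
            a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w};
}

Quat conj(const Quat& q) { return {q.w, -q.x, -q.y, -q.z}; }

// Rotate v by q, assuming q is a unit quaternion.
std::array<double, 3> rotate(const Quat& q, std::array<double, 3> v) {
    Quat p{0.0, v[0], v[1], v[2]};
    Quat r = q * p * conj(q);
    return {r.x, r.y, r.z};
}

int main() {
    // 90-degree rotation about the z axis: q = (cos 45deg, 0, 0, sin 45deg)
    const double s = 0.7071067811865476;
    auto v = rotate(Quat{s, 0.0, 0.0, s}, {1.0, 0.0, 0.0});
    std::cout << v[0] << ' ' << v[1] << ' ' << v[2] << '\n';  // ~0 1 0
}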
ekzept says
… looks like something that was done with an ancient dot-matrix printer
HEY! i still own, use, and love my dot-matrix printer. even bought a new one when the old one finally died. i also have an ink jet, but the dot-matrix is for random stuff and code. its cartridges are cheap. of course, it’s an Epson.
ekzept says
Real numbers aren’t even algebraically closed for crying out loud, and complex numbers, far from being esoteric, give you a natural implementation of planar transformations such as scaling, rotation, and translation.
yeah, but if you do all your calculations in COMPLEX (whether it’s in FORTRAN or not), roundoff and similar creep gives purely real numbers gradually larger imaginary parts, even if these parts begin at zero. hence the continuing need for REAL.
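A toy illustration of that creep: rotate 1+0i through a full turn in 360 small complex multiplications. On paper the result is purely real again; in floating point a tiny imaginary part typically survives.

#include <cmath>
#include <complex>
#include <iostream>

int main() {
    const double pi = std::acos(-1.0);
    const int n = 360;
    // One step of 2*pi/n radians as a unit complex number.
    const std::complex<double> step = std::polar(1.0, 2.0 * pi / n);

    std::complex<double> z(1.0, 0.0);
    for (int k = 0; k < n; ++k) z *= step;

    // Exactly 1+0i on paper; in practice something like (1, ~1e-15).
    std::cout << z << '\n';
}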
Rich says
Boo!
I go to all that effort and you all g33k out on Fortran and mock my crappy formatting. Listen. I’m building on the work of the Fig Newton of Information Theory, so I can clearly sell a few books based on it and possibly get DaveScott to do a blog for me if I’m lucky.
Just don’t ask me to testify in court based on this!
*sulks*
RavenT says
Would it be fair to say that FORTRAN is more suited to analysis, while Lisp is more suited to algebra, Paul?
PaulC says
ekzept:
I think reals are nice to have around, for a variety of reasons, but it’s not obvious that they should be the default.
Arun Gupta says
IDists specialize in the purely imaginary.
ulg says
Folks, this is a science blog. There is nothing scientific about tiresome old discussions of the relative (de)merits of various programming languages or various algorithm diagraming conventions.
PaulC says
Oh hooey. It’s no less scientific than debating the sex of a skeletal human octopus hybrid. Some would say it’s more tiresome, but that’s a matter of taste. Anyway PZ brought up Fortran.
PaulC says
Oops, hybrid came out when I swear I typed chimera. My bad.
Depends on what you mean by analysis and by algebra. If you mean numerical analysis vs. symbolic algebra you might have a point (Macsyma was originally written in Lisp, wasn’t it?)
But it’s a false dichotomy. S-expressions are good. Complex numbers are good. Both ought to be available at your fingertips in the same programming language.
Greco says
IDists specialize in the purely imaginary.
Imaginary problems, imaginary objections, imaginary conspiracies, imaginary slanders… We sure can’t slam them for lack of imagination.
ekzept says
Macsyma was originally written in Lisp, wasn’t it?
yes, in Maclisp, same thing i wrote the program for my Master’s in. see the origins of LISP-based computer algebra.
Folks, this is a science blog. There is nothing scientific about tiresome old discussions of the relative (de)merits of various programming languages or various algorithm diagraming conventions.
putting Wolfram’s ideas aside for a moment (or forever AFAIC), most science is being heavily influenced by the availability of large scale and inexpensive computation, if not by ideas and models stolen from discussions and analyses of computation. where does science end and computation begin? what is the difference between mathematics and computation? there is one.
RavenT says
Cheezit, it’s the Science Police!
That’s precisely why I asked the question. Since my research is in modeling evolutionary concepts in my comparative anatomy system, goodness of fit between the reality and the modeling environment is always a concern. So Paul’s observation about the nature of different programming languages in relation to mathematical modeling is useful and interesting to my research, no matter whether or not the Science Police approve.
Yes, that was the distinction I was drawing, not intending to create a false dichotomy, but to clarify whether that was what you meant. Thanks!
archgoon says
PaulC wrote:
Another thing I’d like to see is an intrinsic “int mod n” type that would work with +, -, *, and / but restrict values to the range 0 through n-1. Again, you don’t need it that often, but when you do, it beats pulling your hair out over the bogus not-really-a-mod-operator % that can return negative numbers.
Actually, Python implements this (as well as including complex numbers). That turned out to be somewhat of a problem: I was working on a project that I prototyped in Python and later moved over to C. Since I was wrapping values around in an array, I was getting these seg faults and didn’t know why.
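A guess at the kind of bug involved (illustrative only, not the original project): Python's % follows the sign of the divisor, so a wrapped index stays in range, while C and C++ truncate toward zero and hand back a negative index. A small floor_mod helper restores the Python behaviour.

#include <cstdio>

// Floor-style modulus, matching Python's behaviour for a positive n.
int floor_mod(int a, int n) { return ((a % n) + n) % n; }

int main() {
    int ring[5] = {10, 20, 30, 40, 50};
    int i = -1;                                   // "one step back" around the ring

    std::printf("%d\n", i % 5);                   // -1 in C/C++; Python's -1 % 5 is 4
    // ring[i % 5] would read ring[-1]: out of bounds, hence the seg faults.
    std::printf("%d\n", ring[floor_mod(i, 5)]);   // 50, the wrap-around actually intended
}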
G. Tingey says
AAARRRRGGGGHH !!!!!
I remember all this.
I’ve still got ONE old FORTRAN Punch-Card here (for old times’ sake) – you know – or don’t you?
80 characters per line, and one line per punch-card.
The first computer I ever used (about 1972) was old then, and actually had real Core store.
Total memory probably only about 20Kbytes …..
And made by IBM.
Were those the days?
Probably not.
Roman Werpachowski says
You know.. The sad thing is, I wish I had some experience with Fortran.. It might help me figure out why the heck this:
Doesn’t work when I try to convert it to VB (heh, it’s the one I have a compiler for). Of course, the other question is why they’re still using a Fortran version of it at all, when there are millions of handheld GPS units around that can do it in something more sane, like C++ or even Java.
1. there are free Fortran compilers available
2. people use Fortran because there are loads and loads of great numerics code written in it, available for free
3. Fortran will surely be more efficient than Java.
Torbjörn Larsson says
Complex numbers seem to be becoming more of a default capability. I saw some note on a 2nd-gen APL with complex numbers.
“IDists specialize in the purely imaginary.”
Yes. It has turned out that irreducibly complex formulations contain nothing real. They are actually irreducibly imaginary, or I^2.
Since I^2 = -1, this has a negative impact on the theory. This negative, or phase-shifted, impact means of course that we have a negative feedback loop. Part of the input will be routed back to the input, precisely as Rich has hypothesized above.
It is of course nothing remarkable in that applying the new improved explanatory filter on the theory of ID itself yields the theory of ID. Garbage in means garbage out means empty theories, as FORTRAN already taught us.
Keith Douglas says
I escaped learning FORTRAN, though ML did screw with my head, so I didn’t manage to escape all oddball languages.
What I want is a hypercomputer and a suitable hyperprogramming language …
sockatume says
Let’s see if we can do it in BASIC.
10 $SCIENCE=0
20 INPUT $SCIENCE
30 IF $SCIENCE=$BELIEFS THEN GOTO 50
40 GOTO 10
50 PRINT “Proof of “;
60 PRINT $BELIEFS
70 GOTO 10
Can someone debug that for me? I know I should’ve used GOSUB, but I was feeling lazy.
Torbjörn Larsson says
“Part of the input will be routed back to the input, precisely as Rich has hypothesized above.
It is of course nothing remarkable in that applying the new improved explanatory filter on the theory of ID itself yields the theory of ID.”
Ouch! A little hasty there. I meant to write: “Part of the output will be routed back to the input, precisely as Rich has hypothesized above.
This result is of course nothing remarkable, applying the new improved explanatory filter on the theory of ID itself should also yield the theory of ID.”
arensb says
Everything I know about Fortran, I learned from reading the Fortran Coloring Book.
arensb says
Meh. So does C, these days, to say nothing of Python and a slew of other languages I’m forgetting.
No, the one good thing about Fortran was the ability to redefine the number 2. That, and computed GOTOs.
Two good things! The two good things about Fortran are redefining integers, computed GOTOs, and self-modifying code.
Three! Three good things…
arensb says
Strictly speaking, you’re right. However, the arrows in a flowchart readily map to GOTOs, and GOTOs are widely acknowledged as being harmful.
I’m not saying that you couldn’t flowchart a structured, GOTO-less program, but it’d take more discipline, and the flowchart probably wouldn’t be as useful as, say, some prose notes.
Something like a flowchart might be useful for mapping out the high-level behavior of a finite state machine, or a program where users can switch from screen to screen and mode to mode without rhyme or reason, but IMHO it’s no longer a useful tool for churning out actual lines of code.
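One way to picture that last point (a toy sketch, nobody's production code): the boxes and arrows of a state diagram carry over quite naturally to an enum of states and a transition function, with no GOTOs in sight.

#include <iostream>

enum class State { Red, Green, Yellow };

// Each case is a box in the diagram; each return is an arrow out of it.
State next(State s) {
    switch (s) {
        case State::Red:    return State::Green;
        case State::Green:  return State::Yellow;
        case State::Yellow: return State::Red;
    }
    return State::Red;  // unreachable; keeps compilers quiet
}

int main() {
    State s = State::Red;
    for (int step = 0; step < 4; ++step) {
        std::cout << static_cast<int>(s) << ' ';  // prints 0 1 2 0
        s = next(s);
    }
    std::cout << '\n';
}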
PaulC says
arensb:
As far as I know, Java does not. I haven’t paid attention to ANSI C in years, but it didn’t have a built in complex type back when I could have used one.
I may even be 10 years out of date on my whole assessment, but the inability to simply write infix +,-,*,/ expressions with complex variables has been an annoyance in the past.
I personally have not written Fortran code since college in the late 80s and don’t have a lot of use for it. There was, however, a long span of history when it was understandable why such an archaic language was hanging on in numerical applications. If that is no longer the case, so much the better.
Graculus says
Strictly speaking, you’re right. However, the arrows in a flowchart readily map to GOTOs, and GOTOs are widely acknowledged as being harmful.
Wha? Huh?
No. GOTOs are branches, and you have to branch sometime.
Where you lost marks was for “bare” branches, which was a sign of sloppy programming, and a prime cause of code bloat.
ulg says
ekzept, RavenT, my earlier comment ‘Folks, this is a science blog …’ was intended to be humorous on the grounds that (a) unscientific discussions are nonetheless important for communities like science blogs (as PaulC immediately noted), and (b) I am not a scientist myself! (But nobody challenged me on the latter element, so I don’t get to play a second round.) It’s also sour grapes resulting from many bad experiences in other forums involving discussions on the relative merits of various programming languages. Such discussions usually start out with several people making sincere efforts to help a novice choose between a bewildering variety of available tools. Then someone thoughtlessly insults the enormous amount of effort it takes to master a useful programming language that happens to have some attribute they don’t care for. Soon after, otherwise respectable students, computer scientists (including people whose peer-reviewed work was positively discussed in classes I took), and professional software developers all end up acting like foul-mouthed schoolboys with web browsers to grind. I’m pleasantly surprised to find that my assumptions about how the discussion would evolve were wrong.
I’d try to post some kind of positive contribution, but I’m far too distracted by encounters with numerous HOPL and PPIG papers, weather forecasting and climate modeling software written in fortran, genome sequencing software written in perl, symbolic manipulation software written in lisp, condom-buffing software written in Dylan, web servers written in C, web browsers written in C++, etc, etc, etc.
RavenT says
That’s cool, ulg–and I take back my snarky comment, too.