Which language is faster?
-
I had an interesting conversation the other day and I thought I'd ask the crowd here for further enlightenment. Essentially I am supposed to take a graduate course in numerical methods starting in a week or two, so I asked about what programming would be needed and was told that I could use any language I want. Then the comment was made that a lot of engineering was still being done in FORTRAN because it was the fastest executing language.

Now, I've won bets in the past by taking some fairly "fast" FORTRAN and converting it to Pascal/C/VB, etc. and showing that my algorithms run faster than the original FORTRAN. However, in most cases it was the choice of algorithm that made the difference, not the language. I never actually went back and rewrote the FORTRAN code with an algorithm change to do a real comparison, but I'm sure that it would have been faster, too.

So, aside from interpreted languages, is there a real case to be made for FORTRAN being faster than other compiled languages on number crunching? Perhaps the floating point libraries are better optimized? My guess is there is not, but that individual compilers may vary some, even with the same language. I haven't programmed much FORTRAN for the last 30 years, so I'm hoping I don't need to go back and do too much of that. As far as I was concerned, discovering that there were languages other than FORTRAN was an epiphany!
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
It's the compiler that matters more than the language. A compiler written specifically for a particular architecture is going to do better than a generic one. Certain compilers may also work better with certain algorithms, and probably the same goes for languages - perhaps certain algorithms map better onto certain languages. In terms of language, there will probably be little difference between C, C++ and Fortran. The result will depend more on which compiler you choose and the algorithm you write than on the language.
-
It depends. You haven't mentioned (a) where you're working, (b) what kind of problems you expect to work on, (c) how familiar you already are with various languages, or (d) what tools you have available or could get approved (maybe a budget question?).

The last time I worked with FORTRAN was ~25 years ago, at university. The problem was both numerical and algebraic in nature, and FORTRAN was the standard (at universities) for everything remotely mathematical. The main reason for that, a professor once explained to me, was not that it was 'fastest' (maybe it even was for some types of problems, but the mathematicians didn't care about that all that much), but that it was 'error compatible': all universities used the same huge numerical library for all kinds of problems, and any results produced by a FORTRAN program using that library would be the same results - and the same errors! - as the original when run at other universities. That was a pretty strong point for using FORTRAN at the time: reproducible scientific results.

I've worked for more than 20 years outside university and never once came across anyone using FORTRAN. C/C++ is the industry standard for desktop applications (or maybe C# has taken the lead in the meantime - but I doubt you'll find a lot of that in applications with a strong focus on number crunching). Web applications tend to use other languages (and I am the wrong person to ask about that), but since you are dealing with numerical stuff and looking for high performance, I doubt you want to deal with any of that.

That said, while I do program lots of numerical and algebraic stuff at work, I do not have the luxury of a numerical library to help with the hard work. If I'm missing an algorithm, I have to program it myself. And even though many textbooks on numerics and algebra provide good descriptions of their algorithms, forging them into a program that is numerically stable even under occasionally exotic conditions isn't easy at all. So if you do have FORTRAN libraries available for that kind of stuff, but don't know where to start looking for similar libraries in other languages, you'd better stick with FORTRAN!

If you want an idea of what to expect when developing stable implementations of numerical algorithms, search for Jack Crenshaw and his article series on rootfinders, spread over several years. Here is a link to one of these articles
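To give a concrete flavor of the care involved, here is a minimal bisection rootfinder sketch (in C#, my own illustration rather than anything from Crenshaw's series; the test function and tolerance are arbitrary). Bisection is about the simplest method that stays stable, as long as the starting interval really brackets a root:

using System;

class Bisection
{
    // Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs.
    static double FindRoot(Func<double, double> f, double a, double b, double tol = 1e-12)
    {
        double fa = f(a);
        if (fa * f(b) > 0)
            throw new ArgumentException("f(a) and f(b) must bracket a root.");

        while (b - a > tol)
        {
            double mid = a + (b - a) / 2;   // midpoint written to avoid overflow
            double fm = f(mid);
            if (fm == 0) return mid;
            if (fa * fm < 0) { b = mid; }   // root lies in [a, mid]
            else { a = mid; fa = fm; }      // root lies in [mid, b]
        }
        return a + (b - a) / 2;
    }

    static void Main()
    {
        // Example: the positive root of x^2 - 2, i.e. sqrt(2) = 1.4142...
        Console.WriteLine(FindRoot(x => x * x - 2.0, 0.0, 2.0));
    }
}

Production-grade rootfinders add the bracketing, tolerance scaling and convergence acceleration that Crenshaw's articles go into; that is exactly the hard work a good library saves you.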
This is spot on. I did a year of Aerospace Engineering at university before switching to Computer Science. We used Fortran in engineering because it was numerically tight: it has strict standards on numerical accuracy and on replicating results independent of platform or compiler. I have never used Fortran, or heard of anyone else using it, since that switch. Regardless of speed, if numerical accuracy matters then stick with Fortran. I have seen first-hand that the same code in other languages like C/C++ can produce slightly different results depending on the compiler used, because the standards are not as strict.
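As a toy illustration of how "the same code" can drift (my own sketch, nothing from that coursework): floating-point addition is not associative, so anything that changes evaluation order - a different compiler, a different optimization level - can change the result:

using System;

class Associativity
{
    static void Main()
    {
        double a = 1e16, b = -1e16, c = 1.0;

        // Mathematically both expressions equal 1.0, but floating-point
        // addition is not associative, so the grouping changes the answer.
        double left  = (a + b) + c;   // 1e16 - 1e16 = 0, then + 1      => 1
        double right = a + (b + c);   // -1e16 + 1 rounds back to -1e16 => 0
        Console.WriteLine(left);      // prints 1
        Console.WriteLine(right);     // prints 0
    }
}

Whether an optimizer is allowed to make that kind of reordering is exactly the sort of thing the stricter numerical standards pin down.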
-
Thanks for the comments. I agree. My question was for a university course, and as I mentioned above, I'll use whatever language makes sense. I've programmed professionally in quite a few and made a living with FORTRAN many years ago.
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
-
Tight machine code runs the fastest. But I admit to a Cray-style highly parallel FORTRAN bias.
Some day I'll have to tell you how I brought the BCS Cray system to its knees back in the late 70's or early 80's ...
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
-
Walt,

To be perfectly honest, the speed of the language has ceased to be a consideration in modern computing except in very rare cases. If you're writing logic for a low-power microcontroller, or operating-system-level code where you have to squeeze every ounce of processor speed to ensure fast context switching or the like, then you might need to consider it.

I've been at this for over 35 years now. While I still believe in writing your code to be as conservative as possible with machine resources, choosing a language based on minute differences in execution speed simply doesn't make sense any more (except for the above). There are many fine languages to choose from. Pick the language based on what you want to do with it or what the opportunities you wish to pursue require.

Yes, FORTRAN (I wrote it for many years) is an extremely fast language. It was originally designed that way for scientific use. If you want to learn to code for business, I'd suggest you get into .NET. Learn C# or VB.NET. I personally prefer C# now (having written C for 15 years or so), but I wrote almost nothing but VB (VB6 and VB.NET) for about 10 years. They both get the job done. If you're going into web development, use the above (C# and VB.NET), maybe with a mixture of JavaScript or PHP.

The technology really has reached a point now where speed is not the #1 concern for a language any more. My 2 cents, -Max
Thanks Max. I agree. To clarify, I've been coding engineering and numerical apps for over 40 years and all my current stuff is .NET, mostly C#. I also use a double handful of other languages as needed, including JavaScript and PHP. The question concerns a class I have to take toward my engineering PhD, because I don't have graduate credit for numerical methods. :)
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
-
One of the fastest programming languages I have come across is PowerBasic (remember Borland's TurboBasic? This is its great-grandchild). PowerBasic has a similar syntax to Visual Basic (non-OOP code), but the people at PowerBasic are experts in Intel machine code and in counting CPU cycles (meaning squeezing out as much speed as possible). PowerBasic also has such a rich command set that there are always better ways to optimize code. In the rare instance that is not enough, you can write inline assembler right in the middle of Basic code. PowerBasic gives you the control required to optimize code to the max! The machine code generated by the compiler is probably as fast as it gets, and I am confident PowerBasic would hold its own compared to any other language.

I have been using PowerBasic for about 10 years now and am a developer of programming tools for use by professional programmers (who also use PowerBasic). PowerBasic allows me to write applications (and DLLs) which are smaller than what most languages generate, even C or C++, and the speed rivals the fastest C compilers. I can work with things like pointers, register variables, calling functions via a pointer, etc. The data types are so extensive there is always a better data type for the task at hand. PowerBasic IMO probably has the best string-handling command set of any language, and if you have to do text parsing I doubt it could be beaten.

I wrote a 2D sprite engine (100% software based, with no special hardware required) using PowerBasic which requires extremely fast manipulation of millions of pixels in DIB sections, and it can fly even on a slower CPU like the Intel Atom found in many netbooks today. I actually do testing on older PCs, like a Windows 95 PC with a 500 MHz CPU (or less). I like to see my software fly even on a legacy PC, so it will be super fast on the latest PCs. Many programming languages couldn't even be used on a 500 MHz Windows 95 PC (PowerBasic can), and they surely would not be used to write software for such a legacy PC (too slow).

Since I write tools for programmers, I have to be concerned about speed, and PowerBasic has always matched my needs. I am currently developing the next generation of my tools, which also handle 3D drawing using OpenGL. My OpenGL canvas control (yes, a real Windows control) has excellent speed interpreting the GL scripting language the control provides for 3D drawing: the control must interpret the script language and then handle all the 3D drawing via OpenGL.
Thanks for the info. I'm not enthused by BASIC, but I've used it fairly extensively, including old TurboBasic. I'll check into PowerBasic.
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
-
I used FORTRAN in the 70's for a numerical analysis course, and have used other compiled languages since. I have also coded in both mainframe and micro assembler languages. From what I remember, FORTRAN does have at least one advantage for numerical analysis: when you declare your numeric variables you can specify their precision exactly, and you can depend on being able to correctly state the number of significant digits in your calculations. I also know that when you specify the precision of a number, it is translated directly into mainframe machine code instructions tailored to that precision, making it very efficient. On non-mainframe computers this may not be the case. In C/C++ and Pascal you largely have to depend on the compiler and platform, beyond simply specifying int, long, etc., and you may get slightly different results depending on which you use. So I would recommend FORTRAN for numerical analysis, unless you think future jobs will require you to do that kind of work in a different language, in which case use that. Regarding assembler: even with compiler optimizations, direct coding in assembler can be much more efficient, but I would never recommend it to anyone unless you plan on programming microcontrollers where speed is a top requirement.
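As a rough C# illustration of how much the precision of the type alone matters (my own sketch; C# pins float and double to IEEE single and double precision, which is closer in spirit to Fortran's explicit precision declarations than old-style C was):

using System;

class PrecisionDemo
{
    static void Main()
    {
        // Sum 1/3 ten million times in single and in double precision.
        float  singleSum = 0.0f;
        double doubleSum = 0.0;
        for (int i = 0; i < 10000000; i++)
        {
            singleSum += 1.0f / 3.0f;
            doubleSum += 1.0 / 3.0;
        }

        // The exact answer is 3333333.33...; the single-precision sum
        // drifts visibly, while the double-precision sum stays close.
        Console.WriteLine(singleSum);
        Console.WriteLine(doubleSum);
    }
}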
-
Gary R. Wheeler wrote:
Modern compilers are sufficiently sophisticated in the optimizations they perform, and broad in the scope at which they're applied, that I doubt any human programmer could achieve equivalent or better results except in a small number of cases.
You're having a laugh! Compare a compiled program's size to the same program written in assembly language and I'll put money on it that it's at least 4 to 5 times the size. So all those extra, superfluous instructions are just wasting processor time in the compiled version.
Nobody can get the truth out of me because even I don't know what it is. I keep myself in a constant state of utter confusion. - Col. Flagg
I think this is the key... If you look at what constitutes "best practices" in object-oriented languages, what you need to do to get those same languages to "run fast" is to break most of the "best practices", because of the way the compiler MUST read the original code - you basically shoot yourself in the foot... So when we attempt to compare "good code" to "fast code" we rarely find any Venn overlap... Take the number of people who know that switch is faster than if...else if...else if, the prevalence of properties in all code, and the "allowed" overhead when the WPF version of an app is run on a client machine instead of the WinForms equivalent, then talk to me about optimization... :doh: .. oh, and Fortran uses GoTo.... :-\
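For what it's worth, here is the shape of that switch-versus-if comparison as a C# sketch (my own illustration; whether the switch actually wins depends on the compiler, which may or may not turn dense cases into a jump table):

using System;

class DispatchDemo
{
    // Dispatch via a chain of comparisons: up to one test per case on each call.
    static int ClassifyIf(int code)
    {
        if (code == 0) return 10;
        if (code == 1) return 20;
        if (code == 2) return 30;
        return -1;
    }

    // The same dispatch as a switch; dense integer cases can compile
    // to a jump table, i.e. one indexed branch instead of a chain of tests.
    static int ClassifySwitch(int code)
    {
        switch (code)
        {
            case 0: return 10;
            case 1: return 20;
            case 2: return 30;
            default: return -1;
        }
    }

    static void Main()
    {
        Console.WriteLine(ClassifyIf(2));      // 30
        Console.WriteLine(ClassifySwitch(2));  // 30
    }
}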
I'd blame it on the Brain farts.. But let's be honest, it really is more like a Methane factory between my ears some days than it is anything else...
-----
"The conversations he was having with himself were becoming ominous." - ... On the radio...
-
Yes, I agree, and for the class I will probably use FORTRAN if the profs want me to and if I can get a reasonable compiler. Most of my current work is in C#, though.
CQ de W5ALT
Walt Fair, Jr., P. E. Comport Computing Specializing in Technical Engineering Software
-
PowerBasic, IMO, is one of the software industry's best-kept secrets. Most online computer and developer magazines rarely ever mention it. I have been using PowerBasic for nearly 10 years now and I can write code which will run circles around stuff written by C++ or VB# programmers. I write a good bit of code which requires optimal speed, such as 2D animation, 3D OpenGL animation, image filters, etc. I write a lot of stuff that deals with graphics (e.g. customizing controls via owner-draw, my own graphics controls, etc.) and speed is critical. I can write applications or DLLs which rival the speed of any language and still fit on a floppy disk or run on legacy PCs like Windows 95/98 (with CPUs below 500 MHz). Now, my experience as a Windows API programmer gives me some advantages, but PowerBasic allows me to tap into the Windows API with an extremely fast compiler.
-
Ask yourself the same question with human languages instead of computer languages. Compared to English, Chinese sounds fast and French is backwards. Spanish and Italian are close, as are Russian and Ukrainian. Language in any book (compiled) is faster than real-time translation (interpreted). Working in human terms (base 10) involves translation from electronic calculations (base 2, or binary). Numeric computations are subject to formulae, and translated results are very subjective. Your code will probably reuse math functions from libraries. Unless one writes the entire codebase from scratch, they are scaffolding on other interpretations, perhaps in other languages, which may or may not affect overall speed. Which human language crunches numbers fastest and which human crunches numbers fastest are two very different things. You really need to specify the task to quantify any performance. Lastly, Fortran may be an excellent opportunity for personal understanding, but it is not as easy to implement and will hardly pay your bills any faster.
Dwayne J. Baldwin
-
BASIC is not fast. With BBC Basic you could include machine code in your BASIC program to speed up your application.
Quote: "BASIC is not fast" Thats what you may have been told, but I assure you Powerbasic is not only fast, but it can stand its own with most any other compiler, even C. The misnomer that Basic is slow and a poor language dates back to the time when many basics were interpreted. A compiled Basic can be just as fast and as any other language and there is nothing inherent to the Basic language which would make it hard to compile to a fast executable. PowerBasic (the company) are experts at optimizing machine code generated by the compiler. For example let's just take a look at a simple bit of code:
REGISTER x AS LONG, y AS LONG, z AS LONG
LOCAL t AS DOUBLE
LOCAL i AS LONG
x = 5
y = 3
t = TIMER
FOR i = 1 TO 10000000
    IF x > 1000000 THEN x = 5
    x = x * y
NEXT
t = TIMER - t
MSGBOX FORMAT$(t, ".0000")
This simple FOR...NEXT loop executes 10 million times, doing a simple multiplication each iteration. The PowerBasic compiler does not cheat by taking shortcuts through code it assumes does nothing, as some other compilers do (they do that just for benchmarking, to make their compiler look good); this loop really executes every iteration. So how long do 10 million iterations take on a common desktop running Windows XP with a mass-produced 2.5 GHz Celeron CPU (a single core, no multiple CPUs)? 0.094 seconds! Yes, the PowerBasic compiler generates code for the above that executes 10 million iterations in less than a tenth of a second. I don't think that is slow. Sure, some compiler out there may beat that, but PowerBasic can hold its own quite well. Add to this the richness of the language, so you can write better (faster) code, and the ability to add inline assembler when you want to, and you have a language which can create amazingly fast executables. I should know: I have been using it commercially, writing tools for programmers, for about 10 years.
-
ely_bob wrote:
.. oh, and Fortran uses GoTo....
Which is what makes it fast. No loading up the stack/heap and saving registers for performing those nasty subroutines. :laugh:
Gary
Like I said, bad code practices, good performance...... But I figured it would get to at least someone on here.... :laugh: :-D :laugh: :-D :laugh: :-D :laugh: :-D
I'd blame it on the Brain farts.. But let's be honest, it really is more like a Methane factory between my ears some days than it is anything else...
-----
"The conversations he was having with himself were becoming ominous." - ... On the radio...
-
Quote: "BASIC is not fast" Thats what you may have been told, but I assure you Powerbasic is not only fast, but it can stand its own with most any other compiler, even C. The misnomer that Basic is slow and a poor language dates back to the time when many basics were interpreted. A compiled Basic can be just as fast and as any other language and there is nothing inherent to the Basic language which would make it hard to compile to a fast executable. PowerBasic (the company) are experts at optimizing machine code generated by the compiler. For example let's just take a look at a simple bit of code:
REGISTER x AS LONG, y AS LONG, z AS LONG LOCAL t AS DOUBLE LOCAL i AS LONG x = 5 y = 3 t = TIMER FOR i = 1 TO 10000000 IF x > 1000000 THEN x=5 x = x \* y NEXT t = TIMER - t MSGBOX FORMAT$(t,".0000")
This simple FOR NEXT loop executes 10 million times, doing a simple multiplication calculation. Now the PowerBasic compiler does not cheat by generating short cuts in code it may assume does nothing, like other compilers do (others do that just for benchmarking to make their compiler look good). This FOR NEXT loop is executing every iteration. Now how fast does 10 million iterations take on a common Desktop running Windows XP with a mass produced 2.5 ghz Celeron CPU (no cores or multiple CPU's) ? .094 seconds ! Yes, the PowerBasic compiler is generate code for the above code which executes 10 million iterations in less than 1/10th a second. Now I don't think that is slow. Sure, some compiler out there may beat that, but PowerBasic can hold its own quite well. Now add to this the richness of the language so you can write better code (to make it faster) and the ability to add inline assembler when you want to and you have a language which can create amazingly fast executables. I should know. I have been using it commercially writing tools for programmers for about 10 years.
FYI - the same code in C# returns 0 ms.
int x = 1;
int y = 1;
DateTime t = DateTime.Now;
for (int i = 0; i < 10000000; i++)
{
    x = x * y;
}
double d = DateTime.Now.Subtract(t).TotalMilliseconds;
MessageBox.Show(d.ToString());
I have to change the loop count to 100 million to get a time of 31 milliseconds back. (I was actually expecting the C# compiler to omit the entire loop; I'm surprised that it didn't.) YS
-
Walt Fair, Jr. wrote:
So, aside from interpreted languages, is there a real case to be made for FORTRAN being faster than other compiled languages on number crunching?
No. What matters is the compiled machine code, not the source code used to generate it, and C/C++ are easily every bit as good at controlling that as Fortran or Pascal. If you want real performance, you code the innermost loop so it fits entirely in the CPU's instruction cache and minimizes data fetching and saving. By definition you can't achieve that with an interpreted language, which is why interpreted languages are not good choices for ultimate speed. You won't get extra points for implementing the assignments in the professor's favorite language, but you will lose points for not getting the assignment done. Use whatever natively-compiling language you're happiest with.
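A small C# sketch of the "minimize data fetching" point (my own illustration, not patbob's code): the two functions below do identical arithmetic, but one walks memory sequentially while the other strides across it, and the cache usually makes that the dominant difference:

using System;

class CacheDemo
{
    const int N = 4000;

    static double SumRowMajor(double[,] a)
    {
        // C# rectangular arrays are stored row by row, so the inner index j
        // touches adjacent memory locations: cache-friendly.
        double sum = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                sum += a[i, j];
        return sum;
    }

    static double SumColumnMajor(double[,] a)
    {
        // Same arithmetic, but the inner index i jumps N doubles per step,
        // so accesses are scattered and miss the cache far more often.
        double sum = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                sum += a[i, j];
        return sum;
    }

    static void Main()
    {
        var a = new double[N, N];
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                a[i, j] = 1.0;

        Console.WriteLine(SumRowMajor(a));     // 16000000
        Console.WriteLine(SumColumnMajor(a));  // same value, usually much slower
    }
}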
patbob
-
I posted the example code to demonstrate that the PowerBasic compiler is fast, not to use it as a benchmark against another compiler. The problem with such comparisons is that some compilers use shortcuts to speed up such simple loops, to make the compiler look faster than it really is. PowerBasic never does this. Without using a disassembler to see what code the compiler is actually generating, it is difficult to tell whether the comparison is accurate. One also has to be careful that the objects used for such small timings are accurate down to the millisecond level. The only way to truly compare two languages is to make sure the code is not affected by any compiler "tricks" and that the timer is a precision timer. You also have to run such a benchmark in both languages on the exact same computer: there is a big difference, for example, between my 7-year-old 2.5 GHz Intel Celeron (and even its RAM speed) and the much faster CPUs likely found in the average programmer's PC. If you don't take such factors into consideration you can't make any comparisons.
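For anyone who wants to redo the comparison on a single machine, here is a sketch of a slightly more careful C# harness (my own illustration): it uses Stopwatch rather than DateTime for a precision timer, warms up the JIT first, and consumes the loop's result so the compiler cannot legally discard the work. The loop body mirrors the PowerBasic example above:

using System;
using System.Diagnostics;

class Bench
{
    static void Main()
    {
        const int iterations = 10000000;

        // Warm-up run so JIT compilation isn't included in the measurement.
        Run(iterations);

        var sw = Stopwatch.StartNew();          // high-resolution timer
        long result = Run(iterations);
        sw.Stop();

        // Printing the result keeps the compiler from discarding the loop.
        Console.WriteLine($"result={result}, elapsed={sw.Elapsed.TotalMilliseconds:F4} ms");
    }

    static long Run(int iterations)
    {
        int x = 5, y = 3;
        long acc = 0;
        for (int i = 0; i < iterations; i++)
        {
            if (x > 1000000) x = 5;
            x = x * y;
            acc += x;                           // consume x every iteration
        }
        return acc;
    }
}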
-
I'm going to have to disagree here. Einstein @ Home[^] got a ~2x speedup in their science application* when they replaced their C++ hot loop with assembly. The same person had previously provided a ~4x speedup by reworking the C++ algorithm to work better with the number of CPU pipelines and the cache sizes. * The gravitational wave one, anyway; I think the binary radio pulsar app is still in too much flux for them to be working on an assembler version; they only did it for prior apps once they were certain everything worked right and was stable.
3x12=36 2x12=24 1x12=12 0x12=18
The examples you cite meet my "small number of cases" criteria. I would imagine the algorithms involved have been finely tuned, so the only remaining avenue for improving performance is to better align the algorithm with the hardware to take maximum advantage of it. Those are cases where the benefit of assembly language optimization exceeds the cost.

My argument is that, most of the time, that's not true. Assembly language optimization increases the maintenance cost of that portion of the application by an order of magnitude or more. Development is slow and painful. Faults in the assembly language implementation often crash the application, sometimes the entire system, with no trail of bread crumbs to follow, making debugging difficult. If you are using special processor instructions (often the sole justification for the whole exercise), you are subject to errata in both the implementation and the documentation, and you often need to know hardware details that are proprietary. With CISC architectures, I'm also doubtful a human programmer could adequately map a complex algorithm onto an optimal configuration of pipelines, cache sizes, and so on, while taking into account branch-predictor behavior and the like. I've done that sort of thing in the past with fairly simple processes, and the results weren't awe-inspiring considering the amount of work that went into it.
Software Zen:
delete this;
Fold With Us![^]