As far as I'm aware, most people who still use Fortran are old farts who work in university physics departments (I've personally run into a few of them, and I've heard there are a lot of them all over the world). Physics is one of the major users of high-performance computing, so it's only natural that this would be added at some point. Nvidia's compiler toolchain has supported Fortran for as long as I've known about it (CUDA Fortran comes via the PGI compilers), so I'd say the only thing resembling a surprise here is that Nvidia would be supporting a competing compiler.
Fortran? Really? The three people who still use that are enough to justify this?
It seems that you don't realize how important this language is in HPC/science (earth science, climate, astronomy, and many others). It was designed quite a long time ago, but like the wheels on your car, it's damn good at what it is used for: computation. This is why Intel, AMD, and Nvidia are spending resources on this language.
Back to the news: this is a good step from Nvidia. The PGI compilers are pretty messy, unlike Intel's.
Fortran? Really? The three people who still use that are enough to justify this?
Fortran is extremely common in HPC. The language regularly receives updates, and for array programming is one of the very best options available. Intel, IBM, and Nvidia all continue to put significant resources into their fast Fortran compilers.
Having worked with some "old farts" at universities, the only real argument FOR Fortran that I have come across is the usual one for any legacy product: they don't have the time or motivation to port their code to C (or something else). The argument that Fortran is inherently better for these kinds of scientific applications doesn't really convince me, as I have seen no real example that demonstrates this.
It is nice to see any kind of open-source effort on this front. But in the long run, I think it would be better if the applications were ported to a language that doesn't discourage new developers from contributing.
I once worked (2012-2014) on a military project that used Fortran heavily. Some of the code dated back to the '80s, but it's still maintained today. We had ~50 people working with the code, and they were of all ages (20s to 60s).
A more concrete example of Fortran still being useful is Python's scipy.integrate.odeint, a wrapper for the Fortran-written library ODEPACK. If you've ever needed to solve an ODE in Python, you've probably used it.
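To make that concrete, here's a minimal sketch of calling that Fortran code from Python: odeint hands the right-hand-side function to ODEPACK's LSODA solver, so the actual integration runs in compiled Fortran. The example equation (dy/dt = -2y, with exact solution e^(-2t)) is just an illustration, not from the thread.

```python
import numpy as np
from scipy.integrate import odeint

# Right-hand side of the ODE dy/dt = -2*y.
def dydt(y, t):
    return -2.0 * y

# Time points at which to report the solution.
t = np.linspace(0.0, 2.0, 21)

# odeint drives the Fortran LSODA solver from ODEPACK under the hood.
y = odeint(dydt, 1.0, t)

# The final value should be close to the exact solution exp(-4).
print(y[-1, 0])
```

The point of the comment above is exactly this: plenty of "Python" scientific code is a thin layer over battle-tested Fortran numerics.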