Calling in the geeks
Originally Posted by boltonblue, Sep 27 2008, 04:14 PM
ok since a few people have posted...
Mike, if I recall correctly, it was just a straight 1K by 1K butterfly FFT.
same standard PC platform for both.
The Java implementation was 200,000 times slower than C.
I fully expected java to be much slower but not by anywhere near that amount.
The other parts that were not benchmarked were memory use and power dissipation.
Obviously heavy floating point is involved, but I think it really just died in the constructor-malloc-memcopy / destructor-dealloc loops.
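To make that concrete, here's a hypothetical sketch of the kind of loop BB is describing (the class and names are made up for illustration, not taken from his benchmark). With an immutable Complex class, every butterfly mints three short-lived heap objects:

public class BoxedButterfly {
    static final class Complex {
        final double re, im;
        Complex(double re, double im) { this.re = re; this.im = im; }
        Complex add(Complex o) { return new Complex(re + o.re, im + o.im); }
        Complex sub(Complex o) { return new Complex(re - o.re, im - o.im); }
        Complex mul(Complex o) {
            return new Complex(re * o.re - im * o.im, re * o.im + im * o.re);
        }
    }

    // One radix-2 butterfly pair: three fresh heap objects per call,
    // times roughly (N/2) log N pairs for a full FFT.
    static void butterfly(Complex[] a, int i, int j, Complex w) {
        Complex t = a[j].mul(w); // allocation
        a[j] = a[i].sub(t);      // allocation
        a[i] = a[i].add(t);      // allocation
    }

    public static void main(String[] args) {
        Complex[] a = { new Complex(1, 0), new Complex(0, 1) };
        butterfly(a, 0, 1, new Complex(1, 0)); // trivial twiddle factor
        System.out.println(a[0].re + " " + a[0].im + " / " + a[1].re + " " + a[1].im);
        // prints: 1.0 1.0 / 1.0 -1.0
    }
}

A 1K-by-1K transform runs on the order of ten million of those butterflies, so the allocator and garbage collector end up doing far more work than the floating-point unit.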
Originally Posted by boltonblue, Sep 28 2008, 10:48 AM
COBOL.... 
I've finally stopped retching enough to type again...
brings back memories of card stacks and big iron.

COBOL saved a lot of asses during the Y2K non-event.
Originally Posted by tof, Sep 27 2008, 11:26 PM
Java sucks. How can an RTE manage to lack backward compatibility with EVERY new release? And I guess it's a bit slow to boot.
Chaz, I can't believe you describe yourself as old school because you code in C. Now COBOL...or RPGII...or PL1...
Actually I can't believe C has been out there for 20 years. Where does the time go?
I was using C in college in the early '80s. It's closer to 30 years old.
I actually did use COBOL, FORTRAN, even PL/1 and gobs of assemblers in my career, but C has been the one constant. It's old school when you compare it to Java or C++ or C# or whatever you like.
I like C++, by the way, but most people abuse it.
Originally Posted by Legal Bill, Sep 28 2008, 10:08 AM
Maybe that is why people use Java. The longer it takes to accomplish a task, the longer you are employed. I'm sure they will soon come out with a programming language that never completes the task. It will be the new hit of the industry.
The sad thing to me is that a lot of people who learn to write software these days have little understanding of what the processor is doing underneath their work to get the job done. That's where this whole discussion hits home. I've spent my career getting better/faster/cheaper hardware out to the business world, but so much gets wasted by the application developers that it's sad. Microsoft is a big offender in this area.
Originally Posted by boltonblue, Sep 27 2008, 02:14 PM
Obviously heavy floating point is involved, but I think it really just died in the constructor-malloc-memcopy / destructor-dealloc loops.
Still, I wouldn't choose Java for number crunching any more than I'd select an F-150 for its nimble handling.
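To be fair to Java, though, most of that particular wall is avoidable if you keep the data in flat primitive arrays so the hot loop never touches the allocator. A rough sketch of the same butterfly done in place (illustrative only, not tuned, and not from any particular library):

public class FlatFft {
    // In-place iterative radix-2 FFT; n must be a power of two.
    static void fft(double[] re, double[] im) {
        int n = re.length;
        // Bit-reversal permutation.
        for (int i = 1, j = 0; i < n; i++) {
            int bit = n >> 1;
            for (; (j & bit) != 0; bit >>= 1) j ^= bit;
            j ^= bit;
            if (i < j) {
                double tr = re[i]; re[i] = re[j]; re[j] = tr;
                double ti = im[i]; im[i] = im[j]; im[j] = ti;
            }
        }
        // Butterfly stages: all arithmetic on locals and array slots,
        // zero allocations inside the loops.
        for (int len = 2; len <= n; len <<= 1) {
            double ang = -2 * Math.PI / len;
            double wr0 = Math.cos(ang), wi0 = Math.sin(ang);
            for (int i = 0; i < n; i += len) {
                double wr = 1, wi = 0;
                for (int k = 0; k < len / 2; k++) {
                    int a = i + k, b = i + k + len / 2;
                    double tr = re[b] * wr - im[b] * wi;
                    double ti = re[b] * wi + im[b] * wr;
                    re[b] = re[a] - tr; im[b] = im[a] - ti;
                    re[a] += tr;        im[a] += ti;
                    double nwr = wr * wr0 - wi * wi0; // advance twiddle
                    wi = wr * wi0 + wi * wr0;
                    wr = nwr;
                }
            }
        }
    }

    public static void main(String[] args) {
        double[] re = {1, 1, 1, 1, 0, 0, 0, 0}, im = new double[8];
        fft(re, im);
        System.out.printf("bin0 = %.1f%n", re[0]); // 4.0: the sum of the inputs
    }
}

It still won't match well-written C, but the gap typically shrinks dramatically once the inner loop is all locals and array slots.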
Originally Posted by Chazmo, Sep 28 2008, 08:24 AM
Java has the real advantage of running on anyone's processor hardware, although that distinction is rapidly becoming less interesting since the Intel architecture essentially dominates the world these days.
Yeah, fair enough, Traveler. My company uses Java too for a cross-platform management GUI. But, depending on what your product is actually doing (in Java), the underlying mechanism can be incredibly inefficient (e.g., BB's fast-Fourier example). A compiled language is usually a better-performing choice. I know you understand this.
If application developers don't understand this, then the leaps and bounds of progress in processor/memory/disk technology get wasted on inefficiency. I guess that's all I'm saying. It shouldn't take 30 seconds to open a spreadsheet on modern hardware. I don't care how "powerful" the spreadsheet/word-processor/gizmo is.
For most people, computing speeds are fast enough that efficiency is just not especially important any more. That's not true of intensive engineering/physics simulations, where jobs run for days on superclusters. But for most people, processing video is the most resource-intensive thing they do.