I have a file that I'm trying to read with Java. The file is a binary set of fixed-length records, and each record is just a list of doubles. The only catch is that the file was created as output from a Fortran-77 program compiled and run on a UNIX machine, and I'm not sure how to deal with the possible difference in byte ordering. Isn't there just a readDouble() method that I can use, or will I have to do something fancier? I want the program to be platform-independent, and I don't want to alter the data files.
java.io.DataInputStream has a readDouble() method, but it assumes the eight bytes are a big-endian IEEE 754 double, which is the format Java itself writes. Whether that matches your file depends on the machine the Fortran program ran on. If it doesn't match, you'll have to figure out what the encoding is, read the file a byte at a time, and do your own decoding.
You have to find out what byte ordering the UNIX machine used. If it was a big-endian machine like a Sun SPARC or a 680x0, you're all set: Java reads and writes numbers in big-endian order, so you can open the file with a DataInputStream and read the doubles directly.
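A minimal sketch of the big-endian case. The helper name and the record length are my own assumptions; adjust the record length to match however many doubles each of your records holds:

```java
import java.io.DataInputStream;
import java.io.IOException;

public class BigEndianReader {
    // Read one fixed-length record of n doubles. readDouble() always
    // interprets the next 8 bytes as a big-endian IEEE 754 double, which
    // is what a big-endian UNIX machine would have written.
    static double[] readRecord(DataInputStream in, int n) throws IOException {
        double[] record = new double[n];
        for (int i = 0; i < n; i++) {
            record[i] = in.readDouble();
        }
        return record;
    }
}
```

You'd wrap a BufferedInputStream around the FileInputStream and call readRecord() in a loop until the stream is exhausted.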
If it was a little-endian processor (x86), that makes it a little harder. What I'd do is use the java.nio package: read the data into a ByteBuffer, set its byte order to little-endian, and view it as a DoubleBuffer.
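Something like this sketch for the little-endian case (the helper name is mine; in practice you'd fill the byte array from a FileChannel or an InputStream):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.DoubleBuffer;

public class LittleEndianReader {
    // Decode little-endian doubles from raw file bytes. Wrapping the bytes
    // in a ByteBuffer with LITTLE_ENDIAN order and viewing it as a
    // DoubleBuffer makes java.nio do the byte swapping for us.
    static double[] decode(byte[] raw) {
        ByteBuffer buf = ByteBuffer.wrap(raw).order(ByteOrder.LITTLE_ENDIAN);
        DoubleBuffer doubles = buf.asDoubleBuffer();
        double[] out = new double[doubles.remaining()];
        doubles.get(out);
        return out;
    }
}
```

The nice part is that the same code runs unchanged on any platform, since the byte order is a property of the buffer, not of the JVM's host machine.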