JavaRanch » Java Forums » Java » Beginning Java
Portability - Query on design

Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
I was going through the Java Language Environment white paper and had a question regarding portability. Please help.

Based on standards, one can write programs in Java assuming a 32-bit machine with 8-bit bytes and IEEE 754 floating-point math in the background. This standardization is one of the reasons that makes Java programs portable. All the primitive data types have defined sizes and, based on those sizes, defined value ranges.

My question - in case I'm working on a 64-bit machine, although the machine can support integral values from -2^63 to 2^63 - 1, I can't utilize all of that since int is only 4 bytes in Java. Is that right? If so, at least in higher versions of Java, why was this not changed? (Likewise, with a similar change, maybe even floating-point types could have had more precision.)
Jeff Verdegan
Bartender

Joined: Jan 03, 2004
Posts: 6109
    

Praveen Kumar M K wrote:
My question - in case I'm working on a 64-bit machine, although the machine can support integral values from -2^63 to 2^63 - 1, I can't utilize all of that since int is only 4 bytes in Java. Is that right? If so, at least in higher versions of Java, why was this not changed?


For portability. If you have a Java int, you know it's 4 bytes, and you know its range, and you don't have to worry about the details of the underlying hardware.

If you want 64 bits, you can use long and double.
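As a quick sketch of that guarantee: the sizes and ranges below are pinned down by the JLS, so this little program prints the same values on a 32-bit or 64-bit JVM alike.

```java
// Primitive sizes in Java are fixed by the language spec, not by the hardware.
public class PrimitiveSizes {
    public static void main(String[] args) {
        System.out.println(Integer.SIZE);      // 32 -- always, on any JVM
        System.out.println(Long.SIZE);         // 64 -- always, on any JVM
        System.out.println(Integer.MAX_VALUE); // 2147483647  (2^31 - 1)
        System.out.println(Long.MAX_VALUE);    // 9223372036854775807  (2^63 - 1)
    }
}
```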
Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
But wouldn't it be better to upgrade? I mean, currently, whether it's a 32-bit system or a high-end system, the precision of calculations is still the same; I'm sure there is loss of accuracy.

I'm guessing that backward compatibility would be an issue - if the JLS were to redefine int as 8 bytes in a new major version, then programs/JVMs of the older version probably wouldn't work... but I'm sure this can't be the only reason to hold back.

Anayonkar Shivalkar
Bartender

Joined: Dec 08, 2010
Posts: 1509
    

Praveen Kumar M K wrote:I'm sure there is loss of accuracy

Well, I guess we already have an answer for this:
Jeff Verdegan wrote:you can use long and double


Praveen Kumar M K wrote:but I'm sure this can't be the only reason to hold back

Well, I'm also not sure about other reasons, but considering how much emphasis the JCP puts on backward compatibility, this reason by itself is enough not to implement an 8-byte int. Yes, other languages do have exclusive data types for 64-bit architectures, but the universal class file is something unique to Java. Wouldn't it look bad if there were architecture-specific data types in Java, someone started using them, and then wrote in the documentation something like "please do not run on 32-bit systems"? It would kill Java's platform independence.

Correct me if I'm wrong, but I don't think an architecture-specific data type (e.g. a 64-bit int) would or should be part of Java.


Regards,
Anayonkar Shivalkar (SCJP, SCWCD, OCMJD, OCEEJBD)
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39791
    
Praveen Kumar M K wrote: . . . assuming a 32-bit machine with 8-bit bytes and IEEE 754 floating-point math, . . .
No, you do not assume 32 bits. You might use 37 bits, or 21 bits, or 44 bits, for all the JVM cares. Portability means you load a 32-bit JRE/JDK or a 64-bit JDK/JRE, and then forget all about how many bits. And when Java™ came out about 16 years ago, there were still people using 16-bit computers who couldn't perceive any difference from a 32-bit machine.
Jeff Verdegan
Bartender

Joined: Jan 03, 2004
Posts: 6109
    

Praveen Kumar M K wrote:But wouldn't it be better to upgrade?


It depends on one's definition of "better." If taking advantage of the latest and greatest is the most important thing, then yes. But if portability is important, then no. Remember, portability and consistency are two of Java's key goals. If different versions have different sizes for int, that breaks. There are still a lot of 32-bit systems out there, and if you write your Java code using 64-bit ints, then it won't work correctly on my 32-bit system.

I mean, currently, whether it's a 32-bit system or a high-end system, the precision of calculations is still the same; I'm sure there is loss of accuracy.


For floating point, you should have been using double all along anyway, not float, so no change there. And for integers, if your code needs more than the range of int, then long has always been there, so no change there either.
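A quick sketch of that point: arithmetic that wraps around in int has always been safe in long, with no change to the language needed.

```java
// int wraps at 2^31 - 1; long has always covered the wider range.
public class IntVsLong {
    public static void main(String[] args) {
        int wrapped = Integer.MAX_VALUE + 1;         // overflows: wraps to -2147483648
        long widened = (long) Integer.MAX_VALUE + 1; // no overflow: 2147483648
        System.out.println(wrapped);
        System.out.println(widened);
    }
}
```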

You seem to be missing the point that 64-bit integer and floating-point values have always been available in Java. The only difference is that now you may have hardware that can deal with them as single words. The beauty of it is, even old Java .class files can take advantage of that without any recompiling or anything.

I'm guessing that backward compatibility would be an issue - if the JLS were to redefine int as 8 bytes in a new major version, then programs/JVMs of the older version probably wouldn't work... but I'm sure this can't be the only reason to hold back.


That may or may not be the only reason, but it's the only one that pops to mind, and it's a bloody good one. Good enough that there doesn't need to be another reason.
James Boswell
Bartender

Joined: Nov 09, 2011
Posts: 1030
    

If the JLS were to redefine int as 8 bytes in a new major version, then programs/JVMs of the older version probably wouldn't work... but I'm sure this can't be the only reason to hold back.


That is a bit of a show-stopper reason, don't you think!
Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
Not completely

Every new version of any language, let alone Java, comes up with newer features and gets rid of the older, not-so-useful ones. You can never have full backward compatibility. Example - the methods of the Date class in java.util are currently deprecated (I'm using JDK/JRE 6) and in future versions will probably be unsupported. So a program written using those methods wouldn't work in future versions, and one would have to rewrite it.
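As a concrete sketch of that deprecation: Date's own accessors have been deprecated since JDK 1.1, with Calendar as the documented replacement.

```java
import java.util.Calendar;
import java.util.Date;

// Date.getYear() is deprecated; Calendar.get(Calendar.YEAR) is the replacement.
public class DateDeprecation {
    @SuppressWarnings("deprecation")
    public static void main(String[] args) {
        Date now = new Date();
        int oldStyleYear = now.getYear() + 1900; // deprecated: getYear() is offset from 1900

        Calendar cal = Calendar.getInstance();
        cal.setTime(now);
        int newStyleYear = cal.get(Calendar.YEAR); // the non-deprecated route

        System.out.println(oldStyleYear == newStyleYear); // true
    }
}
```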

HOLD ON THERE, I need a clarification - in Java parlance, are portability and backward compatibility defined only within the confines of the basic constructs? Constructs that were created back during Java 1.1? I mean, util is just a package; if I had written a program without using this package back in 1.1, it should work in the current version.
Jeff Verdegan
Bartender

Joined: Jan 03, 2004
Posts: 6109
    

Praveen Kumar M K wrote:Not completely

Every new version of any language, let alone Java, comes up with newer features and gets rid of the older, not-so-useful ones.


That's not the same as changing a very fundamental aspect of the language that was put in place to meet one of the language's primary goals, and where the "newer feature" has always been available anyway.

HOLD ON THERE, I need a clarification - in Java parlance, are portability and backward compatibility defined only within the confines of the basic constructs? Constructs that were created back during Java 1.1? I mean, util is just a package; if I had written a program without using this package back in 1.1, it should work in the current version.


This is not a court of law, and it's not always simple or straightforward. Portability and backward compatibility are both central goals of Java, and always have been. But very little is carved in stone. Pretty much anything is a valid target to be examined for possible updating or removal with newer versions, and the big picture is taken into account when the decision is made. However, there's a huge difference between deprecating methods in the Date class in the API and changing the format of a fundamental data type in the language.

There's no real benefit to making int and float 64 bits, and there's a huge drawback to doing so.
Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
About the 32-bit and 64-bit query, my thought process is like this - hardware has advanced over the years. Even a normal, respectable PC configuration today would have been a mighty thing in the past. We might come to replace every 32-bit with a 64-bit, and every 64-bit with a 128-bit, thereby putting more information into each "word". Even then, using the 4 bytes of int (or the 8 bytes of double), we would still be playing around with restricted data. Double.MAX_VALUE is the greatest number that I can define, no matter how big the underlying system is. (Please correct me if there is a workaround.)

I agree, however, that the same language has to cater to an 8-bit microprocessor too, but then, until when?

Jeff : However, there's a huge difference between deprecating methods in the Date class in the API and changing the format of a fundamental data type in the language.


Yes, that was presumptuous of me to think that way.
Jeff Verdegan
Bartender

Joined: Jan 03, 2004
Posts: 6109
    

Praveen Kumar M K wrote:About the 32-bit and 64-bit query, my thought process is like this - hardware has advanced over the years. Even a normal, respectable PC configuration today would have been a mighty thing in the past. We might come to replace every 32-bit with a 64-bit, and every 64-bit with a 128-bit, thereby putting more information into each "word".


There are already 64-bit primitive types in Java, and there always have been. From the beginning, if we needed an integer outside -2^31..2^31-1, we always had long. And if we needed more precision than float (which you pretty much always do), we always had double. So changing int and float will not help anything. It may be that we're "wasting" bytes because the JVM is allowed to use 64 hardware bits for an int, even though the int can only ever use 32 of them, but so what? Making int 64 bits won't fix that. If we were using int, it's because we only needed 32 bits in the first place.

In 10 years, if 128-bit hardware becomes commonplace, I could theoretically see Java adding a "huge" integer type and a "quadruple" floating-point type, but there would still be no benefit to changing the sizes of the existing types.

Double.MAX_VALUE is the greatest number that I can define, no matter how big the underlying system is. (Please correct me if there is a workaround.)


If you need values outside of double's range, use BigDecimal.
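A small sketch of that: BigDecimal is arbitrary-precision, so it can hold values past Double.MAX_VALUE exactly; the value only collapses to Infinity if you convert it back to a double.

```java
import java.math.BigDecimal;

// BigDecimal is not bounded by double's range.
public class BeyondDouble {
    public static void main(String[] args) {
        BigDecimal max = new BigDecimal(Double.MAX_VALUE);
        BigDecimal beyond = max.multiply(BigDecimal.valueOf(2)); // exceeds double's range

        System.out.println(beyond.compareTo(max) > 0); // true: held exactly
        System.out.println(beyond.doubleValue());      // Infinity: double can't hold it
    }
}
```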

I agree, however, that the same language has to cater to an 8-bit microprocessor too, but then, until when?


That really has nothing to do with it. We can use a 64-bit Java long on an 8-bit CPU, because the JVM abstracts away those details.

You seem to be missing this key point: There's no real relation between the size of a Java type and the word size on the underlying hardware, except that if a Java value is larger than the word size, then the JVM has to take extra steps to work with that value as a single unit.

Nobody ever "settled for" a 32-bit int in Java because a 64-bit long wouldn't work on his 32-bit hardware. It's always been, "If you need 64 bits, use a long. If you don't, use an int."

Now, having said all that, what could change is that things like array indices and references and lookups and offsets in the class file and JVM could conceivably grow from 32-bit int size to 64-bit long size. That would be a big change, and I don't see it happening soon (how often do you really need more than 2 billion elements in an array?), but doing so would be somewhat less drastic than changing int's size.
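A sketch of the current restriction Jeff mentions: array indices (and lengths) are ints in the language today, so even a long index must be narrowed explicitly before use.

```java
// Array indices are ints; a long can only be used as an index after a narrowing cast.
public class ArrayIndexDemo {
    public static void main(String[] args) {
        int[] arr = new int[4];
        long bigIndex = 2L;
        // arr[bigIndex] = 42;     // does not compile: array index must be an int
        arr[(int) bigIndex] = 42;  // explicit narrowing cast required
        System.out.println(arr[2]); // 42
    }
}
```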
Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
Jeff Verdegan wrote:
Nobody ever "settled for" a 32-bit int in Java because a 64-bit long wouldn't work on his 32-bit hardware. It's always been, "If you need 64 bits, use a long. If you don't, use an int."


This is the thing that I missed, or let me say, misunderstood! I went through a very old JavaWorld article, and I'll quote it verbatim -
Unfortunately, the features that make Java so portable have a downside. Java assumes a 32-bit machine with 8-bit bytes and IEEE754 floating-point math. Machines that don't fit this model, including 8-bit microcontrollers and Cray supercomputers, can't run Java efficiently. For this reason, we should expect C and C++ to be used on more platforms than the Java language. We also should expect Java programs to port easier than C or C++ between those platforms that do support both.

This made me think: if an efficiency problem is already known, why was it not changed?
Campbell Ritchie
Sheriff

Joined: Oct 13, 2005
Posts: 39791
    
All sorts of things have changed in the 15 years since that article appeared.
Jeff Verdegan
Bartender

Joined: Jan 03, 2004
Posts: 6109
    

Praveen Kumar M K wrote:
Jeff Verdegan wrote:
Nobody ever "settled for" a 32-bit int in Java because a 64-bit long wouldn't work on his 32-bit hardware. It's always been, "If you need 64 bits, use a long. If you don't, use an int."


This is the thing that I missed, or let me say, misunderstood! I went through a very old JavaWorld article, and I'll quote it verbatim -
Unfortunately, the features that make Java so portable have a downside. Java assumes a 32-bit machine with 8-bit bytes and IEEE754 floating-point math. Machines that don't fit this model, including 8-bit microcontrollers and Cray supercomputers, can't run Java efficiently. For this reason, we should expect C and C++ to be used on more platforms than the Java language. We also should expect Java programs to port easier than C or C++ between those platforms that do support both.

This made me think: if an efficiency problem is already known, why was it not changed?


Because they deliberately made the choice to sacrifice some speed in favor of portability.

Note also that this is a very old paper, and it's talking about running 32-bit Java on 8-bit hardware. The inefficiency of using 64-bit longs and doubles on modern 32-bit hardware with a modern JVM is much, much less pronounced.

AND it's still a completely different point: That article says that using wider types on narrower-word hardware leads to inefficiency. It does NOT follow that changing int to 64 bits will now give greater efficiency.

From the beginning, we could always use 8-bit bytes, 16-bit chars and shorts, 32-bit ints and floats, and 64-bit longs and doubles on any hardware, whether 8-bit, 16-bit, 32-bit or 64-bit. It would always work. If we were smart, we would pick the type best suited to our needs, and everything would work, but if we were running on hardware with a narrower word width than that type, then, yes, it would be slower. But that's not just a Java thing. If we pick a type that's wider than the word width of our hardware in any language it will slow things down.

Now, here's where things get really good: If you're using a 64-bit long in a 32-bit JVM on 32-bit hardware, it will be a bit slower. But tomorrow, if you go out and buy a new 64-bit computer, and you install 64-bit Java on it, and you use the exact same class files, without recompiling on that new setup, suddenly, your 64-bit longs--which worked fine before but were a bit slower--are still working fine, and are running faster, due to the 64-bit word width of the hardware and the JVM that takes advantage of it.

Portability. Rock on.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18978
    

Praveen Kumar M K wrote:This is the thing that I missed or let me say misunderstood! I went through a very old javaworld article and I'll quote verbatim -
Unfortunately, the features that make Java so portable have a downside. Java assumes a 32-bit machine with 8-bit bytes and IEEE754 floating-point math. Machines that don't fit this model, including 8-bit microcontrollers and Cray supercomputers, can't run Java efficiently. For this reason, we should expect C and C++ to be used on more platforms than the Java language. We also should expect Java programs to port easier than C or C++ between those platforms that do support both.

This made me think if an efficiency problem is already known, why was it not changed.



Having worked on a few 8-bit machines, and on a couple of Cray supercomputers, I think I will argue that the point in the article is heavily flawed. Why? ....

For 8-bit machines... in general (for the few processors that I have worked on), the bit width applies to everything: the register width, the memory width, memory addressing, etc. This means that the maximum memory on these machines is really small. This is probably the main reason why there are no J2SE JVMs on an 8-bit machine.

For the Cray supercomputers... on the two that I worked on, the bit width was 64 bits. Maybe there are longer-bit-width processors, but certainly none of the Crays that I worked on. I don't know the point the author is trying to make, but the implication seems to be to provide an example of a higher-bit-width machine.

Henry

Books: Java Threads, 3rd Edition, Jini in a Nutshell, and Java Gems (contributor)
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18978
    

Campbell Ritchie wrote:All sorts of things have changed in the 15 years since that article appeared.


15 years ago, I wrote articles for the Java Report. And I have to say that I am so glad that those articles are no longer accessible. It is amazing how quickly things become obsolete, to the point where I would be embarrassed to have them accessible.

Henry
Praveen Kumar M K
Ranch Hand

Joined: Jul 03, 2011
Posts: 256
Campbell Ritchie wrote:All sorts of things have changed in the 15 years since that article appeared.


15 years later, this is the first hit on Google for "Java + portability"

But yeah, I get what you mean. The only systems that I've ever worked on are 32-bit ones, and I've never even seen a download link for the older Java versions, so I can only take these articles at face value!

Thank you, everyone, for your profound explanations. It makes me appreciate the amount of forethought the creators of the language put in. Hats off!

An off-topic question -

I see that C# and .NET's CLR also proclaim portability, but I haven't come across a .NET CLR for non-Windows machines from Microsoft. Any idea why? There is an open-source project called "Mono", but of course there is no official support.
Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18978
    

Praveen Kumar M K wrote:
I see that C# and .NET's CLR also proclaim portability, but I haven't come across a .NET CLR for non-Windows machines from Microsoft. Any idea why? There is an open-source project called "Mono", but of course there is no official support.


Agreed. And without Microsoft support, it is unlikely that large traction (of non-Windows adoption of .NET) will occur. It's a shame; .NET (along with C#) is pretty good.

Henry
Pete Nelson
Ranch Hand

Joined: Aug 30, 2010
Posts: 147

Praveen Kumar M K wrote:An off-topic question -

I see that C# and .NET's CLR also proclaim portability, but I haven't come across a .NET CLR for non-Windows machines from Microsoft. Any idea why? There is an open-source project called "Mono", but of course there is no official support.


Using the IDE from http://monodevelop.com/, I was able to port much of my co-worker's Visual Studio work to run under Linux. I think the problem people often run into is that most .NET apps tend to use proprietary Microsoft libraries, or hard-code paths into their programs. If you aim to write portable code, .NET and Mono can get along just fine.


OCPJP
In preparing for battle I have always found that plans are useless, but planning is indispensable. -- Dwight D. Eisenhower
 