I've been playing around with exception handling and extended an example to take an int argument from the command line, parse it, and feed it into a method for computation. All works well except for the fact that I get 0.0 as the output. Thanks in advance. Here's my code:
Here's what happens:

1. Java interprets "d" and the return value of doIt() as ints.
2. Therefore, the product of "d" and "doIt()" is an int.
3. Java interprets "1" as an int.
4. Java integer-divides 1 by the multiplication product, because each of the arguments to "/" is an int. (Integer division means you divide and discard any remainder.) So if your arg is 17, integer-dividing 1 by ( 4 * 17 ) yields 0. As another example, "7 / 2" in Java yields 3.
5. Java widens this integer (0) to a float, yielding the 0.0 you see.

The solution here, Brian, is to get Java to do ordinary floating-point division instead of integer division. Explicitly writing "1" as a float literal should do the trick:
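(The original snippet isn't shown in the thread, so this is just a minimal sketch of the idea; the class name DivideDemo and the body of doIt() are made up for illustration.)

```java
// Minimal sketch: parse an int from the command line and avoid
// integer division by writing the literal 1 as a float (1.0f).
public class DivideDemo {

    // Hypothetical computation; here it just returns its int argument.
    static int doIt(int d) {
        return d;
    }

    public static void main(String[] args) {
        int d = Integer.parseInt(args[0]);

        // 1 / (4 * doIt(d)) would be integer division and yield 0.
        // Writing the literal as 1.0f forces floating-point division.
        float result = 1.0f / (4 * doIt(d));

        System.out.println(result);
    }
}
```

The key is that as soon as one operand of "/" is a float, Java promotes the other operand and performs floating-point division, so the remainder is no longer thrown away.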
When I run this code with an argument of 17, 0.014705882 is returned. Hope this helps, Art
Thanks, Art, for that clear explanation. I hope to return the favor someday.