Actually, NO values of ch are greater than -10 and less than 10. When you read a '0' from the console, its value as an integer is 48 -- the Unicode code point of the character '0'. Likewise, '1' is 49, '2' is 50, and so on. The odd thing here is that you seem to realize this already, since you subtract '0' from ch before using the numeric value.
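You can see the mapping for yourself with a tiny sketch (hypothetical, just to demonstrate the point):

    public class CharValueDemo {
        public static void main(String[] args) {
            char ch = '7';
            System.out.println((int) ch);  // 55 -- the Unicode code point of '7'
            System.out.println(ch - '0');  // 7  -- the numeric value
        }
    }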
The idiomatic Java way to do this is to use the static method Character.isDigit() to test whether a character is a digit, and Character.digit() to convert a character to a number. Then your program works internationally, not just in ASCII-based locales like the US.
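For instance, a minimal sketch of that approach (the Scanner reading loop is my assumption, since your input handling isn't shown):

    import java.util.Scanner;

    public class DigitDemo {
        public static void main(String[] args) {
            Scanner in = new Scanner(System.in);
            String line = in.nextLine();
            for (char ch : line.toCharArray()) {
                if (Character.isDigit(ch)) {
                    // Converts '0'..'9' (and digits in other scripts) to 0..9
                    int value = Character.digit(ch, 10);
                    System.out.println(ch + " -> " + value);
                }
            }
        }
    }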
is asking whether the character just read is the same as the one on top of the stack. Since the only values you ever push onto the stack are numbers, this will be true if, and only if, you just read a number. You'll push it on, and then immediately pop it off.
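To make that concrete, here's a hypothetical reconstruction of the pattern (your actual code isn't shown, so the names and structure are guesses):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class StackDemo {
        public static void main(String[] args) {
            Deque<Character> stack = new ArrayDeque<>();
            char ch = '5';                       // pretend this was just read
            stack.push(ch);                      // push the digit...
            if (!stack.isEmpty() && stack.peek() == ch) {
                stack.pop();                     // ...then immediately pop it off
            }
            System.out.println(stack.isEmpty()); // true: net effect is nothing
        }
    }

The net effect is that the stack ends up exactly as it started, which is presumably not what you intended.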
I can't really tell what you intend to happen here.