You enter char values from the keyboard. Pressing the "1" key delivers the char '1' to your input code. In Java a char takes 2 bytes (it is a UTF-16 code unit); in ASCII a character took just one byte. The char '1' can be viewed in several ways: its value is 00110001 in binary, 0x31 in hex, and 49 as an int. Those are three different views of the same bits in memory.
Casting the char to an int makes this visible. Note that you cannot easily type the int value 1 directly from a keyboard: what arrives is the char '1', whose numeric value is 49.
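A minimal sketch of those views in Java (the class name CharValue is just for illustration; subtracting '0' is the usual way to recover the digit's numeric value):

```java
public class CharValue {
    public static void main(String[] args) {
        char c = '1';                       // what the keyboard delivers

        int code = (int) c;                 // the cast reveals the stored value
        System.out.println(code);                           // prints 49
        System.out.println(Integer.toHexString(code));      // prints 31
        System.out.println(Integer.toBinaryString(code));   // prints 110001

        int digit = c - '0';                // '1' (49) minus '0' (48)
        System.out.println(digit);                          // prints 1
    }
}
```

All four lines print the same underlying value, just interpreted differently; the last shows how to turn the char '1' into the int 1.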