I am having some trouble understanding Unicode as it relates to Java. I wrote a test method to gain some understanding, and it has confused me further.
The method I wrote was meant to produce a hex representation of a UTF-16 encoded string.
Here is an excerpt from the code I wrote:
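(The original excerpt didn't paste in, so here is a minimal sketch of the kind of call I am making. The string literal and the use of the plain "UTF-16" charset name are my reconstruction, not the exact original code.)

```java
import java.nio.charset.StandardCharsets;

public class Utf16Demo {
    public static void main(String[] args) {
        // Encode a single en-dash (U+2013) as UTF-16.
        // Note: the "UTF-16" charset (no byte-order suffix) writes a
        // big-endian byte-order mark (0xFEFF) before the character bytes.
        byte[] bytes = "\u2013".getBytes(StandardCharsets.UTF_16);
        for (byte b : bytes) {
            System.out.printf("%d ", b);
        }
        System.out.println();
        // prints: -2 -1 32 19  (0xFE 0xFF 0x20 0x13 as signed bytes)
    }
}
```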
When I stop this in my Eclipse debugger, I get the following byte array back:
byte[4] = [-2, -1, -32, 19]
Given that it is UTF-16 encoded, I would have expected only two bytes back for 0x2013 (en dash). What am I misunderstanding, or what is wrong with my code?
Any help would be appreciated
Thanks