By char, do you mean native char? If so, you will have a challenge: jchar is usually twice as big as a native char, so a native char is actually closer to Java's byte (jbyte).
Of course, the quick and dirty solution that only works for (extended?) ASCII is just casting the jchar to a (native) char, cutting off the extra byte. But like I said, that only works if that extra byte is actually 0; otherwise you will lose data.
If it's a full Unicode jchar, then I wish you luck. You will have to encode the jchar to a (native) char, and that's not an easy job — at least not from native code. You can do it in Java using the java.nio.charset package. So perhaps that is the better option: convert the (Java) char to a (Java) byte using java.nio.charset, then use that byte in native code. It arrives there as a jbyte, and jbyte and (native) char are compatible.
Code for converting a (Java) char to a (Java) byte:
That's Java code that converts a jchar into a jbyte. You should then pass the jbyte to the native code, because converting a jbyte to a char there is quite easy. In pseudo code:
This does not turn the chars into a string, though.