Well, I suppose you could use classes from java.nio.charset, like a CharsetEncoder.
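Something along these lines (just a rough sketch, not tested; swap in whatever charset you actually use):

```java
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.CharacterCodingException;
import java.nio.charset.Charset;
import java.nio.charset.CharsetEncoder;

public class NioEncodeSketch {
    public static void main(String[] args) throws CharacterCodingException {
        String input = "some text to encode";

        // One encoder per thread -- CharsetEncoder is not thread-safe.
        CharsetEncoder encoder = Charset.forName("ISO-8859-1").newEncoder();

        // Wrap the chars and let the encoder produce a ByteBuffer.
        ByteBuffer encoded = encoder.encode(CharBuffer.wrap(input));

        // Copy out to a plain byte[] if that's what the rest of your code expects.
        byte[] bytes = new byte[encoded.remaining()];
        encoded.get(bytes);

        System.out.println(bytes.length + " bytes");
    }
}
```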
Offhand I doubt you'll see significantly different performance from this. It might even be worse. The benefits from using NIO classes (if any) are often most noticeable if you use ByteBuffers and CharBuffers rather pervasively in your application. But frankly they're also a pain in the butt to use in many cases, and improved performance is not guaranteed.
Another possibility that may work for you is to simply cast each char to a byte (see the sketch below). This can be fast, but its correctness depends on the charset you're using. In ISO-8859-1 it works for all char values under 256; for many other charsets it only works for char values under 128. If that's not true for your data, you really, really should not use this method. If it is true, then I still don't know whether it's faster, but it may be worth a try.
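Roughly like this (the method name is just for illustration):

```java
// Naive conversion: casts each char straight to a byte. Only safe when every
// char value fits the charset's single-byte range (e.g. under 256 for
// ISO-8859-1, under 128 for ASCII-compatible charsets). Anything larger gets
// silently mangled.
public static byte[] charsToBytes(String s) {
    byte[] bytes = new byte[s.length()];
    for (int i = 0; i < s.length(); i++) {
        bytes[i] = (byte) s.charAt(i);
    }
    return bytes;
}
```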
Probably the most important thing at this point is to use some sort of profiler to determine where your performance problems really lie. Otherwise you can spend an awful lot of time optimizing the wrong part of the program. Googling "java profilers" will turn up many different ideas and recommendations, depending on your budget. JProbe and JProfiler are common recommendations if you can afford them, and there are also plenty of open-source profilers available for free; JAMon is a good free tool that I like. Hope that helps...