Before getting into details, let me explain what I'm trying to do. The application I'm working on is supposed to record a voice message from a caller over the telephone in WAV format (via a VoiceXML client), store it in a database in MP3 format, and later play it back over the phone in WAV format. To perform the conversions I'm using the Java Sound API with Tritonus; Tritonus expects a stream for the encoding conversion, but the database expects a byte array. Everything works EXCEPT the step where I convert the MP3 stored in the database back to WAV. Here is the utility class I've developed to perform the conversions:
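A minimal sketch of what such a utility might look like, assuming Tritonus's MP3 encoder SPI jars are on the runtime classpath. The `AudioConverter` class name, its method names, and the `"MPEG1L3"` encoding name are illustrative assumptions, not the poster's actual code:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

/**
 * Illustrative converter along the lines described above.
 * The MP3 direction only works if a Tritonus encoder SPI is installed.
 */
public class AudioConverter {

    // Encoding name under which Tritonus registers MP3 (an assumption worth verifying).
    static final AudioFormat.Encoding MP3 = new AudioFormat.Encoding("MPEG1L3");

    /** WAV file bytes -> MP3 bytes, ready for a database BLOB column. */
    public static byte[] wavToMp3(byte[] wav) throws Exception {
        // Parse the WAV container into a PCM stream.
        AudioInputStream pcm =
                AudioSystem.getAudioInputStream(new ByteArrayInputStream(wav));
        // Ask Java Sound (backed by the Tritonus SPI) for an MP3 view of it.
        AudioInputStream mp3 = AudioSystem.getAudioInputStream(MP3, pcm);
        return streamToBytes(mp3);
    }

    /** Drains any stream into a byte array, since the database wants byte[]. */
    public static byte[] streamToBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) out.write(buf, 0, n);
        return out.toByteArray();
    }
}
```

The MP3-to-WAV direction is the symmetric call with a PCM target encoding, and it is exactly the direction the rest of this post describes as failing.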
So say I have a byte array called wav containing WAV data from the voice browser, and before I save it to the database, I want to convert it to MP3. I would do this:
This works perfectly. But now, if I want to play that audio clip back, I need to convert the MP3 data provided by the database back to WAV. So, naturally, I would write this:
But executing that code produces the following error:
Maybe I'm just too close to the problem (or too overworked...), but I just can't seem to figure out how to appease the AudioSystem's demand that I provide a stream length. Even after scouring the Javadoc and the Internet in general for two days, I remain unenlightened. So I turn to you, oh great JavaRanchers. Have you any idea why this is happening, and more importantly, what can I do to make it work?
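The complaint about a stream length fits a known behavior of Java Sound: `AudioSystem.write` throws an `IOException` when asked to write WAVE data to an `OutputStream` from a stream whose frame length is `AudioSystem.NOT_SPECIFIED`, which is what a freshly decoded MP3 stream typically reports (it cannot seek back to patch the RIFF header afterwards). Assuming that is the cause here, one workaround is to buffer the decoded PCM first and rewrap it with an explicit frame count. A sketch, with hypothetical names:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import javax.sound.sampled.AudioFileFormat;
import javax.sound.sampled.AudioFormat;
import javax.sound.sampled.AudioInputStream;
import javax.sound.sampled.AudioSystem;

public class WavWriter {
    /**
     * Drains a PCM stream of unknown length into memory, then writes it as a
     * WAV file using a frame count computed from the buffered byte count, so
     * AudioSystem.write has the length it needs for the WAV header.
     */
    public static byte[] toWav(AudioInputStream in) throws IOException {
        AudioFormat fmt = in.getFormat();
        // Buffer all decoded PCM bytes first; only then is the length known.
        ByteArrayOutputStream pcm = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        for (int n; (n = in.read(buf)) != -1; ) pcm.write(buf, 0, n);
        byte[] pcmBytes = pcm.toByteArray();
        // Rewrap with an explicit frame count instead of NOT_SPECIFIED.
        long frames = pcmBytes.length / fmt.getFrameSize();
        AudioInputStream sized = new AudioInputStream(
                new ByteArrayInputStream(pcmBytes), fmt, frames);
        ByteArrayOutputStream wav = new ByteArrayOutputStream();
        AudioSystem.write(sized, AudioFileFormat.Type.WAVE, wav);
        return wav.toByteArray();
    }
}
```

The cost is holding the whole clip in memory, which is usually acceptable for short voice messages; writing to a `File` target instead of an `OutputStream` is the other common way around the length requirement.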
Most humbly yours, with thanks in advance,
Robert J. Walker [ January 06, 2005: Message edited by: Robert J. Walker ]
I am recording a voice message on a mobile device and sending it to the server as a byte array over JSON. I want to convert the byte-array voice message into an appropriate audio format and store it in a database. How do I implement this feature? Need help.
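JSON cannot carry raw bytes, so the usual approach is to Base64-encode the audio on the device and decode it server-side before persisting. A minimal sketch using the standard `java.util.Base64` (the class and method names here are made up for illustration):

```java
import java.util.Base64;

public class VoiceMessageCodec {

    /** Encodes raw audio bytes for embedding in a JSON string field. */
    public static String encodePayload(byte[] audio) {
        return Base64.getEncoder().encodeToString(audio);
    }

    /** Decodes a Base64 audio payload from a JSON field back into raw bytes. */
    public static byte[] decodePayload(String base64Audio) {
        return Base64.getDecoder().decode(base64Audio);
    }
}
```

The decoded bytes can then go into a BLOB column via `PreparedStatement.setBytes`. If the incoming audio is in a phone-specific format and needs transcoding (e.g. to WAV or MP3), that is a separate conversion step applied to the decoded bytes.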
I’ve looked at a lot of different solutions, and in my humble opinion Aspose is the way to go. Here’s the link: http://aspose.com