I'm calling a method, scrTm(), many times in a loop. This method executes a Perl script called "hybrid2.pl".
hybrid2.pl needs two input files, "s1.seq" and "s2.seq", which I write in the same method, scrTm(). The method basically takes two character arrays as inputs (seq1 and seq2) and writes them into the files "s1.seq" and "s2.seq".
For the first steps (up to around 100) the Perl script executes with no problem, but as the number of steps increases it throws an exception:
As far as I can tell, the files are written correctly, so I suspect it is a memory issue. There should be a simple solution to this, but I'm new to Java.
Please help me fix it.
Is there some reason why you don't close the streams 'out1' and 'out2'? These streams are still open when you use the associated files through Runtime.exec().
P.S. When you chain streams as you are doing, you only need to close() the outer stream, since close() is required to propagate through to the chained stream.
P.S.1 You should not need to flush() a stream before closing it, since the contract for close() says it must perform a flush() before actually closing.
P.S.2 You should close streams in a finally block, so that the streams are closed even when an exception is thrown. This is a simplification of the standard forceClose() method I use in most of my work.
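Since your scrTm() isn't shown, here is a minimal sketch of what the advice above looks like in practice. The method name writeSeq and the file path are assumptions; the pattern (close only the outer chained stream, and do it in a finally block) is the point:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class WriteSeq {
    // Hypothetical helper: writes one character array to a file,
    // the way scrTm() presumably writes s1.seq and s2.seq.
    static void writeSeq(char[] seq, String path) throws IOException {
        BufferedWriter out = null;
        try {
            out = new BufferedWriter(new FileWriter(path));
            out.write(seq);          // no explicit flush() needed; close() flushes
        } finally {
            if (out != null) {
                out.close();         // closing the outer stream closes the chain
            }
        }
    }

    public static void main(String[] args) throws IOException {
        writeSeq("ACGT".toCharArray(), "s1.seq");
    }
}
```

With the streams reliably closed, the file descriptors are released before the next loop iteration calls Runtime.exec().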
Thanks for the reminder about the open files, but that alone did not fix the problem. I looked at the web page you referred to; it is a good troubleshooter for Runtime.exec().
Using the StreamGobbler class mentioned there fixed the whole problem. It seems flawless for now.
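For anyone landing here later, the idea behind a StreamGobbler is to drain the child process's stdout and stderr on separate threads so the process never blocks on a full pipe buffer. The exact class from the referenced article isn't reproduced here; this is a minimal sketch of the same technique, using a stand-in echo command instead of the actual "perl hybrid2.pl" invocation:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

// Drains one of the child process's output streams on its own thread.
class StreamGobbler extends Thread {
    private final InputStream is;
    private final String tag;

    StreamGobbler(InputStream is, String tag) {
        this.is = is;
        this.tag = tag;
    }

    @Override
    public void run() {
        try (BufferedReader br = new BufferedReader(new InputStreamReader(is))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(tag + "> " + line); // or discard/log as needed
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

public class RunScript {
    public static void main(String[] args) throws Exception {
        // Stand-in command; in the question this would be the perl invocation.
        Process p = Runtime.getRuntime().exec(new String[] {"echo", "hello"});
        new StreamGobbler(p.getInputStream(), "OUT").start();
        new StreamGobbler(p.getErrorStream(), "ERR").start();
        int exit = p.waitFor();   // safe: both pipes are being consumed
        System.out.println("exit=" + exit);
    }
}
```

Starting both gobblers before waitFor() is what prevents the intermittent hangs/failures as the loop count grows.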