I am writing a program to analyze files in a directory. The Java program creates a CSV file for loading into a database, where I can compare MD5 sums to find duplicate files.
The problem I am running into is a common one: an IOException due to "Too many open files". Most of the answers I see involve changing OS limits rather than resolving the issue in the Java code, but I'd really like to find a way to mitigate this in the code rather than forcing changes to the system.
I have an ArrayDeque (fileQueue) that gets populated with the canonical paths of the files to process. The exception occurs when I call my processFiles() method:
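For context, here is a minimal, self-contained sketch of the per-file work (simplified from my real code: the class name Md5Scanner, the md5Of helper, and the single-threaded driver are stand-ins; the actual version runs the same logic across several worker threads and writes the rows to a CSV file):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayDeque;
import java.util.Queue;

public class Md5Scanner {
    // Compute the MD5 sum of one file. try-with-resources guarantees the
    // stream is closed even if read() throws.
    static String md5Of(Path file) throws IOException, NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("MD5");
        try (InputStream in = Files.newInputStream(file)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                md.update(buf, 0, n);
            }
        }
        StringBuilder hex = new StringBuilder();
        for (byte b : md.digest()) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the real fileQueue of canonical paths.
        Queue<Path> fileQueue = new ArrayDeque<>();
        Path tmp = Files.createTempFile("md5demo", ".txt");
        Files.write(tmp, "hello".getBytes());
        fileQueue.add(tmp);

        // Drain the queue, emitting one "path,md5" line per file.
        while (!fileQueue.isEmpty()) {
            Path p = fileQueue.poll();
            System.out.println(p + "," + md5Of(p));
        }
        Files.delete(tmp);
    }
}
```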
I'm setting my File object (f) to null in a finally block, and I'm doing the same to my custom FileInfo object (a wrapper that formats the file information). I've reduced THREAD_COUNT from 10 to 2 and still get the IOException. I don't see where files are "staying open" here. In a desperate attempt, I even tried sleeping the thread to see whether that would give other threads time to close files, but still no luck.
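To make the cleanup concrete, this is roughly the pattern I'm using per file (a simplified sketch; the FileInfo stub here is a hypothetical stand-in for my real wrapper class):

```java
import java.io.File;

public class CleanupSketch {
    // Hypothetical stand-in for my FileInfo wrapper class.
    static class FileInfo {
        final String name;
        FileInfo(File f) { this.name = f.getName(); }
    }

    public static void main(String[] args) {
        File f = null;
        FileInfo info = null;
        try {
            // Note: constructing a java.io.File is just a path; it opens no descriptor.
            f = new File("example.txt");
            info = new FileInfo(f);
            // ... the real code reads the file and computes the MD5 here ...
        } finally {
            // What I am currently doing: dropping the references.
            f = null;
            info = null;
        }
        System.out.println("cleanup block ran");
    }
}
```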
Does anyone know how to handle this more gracefully, or see what I am doing wrong?
(Please, don't tell me about ulimit and files-max ... I understand what that would do, but I don't believe I should have to do it!)
Any help on making the Java program more graceful would be greatly appreciated (I KNOW there's a way to make this program behave!)
In preparing for battle I have always found that plans are useless, but planning is indispensable. -- Dwight D. Eisenhower