I have a directory with a large number of files (several tens of thousands). I want to process them in chunks and therefore want to read them chunk by chunk. However, when I use File.list(...) or Apache Commons' FileUtils.iterateFiles(...), the function iterates over all files even after I have already collected the files for my next chunk, which is a waste of CPU time. Once my chunk size has been reached I simply return false from within the FileFilter.accept method. Nevertheless, this does not prevent the remaining files, which I'm no longer interested in for the current chunk, from being iterated over. And if the number of files in the directory is large, this may take some time...
Does anybody know of a library that will stop iterating over the files in a directory when some maximum has been reached? I couldn't find anything and spent quite some time searching the Internet.
Maybe it's worth looking into Spring Batch? There might be a learning curve involved though :P
Welcome to the Ranch both of you.
Doesn’t the File class return the contents of a directory as an array? In which case, you can iterate part of an array with a for loop (not for‑each).
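A minimal sketch of that idea (the class and method names are my own). Note that File.listFiles() still reads the entire directory listing into an array up front, so this only saves the per-file processing, not the directory scan itself:

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class ChunkLister {
    // Return at most chunkSize entries from the directory listing.
    static List<File> firstChunk(File dir, int chunkSize) {
        // listFiles() reads the whole directory into an array once.
        File[] files = dir.listFiles();
        if (files == null) {
            // Not a directory, or an I/O error occurred.
            return new ArrayList<>();
        }
        List<File> chunk = new ArrayList<>();
        // Plain for loop (not for-each) so we can stop at the boundary.
        for (int i = 0; i < files.length && i < chunkSize; i++) {
            chunk.add(files[i]);
        }
        return chunk;
    }
}
```

For the next chunk you would need to remember an offset into the array (or keep the array around), since listFiles() gives you everything in one call anyway.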
If you're using Java 7, a DirectoryStream may be an alternative. It doesn't retrieve all of the files and subfolders into an array up front, so it may be a bit more efficient. Here's an example:
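A sketch of the DirectoryStream approach (names are my own). Because the stream is lazy, breaking out of the loop once the chunk is full avoids visiting the remaining entries, which is exactly what the FileFilter trick couldn't do:

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class DirStreamChunk {
    // Read at most chunkSize entries from the directory, lazily.
    static List<Path> firstChunk(Path dir, int chunkSize) throws IOException {
        List<Path> chunk = new ArrayList<>();
        // try-with-resources closes the stream even if we break early.
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path entry : stream) {
                chunk.add(entry);
                if (chunk.size() >= chunkSize) {
                    break;  // stop iterating; remaining entries are never read
                }
            }
        }
        return chunk;
    }
}
```

One caveat: a DirectoryStream's iterator can only be obtained once per stream, so for true chunk-by-chunk processing you would hold the stream (and its iterator) open across chunks rather than reopening it each time, which would otherwise restart from the beginning.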