I have a directory with a large number of files (tens of thousands). I want to process them chunk by chunk, and therefore want to read the directory listing in chunks as well. However, when I use File.list(...) or Apache Commons' FileUtils.iterateFiles(...), the function iterates over all files even after I have already collected the files for my next chunk, which is a waste of CPU time. When my chunk size is reached I simply return false straight away from the FileFilter.accept method. Nevertheless, this does not prevent all the remaining files, which I'm no longer interested in for the current chunk, from being iterated over. And if the number of files in the directory is large, this can take some time...
Does anybody know of a library that will stop iterating over the files in a directory once some maximum has been reached? I couldn't find anything, despite spending quite some time searching the Internet.
If you're using Java 7, a DirectoryStream may be an alternative. It (probably) doesn't retrieve all of the files and subfolders into an array up front, so it may be a bit more efficient. Just use the following example:
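Here is a minimal sketch of the idea: iterate the DirectoryStream and simply break out of the loop once the chunk is full, so no further directory entries need to be consumed. The class and method names (`ChunkedDirectoryReader`, `readChunk`) are my own invention for illustration; the `main` method just builds a throwaway temp directory to demonstrate the call.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class ChunkedDirectoryReader {

    // Reads at most chunkSize entries from dir, then stops iterating.
    // Breaking out of the loop ends the iteration early; the try-with-resources
    // block closes the underlying stream so no further entries are fetched.
    static List<Path> readChunk(Path dir, int chunkSize) throws IOException {
        List<Path> chunk = new ArrayList<>();
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path entry : stream) {
                chunk.add(entry);
                if (chunk.size() >= chunkSize) {
                    break; // chunk is full: stop early instead of walking the rest
                }
            }
        }
        return chunk;
    }

    public static void main(String[] args) throws IOException {
        // Demo with a hypothetical temp directory containing 10 files
        Path dir = Files.createTempDirectory("chunk-demo");
        for (int i = 0; i < 10; i++) {
            Files.createFile(dir.resolve("file-" + i + ".txt"));
        }
        List<Path> chunk = readChunk(dir, 3);
        System.out.println(chunk.size()); // prints 3
    }
}
```

Note that DirectoryStream makes no ordering guarantee, so which files end up in a given chunk is platform-dependent; to resume where the previous chunk left off you would have to track the names already processed (e.g. skip entries seen before, or move/delete processed files).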