Say I have a huge external text file (1 GB) containing customers' credit scores, one record per line in the format "peter;400". These scores are ONLY the ones that have been updated recently by the credit companies.
I then need to read, parse, and update the corresponding records in a database. How do I get the best performance / follow best practices in pure JDBC?
Here is my thinking
1. Create two threads: one to read and parse the file, the other to update the records in the DB. The pipe between them is an ArrayBlockingQueue.
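For what it's worth, the two-thread idea could be sketched roughly as below. This is only a minimal illustration of the reader/updater handoff through an ArrayBlockingQueue; the file name is a placeholder and the DB update is replaced by a print statement.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class ReaderUpdaterPipe {
    // distinct sentinel object marking end-of-file; compared by identity (==)
    private static final String EOF = new String("EOF");

    public static void main(String[] args) throws InterruptedException {
        // bounded queue: the reader blocks when the updater falls behind
        BlockingQueue<String> pipe = new ArrayBlockingQueue<>(10_000);

        Thread reader = new Thread(() -> {
            // the real program would loop over lines of the 1 GB file here
            String[] fakeFile = {"peter;400", "mary;710"};
            try {
                for (String line : fakeFile) {
                    pipe.put(line);
                }
                pipe.put(EOF); // tell the updater we are done
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        Thread updater = new Thread(() -> {
            try {
                for (String line = pipe.take(); line != EOF; line = pipe.take()) {
                    String[] parts = line.split(";");
                    // the real program would add a JDBC batch update here
                    System.out.println(parts[0] + " -> " + parts[1]);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });

        reader.start();
        updater.start();
        reader.join();
        updater.join();
    }
}
```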
It's hard to say in advance whether there would be any benefit in making this a multi-threaded program. I'd start by writing a small, simple single-threaded program that does a batch update for every N records (you'll have to experiment to find which value of N works best).
If the performance is not good enough, then try more complicated things like multi-threading. I suspect you might find that a much more complicated multi-threaded program doesn't perform a lot better than the simple single-threaded one.
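A minimal single-threaded sketch of that batch approach might look like this. The JDBC URL, credentials, file name, and the `credit_scores(name, score)` table are all assumptions for illustration; tune `BATCH_SIZE` empirically as suggested above.

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchUpdater {
    private static final int BATCH_SIZE = 1000; // the "N" to experiment with

    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:yourdb://host/db", "user", "pass"); // hypothetical URL
             PreparedStatement ps = conn.prepareStatement(
                     "UPDATE credit_scores SET score = ? WHERE name = ?");
             BufferedReader in = new BufferedReader(new FileReader("scores.txt"))) {

            conn.setAutoCommit(false); // commit per batch, not per row
            int count = 0;
            String line;
            while ((line = in.readLine()) != null) {
                String[] parts = line.split(";");
                if (parts.length != 2) continue; // skip malformed lines
                ps.setInt(1, Integer.parseInt(parts[1].trim()));
                ps.setString(2, parts[0].trim());
                ps.addBatch();
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch();
                    conn.commit();
                }
            }
            ps.executeBatch(); // flush the final partial batch
            conn.commit();
        }
    }
}
```

The two things that matter most here are `addBatch()`/`executeBatch()` (one round trip per N rows instead of one per row) and turning auto-commit off so you don't pay for a transaction per statement.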
Edward Chen wrote:The pipe is ArrayBlockingQueue.
What do you mean by "pipe"?
Besides, how do options 2 and 3 bar you from using option 1? These alternatives are not mutually exclusive, are they? Multi-threading, used wisely, can actually give better performance. So while one thread has read from the file, is processing records, and is waiting on a database connection, another thread can barge in and read the file from the next line onwards. Mind you, I am not saying that one thread reads while another updates; in that design you would have to hand the processed records over to the other thread for updating, which means inter-thread communication. That may be messy.
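One way to read that suggestion (each thread both reads and updates, with no handover) is a pool of workers pulling lines from a shared reader. This is only a sketch of the idea: the in-memory `StringReader` stands in for the 1 GB file, and the print statement stands in for each worker's own JDBC update. `BufferedReader.readLine()` is internally synchronized, so each call hands out a distinct line.

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class SelfServiceWorkers {
    public static void main(String[] args) throws Exception {
        // stand-in for the file; a real run would wrap a FileReader
        BufferedReader shared = new BufferedReader(
                new StringReader("peter;400\nmary;710\njohn;655\n"));

        ExecutorService pool = Executors.newFixedThreadPool(4);
        for (int i = 0; i < 4; i++) {
            pool.submit(() -> {
                try {
                    String line;
                    // each readLine() call is atomic on the shared reader
                    while ((line = shared.readLine()) != null) {
                        String[] parts = line.split(";");
                        // real code: each worker would use its OWN Connection
                        System.out.println(Thread.currentThread().getName()
                                + " updates " + parts[0] + " to " + parts[1]);
                    }
                } catch (Exception e) {
                    e.printStackTrace();
                }
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
    }
}
```

Whether this beats the single-threaded batch version depends on where the bottleneck is; if the database is the slow part, more connections may help, but if disk I/O is, it probably won't.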
subject: best performance / practices for read and parse files ?