Thanks very much for your replies. As I said earlier, I need to process a file containing around 450 to 500 million records with 30 to 40 columns each. These columns have to be digested using SHA-512. At the moment we loop through the entire flat file, reading it record by record and column by column. Each column is digested with SHA-512, which returns a 64-byte digest, and those bytes then go through some further processing.
Finally the record is written to the output file, and the process continues until the last record.
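To make the per-column step concrete, here is a minimal sketch of that digest call, assuming Java's standard `MessageDigest` API (the class and method names below are mine, not from our actual code):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ColumnDigest {
    // Digest a single column value with SHA-512; the result is always 64 bytes.
    static byte[] sha512(String column) throws NoSuchAlgorithmException {
        MessageDigest md = MessageDigest.getInstance("SHA-512");
        return md.digest(column.getBytes(StandardCharsets.UTF_8));
    }

    public static void main(String[] args) throws Exception {
        byte[] digest = sha512("some-column-value");
        System.out.println(digest.length); // prints 64
    }
}
```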
Now I am trying to process the file with multiple threads, rather than one record at a time, to improve performance. What is the best way to process this file? Will threading help? Is there a better approach? Please explain; your help will be much appreciated.
It depends. You first need to identify what is causing your performance problems. Is it:
Reading the file
Hashing the columns
Processing the data
Writing the data to the output file.
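You can time each stage in isolation over a representative sample of the data. A rough sketch, assuming Java (the `runStage` method here is a hypothetical stand-in for one of your real stages):

```java
public class StageTimer {
    // Hypothetical stand-in for one pipeline stage, e.g. hashing a batch
    // of records. Swap in your real read/hash/process/write code here.
    static void runStage() {
        for (int i = 0; i < 1_000; i++) {
            String.valueOf(i).hashCode(); // placeholder work
        }
    }

    public static void main(String[] args) {
        long t0 = System.nanoTime();
        runStage();
        long elapsedMs = (System.nanoTime() - t0) / 1_000_000;
        System.out.println("stage took " + elapsedMs + " ms");
    }
}
```

Run each stage this way on the same sample and compare; the slowest stage is the one worth attacking first.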
Until you know where the problem is, there is no point in trying to fix it. You also need benchmarks for each stage so you can tell whether a change has had any effect. It's no good making multiple changes at once, as some of them may have a positive effect and others a negative one.
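If the benchmarks show the SHA-512 step itself is the bottleneck (i.e. the work is CPU-bound rather than I/O-bound), then a fixed thread pool could parallelise it. A sketch under that assumption, using `ExecutorService`; the class name and batch shape are hypothetical:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelHasher {
    // Hash a batch of column values on a fixed pool. MessageDigest instances
    // are not thread-safe, so each task creates its own.
    static List<byte[]> hashAll(List<String> columns, int threads)
            throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<byte[]>> futures = new ArrayList<>();
            for (String col : columns) {
                futures.add(pool.submit(() ->
                        MessageDigest.getInstance("SHA-512")
                                .digest(col.getBytes(StandardCharsets.UTF_8))));
            }
            List<byte[]> out = new ArrayList<>();
            for (Future<byte[]> f : futures) {
                out.add(f.get()); // preserves submission order
            }
            return out;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        List<byte[]> digests = hashAll(List.of("a", "b", "c"), 4);
        System.out.println(digests.size() + " digests of "
                + digests.get(0).length + " bytes each");
    }
}
```

But note that if reading or writing the file is the bottleneck, threads like this will gain you little; that is why the measurements have to come first.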