I am trying to upload a large text file. The file is around 100 MB (about 1,500,000 lines). I have to process each line from the file and store it in the DB.
I am reading the file on the server (in a servlet) using the code below.
The code above works fine when I process a file with around 10,000 records, but with a large file I get an out-of-memory error. I want to split the large file into smaller files on the server, store them in a temp directory, and load each one, but when I split the file the last line is sometimes cut in half: part of it ends up at the end of one file and the rest at the start of the next (for example, the line 'How are you' becomes File1 -> 'How' and File2 -> 'Are You').
Also, processing the file takes a long time and the servlet request times out.
Please suggest a solution for this problem, ideally with an example.
Thanks in advance for your help.
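One way to avoid the out-of-memory error is to never hold the whole file in memory at all: read it line by line and hand the rows to the DB in fixed-size batches. A minimal sketch (the class name and the flush() body are placeholders; in a real servlet flush() would do the JDBC addBatch()/executeBatch() work):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class LineBatcher {

    // Reads one line at a time, so at most `batchSize` lines are ever
    // held in memory, regardless of how big the source file is.
    static int processInBatches(BufferedReader reader, int batchSize) throws IOException {
        List<String> batch = new ArrayList<>();
        int flushes = 0;
        String line;
        while ((line = reader.readLine()) != null) {
            batch.add(line);
            if (batch.size() == batchSize) {
                flush(batch);
                flushes++;
            }
        }
        if (!batch.isEmpty()) {   // last partial batch
            flush(batch);
            flushes++;
        }
        return flushes;
    }

    static void flush(List<String> batch) {
        // Placeholder: in the real code this would call
        // PreparedStatement.addBatch(...) per line and then executeBatch().
        batch.clear();
    }

    public static void main(String[] args) throws IOException {
        BufferedReader r = new BufferedReader(new StringReader("a\nb\nc\nd\ne"));
        System.out.println(processInBatches(r, 2)); // 5 lines, batches of 2 -> 3 flushes
    }
}
```

With this approach there may be no need to split the file at all, since memory use no longer depends on file size.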
First, from a design perspective, I think the file-upload servlet should not be doing the DB update; have a separate utility class for that to keep things simple.
If your problem is only the time-out, a small piece of AJAX (or simply a META-REFRESH of the page) should solve it.
Could you please suggest something to split the file on the server? I am able to split files, but the issue is that the split is done by size instead of by number of lines, and because of that the last line also gets split (for example, 'How are you' becomes File1 -> 'How' and File2 -> 'Are You').
I am posting my file-splitting code below; please suggest if anything can be done.
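Splitting by bytes is what cuts lines in half. If instead you read with readLine() and start a new part file every N lines, no line can ever be split, because readLine() only returns complete lines. A sketch along those lines (file names and the method name are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class LineSplitter {

    // Splits by whole lines, not bytes: a new part file is started every
    // `linesPerFile` lines, so a line always lands in exactly one file.
    static List<Path> splitByLines(Path source, Path outDir, int linesPerFile) throws IOException {
        List<Path> parts = new ArrayList<>();
        try (BufferedReader in = Files.newBufferedReader(source)) {
            BufferedWriter out = null;
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {
                if (count % linesPerFile == 0) {        // time for a new part file
                    if (out != null) out.close();
                    Path part = outDir.resolve("part-" + parts.size() + ".txt");
                    parts.add(part);
                    out = Files.newBufferedWriter(part);
                }
                out.write(line);
                out.newLine();
                count++;
            }
            if (out != null) out.close();
        }
        return parts;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("split");
        Path src = dir.resolve("big.txt");
        Files.write(src, Arrays.asList("How are you", "Fine thanks", "And you", "Good", "Bye"));
        List<Path> parts = splitByLines(src, dir, 2);
        System.out.println(parts.size());                     // 3
        System.out.println(Files.readAllLines(parts.get(0))); // [How are you, Fine thanks]
    }
}
```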
Are you required to use a browser on the client side?
If not, a custom Java client would be much more flexible - for example it could send batches of 100 lines and keep track of how many lines have been processed so that you could resume processing if the connection was broken.
Seems to me the chance of a connection problem in the course of sending 100MB with DB processing on the server must be pretty substantial.
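A resumable client along these lines only needs to checkpoint how many lines the server has acknowledged, and advance the checkpoint only after a successful send. A rough sketch, with a hypothetical Transport interface standing in for the real HTTP call:

```java
import java.util.Arrays;
import java.util.List;

public class ResumableUploader {

    // How many lines the server has confirmed so far; an interrupted
    // upload resumes from here instead of starting over.
    private int confirmed = 0;

    // Stand-in for the real HTTP call; returns true when the server
    // acknowledged the batch. (Hypothetical interface for this sketch.)
    interface Transport {
        boolean send(List<String> batch);
    }

    int upload(List<String> lines, int batchSize, Transport transport) {
        while (confirmed < lines.size()) {
            int end = Math.min(confirmed + batchSize, lines.size());
            List<String> batch = lines.subList(confirmed, end);
            if (!transport.send(batch)) break; // connection dropped: stop, resume later
            confirmed = end;                   // checkpoint only after the server acks
        }
        return confirmed;
    }

    public static void main(String[] args) {
        List<String> lines = Arrays.asList("l1", "l2", "l3", "l4", "l5");
        ResumableUploader u = new ResumableUploader();

        // First attempt: the transport fails after one batch of 2.
        int[] calls = {0};
        u.upload(lines, 2, b -> ++calls[0] <= 1);
        System.out.println(u.confirmed); // 2

        // Retry with a healthy connection: resumes at line 3,
        // never re-sends l1/l2.
        u.upload(lines, 2, b -> true);
        System.out.println(u.confirmed); // 5
    }
}
```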
subject: Issue in reading large file on server, Servlet request time out