Hi all, can you please help me with the following problem?

Scenario: I have to read several thousand log (txt) files and retrieve information from each of them. I read each file in a loop and save its complete contents as a String, then apply a StringTokenizer to separate the strings and process them accordingly. The retrieved information goes into various Vectors, which reside in a Hashtable. The Hashtable is then sent to the client, which digs out the required information. Currently I am using RandomAccessFile both to write the logs and to read them back as described above.

Problem: As the number of files read in the loop increases, the speed suffers and the system performs very slowly. Can anyone please suggest a more optimised and faster-working option for this? All suggestions are most welcome.

Thanks,
Amit
First, study the problem to determine which parts of your program are slow and which are not. Is the problem just with the file reading and parsing? Or is it also later, when the Hashtable is sent to the client? Using a profiler here is highly recommended to find out which lines of code are taking the most time. It's all too easy to spend time optimizing the wrong part of the program. Having said that: RandomAccessFile is evil. It's slow and poorly designed. Get rid of it. Replace it with a Reader of some kind (or possibly an InputStream, if you understand how character encoding works, but I don't recommend that in general). Check out this post for some ideas.
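For example, the RandomAccessFile read could be replaced with a BufferedReader along these lines (the class and method names here are just illustrative, not from your code):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class LogReaderExample {
    // Reads a whole log file line by line using a BufferedReader,
    // which performs buffered character reads instead of the
    // unbuffered byte-level access you get from RandomAccessFile.
    static StringBuilder readFile(String path) throws IOException {
        StringBuilder contents = new StringBuilder();
        BufferedReader reader = new BufferedReader(new FileReader(path));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                contents.append(line).append('\n');
            }
        } finally {
            reader.close();  // always release the file handle
        }
        return contents;
    }

    public static void main(String[] args) throws IOException {
        if (args.length > 0) {
            System.out.println(readFile(args[0]));
        }
    }
}
```

The buffering matters most when you are hitting thousands of files: each unbuffered read is a separate system call, while the BufferedReader does one large read and serves the rest from memory.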
"I'm not back." - Bill Harding, Twister
Thanks, Jim. I have done that. Just to let you know, the problem was with the reading of thousands of files only, not with sending the Hashtable to the client. I have restructured the whole strategy as below:

1) With an InputStream, read a single file into a StringBuffer.
2) Process it and extract only the data required for the purpose.
3) Store that in an ArrayList.
4) Empty out the StringBuffer.
5) Repeat the above steps for the next file in the queue.

This consumes comparatively less memory and processes faster. Your valuable comments on this approach are eagerly awaited.

Regards,
Amit