Originally posted by Joseph Goodman:
Will the query return a 3000000 recordset? ... I am viewing a file so that I don't have to load the entire contents into RAM.
You can use the same tricks as with your file solution: a database makes it easy to select subsets of the whole data set and page through them just as you are doing with the file.
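For example, something along these lines (untested, and assuming a hypothetical log_entries table; the LIMIT/OFFSET syntax varies by database vendor):

import java.sql.*;

public class LogPager {
    static final int PAGE_SIZE = 100; // assumed page size

    // Fetch and print one page of log entries.
    public static void printPage(Connection conn, int page) throws SQLException {
        String sql = "SELECT entry_time, severity, message FROM log_entries "
                   + "ORDER BY entry_time LIMIT ? OFFSET ?";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setInt(1, PAGE_SIZE);
            ps.setInt(2, page * PAGE_SIZE);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getTimestamp("entry_time") + " ["
                            + rs.getString("severity") + "] "
                            + rs.getString("message"));
                }
            }
        }
    }
}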
More common, however, is to allow the user to search on various fields to pull back just the data they need to see. Very rarely should you let someone page through all of the data in a table. For example, you could offer a pair of date-range fields, a drop-down for severity or module, or whatever other fields you have in your log entries.
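To make that concrete, a filtered query might look like this (the column names are invented for illustration):

import java.sql.*;

public class LogSearch {
    // Pull back only the entries matching the user's filter criteria.
    public static void search(Connection conn, Timestamp from, Timestamp to,
                              String severity) throws SQLException {
        String sql = "SELECT entry_time, severity, module, message "
                   + "FROM log_entries "
                   + "WHERE entry_time BETWEEN ? AND ? AND severity = ? "
                   + "ORDER BY entry_time";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setTimestamp(1, from);
            ps.setTimestamp(2, to);
            ps.setString(3, severity);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    System.out.println(rs.getTimestamp(1) + " ["
                            + rs.getString(2) + "/" + rs.getString(3) + "] "
                            + rs.getString(4));
                }
            }
        }
    }
}

Bind the user's input through PreparedStatement parameters as above rather than concatenating it into the SQL, both for correctness and to avoid SQL injection.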
That being said, you could still make the file version work reasonably well. For one thing, buffer all of the edits and save them in one pass to minimize rewriting the file over and over. Second, by using fixed-size records (and I assume a RandomAccessFile), in-place updates are essentially free; deletes and inserts are the tricky part.
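An in-place update is then just a seek and a write. A rough sketch (the 128-byte record size is an assumption):

import java.io.IOException;
import java.io.RandomAccessFile;

public class RecordUpdater {
    static final int RECORD_SIZE = 128; // assumed fixed record length

    // Overwrite record number 'index' in place; no other bytes move.
    public static void update(RandomAccessFile file, long index, byte[] record)
            throws IOException {
        if (record.length != RECORD_SIZE) {
            throw new IllegalArgumentException("record must be " + RECORD_SIZE + " bytes");
        }
        file.seek(index * RECORD_SIZE); // jump straight to the record
        file.write(record);             // overwrite it in place
    }
}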
If you collect all the deletes and inserts and sort them in the order they will be applied to the file, you can do the rewrite quite efficiently with two file channels/streams. Open both at the first point of modification, then read ahead (buffered) with one and write along behind with the other, applying the changes as you go. The more inserts you have, the larger a buffer you'll need. Once you reach the end, if you had more deletes than inserts, just truncate the file.
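Here's a rough sketch of the delete-only case, which is the easy half (inserts would need the read-ahead buffering described above; the record size is an assumption):

import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.SortedSet;

public class LogCompactor {
    static final int RECORD_SIZE = 128; // assumed fixed record length

    // Remove the records whose indexes appear in 'toDelete' by sliding
    // everything after them down, then truncating the leftover tail.
    public static void deleteRecords(String path, SortedSet<Long> toDelete)
            throws IOException {
        if (toDelete.isEmpty()) {
            return;
        }
        try (RandomAccessFile reader = new RandomAccessFile(path, "r");
             RandomAccessFile writer = new RandomAccessFile(path, "rw")) {
            long totalRecords = reader.length() / RECORD_SIZE;
            long first = toDelete.first(); // first point of modification
            reader.seek(first * RECORD_SIZE);
            writer.seek(first * RECORD_SIZE);
            byte[] record = new byte[RECORD_SIZE];
            for (long i = first; i < totalRecords; i++) {
                reader.readFully(record);   // read ahead
                if (!toDelete.contains(i)) {
                    writer.write(record);   // write along behind
                }
            }
            writer.setLength(writer.getFilePointer()); // truncate the tail
        }
    }
}

Because the writer never gets ahead of the reader when you are only deleting, nothing is lost; with inserts in the mix you would have to buffer the records the writer is about to overwrite.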
Post again if you'd like a further explanation of how that would work. The other option is to perform all updates while copying to a new file; once the copy completes (and succeeds), swap the temp file in for the real one. This is also safer: if the write fails partway, the original file survives.
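A sketch of that copy-and-swap approach (note that the atomic move isn't supported on every file system, in which case Files.move throws AtomicMoveNotSupportedException):

import java.io.*;
import java.nio.file.*;

public class SafeRewrite {
    // Copy the file, applying changes along the way, then atomically swap
    // the finished temp file over the original.
    public static void rewrite(Path original) throws IOException {
        Path temp = original.resolveSibling(original.getFileName() + ".tmp");
        try (InputStream in = Files.newInputStream(original);
             OutputStream out = Files.newOutputStream(temp)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                // placeholder: a straight copy; apply your deletes, inserts,
                // and updates to each record here instead
                out.write(buf, 0, n);
            }
        }
        // only runs if the copy above succeeded, so a crash mid-copy
        // leaves the original untouched
        Files.move(temp, original, StandardCopyOption.REPLACE_EXISTING,
                StandardCopyOption.ATOMIC_MOVE);
    }
}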