
Any optimization solution?

Kiran More
Greenhorn

Joined: Dec 29, 2003
Posts: 10
Hi all.
I have a JTable and a TableModel for it which uses a Vector to store row data. I need to read a really large file (>20 MB, >10,000 rows).
I want to optimize my code to increase performance - any suggestions? Will converting to a native exe dramatically improve my performance? Is there any way to speed up disk I/O (currently I use a BufferedReader)?
Thanks in advance for your valuable time.
Regards,
Kiran
Donny Nadolny
Ranch Hand

Joined: Mar 05, 2003
Posts: 32
Just a quick suggestion for perceived performance (how fast your users think your application is): read just the first bit (e.g. one page worth of data) and display it, then continue loading the rest of the file in the background or as needed.
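For example, something like this rough sketch (untested; it assumes a tab-delimited file, and the class name is made up for illustration):

import java.io.*;
import javax.swing.*;
import javax.swing.table.DefaultTableModel;

public class IncrementalLoader {
    // Stream rows into the model from a background thread so the first
    // screenful shows up right away and the GUI stays responsive.
    public static void load(final File file, final DefaultTableModel model) {
        new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader in = new BufferedReader(new FileReader(file));
                    String line;
                    while ((line = in.readLine()) != null) {
                        final Object[] row = line.split("\t"); // assumed format
                        // Swing models must be updated on the event thread.
                        SwingUtilities.invokeLater(new Runnable() {
                            public void run() { model.addRow(row); }
                        });
                    }
                    in.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }).start();
    }
}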


- Donny Nadolny
The pen is mightier than the sword, and considerably easier to write with.
William Brogden
Author and all-around good cowpoke
Rancher

Joined: Mar 22, 2000
Posts: 12761
One possibility would be to read the entire file into memory as a byte[] and only build Strings from it as necessary - that assumes you only need parts of it at any one time. Reading the whole thing as characters means lots and lots of expensive byte-to-char conversions.
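Something like this (untested sketch; the single-byte encoding and the class name are just assumptions for illustration):

import java.io.*;

public class RawFileBuffer {
    private final byte[] data;

    public RawFileBuffer(File file) throws IOException {
        data = new byte[(int) file.length()];
        DataInputStream in = new DataInputStream(new FileInputStream(file));
        in.readFully(data); // slurp the whole file in one pass
        in.close();
    }

    // Pay the byte-to-char conversion cost only for the slice you display.
    public String slice(int offset, int length) throws IOException {
        return new String(data, offset, length, "ISO-8859-1");
    }
}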
Bill
Jim Yingst
Wanderer
Sheriff

Joined: Jan 30, 2000
Posts: 18671
Here's a mishmash of different ideas and suggestions to consider. They don't all go together, because there are too many things unknown about your requirements. You'll have to decide which of these make sense for you, and which don't.
First - are the records all constant length? Or do they vary (e.g. using a separator such as a newline)? If they're constant length, you can devise a solution that looks up rows only when they're needed: knowing the record length, it's easy to calculate where in the file a given row is located. You can create a custom TableModel implementation that looks up the row from the file as needed - probably with a cache, so that recently accessed rows are still in memory and don't need to be reread.
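A bare-bones sketch of that idea (the record length and the single raw column are assumptions about your file format; the cache is left out for brevity):

import java.io.*;
import javax.swing.table.AbstractTableModel;

public class FixedRecordTableModel extends AbstractTableModel {
    private static final int RECORD_LENGTH = 80; // hypothetical record size
    private final RandomAccessFile file;
    private final int rowCount;

    public FixedRecordTableModel(File f) throws IOException {
        file = new RandomAccessFile(f, "r");
        rowCount = (int) (file.length() / RECORD_LENGTH);
    }

    public int getRowCount()    { return rowCount; }
    public int getColumnCount() { return 1; }

    public Object getValueAt(int row, int col) {
        try {
            byte[] record = new byte[RECORD_LENGTH];
            file.seek((long) row * RECORD_LENGTH); // jump straight to the row
            file.readFully(record);
            return new String(record).trim();
        } catch (IOException e) {
            return "<read error>";
        }
    }
}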
If the rows aren't constant length, then this would be considerably more difficult. You could still do something like this, but you'd need to read the whole file first to create some sort of index to allow you to subsequently find rows quickly. And creating such an index will take some time, much like reading the whole file into memory would.
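Building such an index is a single cheap pass over the raw bytes - something like this sketch, which assumes newline-terminated rows:

import java.io.*;
import java.util.ArrayList;
import java.util.List;

public class LineIndexBuilder {
    // Returns a List of Long offsets; offsets.get(row) is where that row
    // starts in the file. (If the file ends with a newline, the last entry
    // is simply the file length and can be ignored.)
    public static List buildIndex(File f) throws IOException {
        List offsets = new ArrayList();
        offsets.add(new Long(0)); // row 0 starts at offset 0
        InputStream in = new BufferedInputStream(new FileInputStream(f));
        long pos = 0;
        int b;
        while ((b = in.read()) != -1) {
            pos++;
            if (b == '\n') {
                offsets.add(new Long(pos)); // next row starts after the newline
            }
        }
        in.close();
        return offsets;
    }
}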
An important question here will be how much memory you can expect to have available. Can you afford to use 20 MB of RAM to read everything into memory? If you then convert all the data to Strings, that will probably cost another 40 MB, since Java stores each character as two bytes. That's not a big deal on many machines, but on others it's big. If you can't afford to put everything in memory, you'll probably need to do something like create an index.
Another option is to read the one big file, and rewrite the contents into a series of smaller files, with a known number of rows in each.
If you're using a modern JDK (1.4+) then you could probably benefit from the java.nio classes here. There are many options, but the first one I would try is to use a FileChannel to map() the entire file contents to a MappedByteBuffer. After that - well, it depends on the format of the file, and how much time you want to spend optimizing. If it's all text, then maybe use Charset to decode the whole file into a CharBuffer. (Though that could use more time and memory than you want.)
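The mapping itself is only a few lines (sketch; the charset is an assumption about your data):

import java.io.*;
import java.nio.CharBuffer;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.Charset;

public class NioFileReader {
    public static CharBuffer readAll(File f) throws IOException {
        FileChannel channel = new FileInputStream(f).getChannel();
        MappedByteBuffer buf =
            channel.map(FileChannel.MapMode.READ_ONLY, 0, channel.size());
        channel.close(); // the mapping stays valid after the channel closes
        return Charset.forName("ISO-8859-1").decode(buf); // assumed encoding
    }
}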
Donny's point about perceived performance is important. The key is to make the first page(s) of data available as soon as possible. I'd create one thread to start reading the file using one of the techniques mentioned above, and then figure out a way for other threads to check whether the data they need is available yet, and if necessary wait until it is.
E.g. use a custom TableModel combined with the idea of rewriting the main file as a series of smaller files, say 100 rows in each. A request for table data will be sent by the event handling thread. That could first check the cache to see if the row has already been read, then check to see if a file exists with the name (rowNum / 100) + ".dat". If the file exists, read it, store its data in the cache, and then return the data that was originally requested. If the file doesn't exist, wait() until it does. Meanwhile you've got a separate thread that is busy reading the big main file and creating the new small files - every time it completes a file, it calls notify() so waiting threads can check whether the new file is the one they need.
There are quite a few ways to do something like this; I'm just suggesting one possibility. The key is to do most of the file processing in a separate thread, and have a way for other threads to find out when the data they need is available.
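Here's a rough sketch of the producer side of that scheme (untested; the file naming and row format are assumptions):

import java.io.*;

public class ChunkSplitter implements Runnable {
    private static final int ROWS_PER_FILE = 100;
    private final File bigFile;
    private final Object lock; // shared with the threads waiting for chunks

    public ChunkSplitter(File bigFile, Object lock) {
        this.bigFile = bigFile;
        this.lock = lock;
    }

    public void run() {
        try {
            BufferedReader in = new BufferedReader(new FileReader(bigFile));
            PrintWriter out = null;
            String line;
            int row = 0;
            while ((line = in.readLine()) != null) {
                if (row % ROWS_PER_FILE == 0) {
                    if (out != null) { out.close(); notifyWaiters(); }
                    out = new PrintWriter(
                            new FileWriter((row / ROWS_PER_FILE) + ".dat"));
                }
                out.println(line);
                row++;
            }
            if (out != null) { out.close(); notifyWaiters(); }
            in.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    private void notifyWaiters() {
        synchronized (lock) { lock.notifyAll(); } // a chunk became readable
    }

    // A consumer (e.g. the TableModel) would block until its chunk exists:
    //   synchronized (lock) {
    //       while (!new File((row / ROWS_PER_FILE) + ".dat").exists()) {
    //           lock.wait();
    //       }
    //   }
}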


"I'm not back." - Bill Harding, Twister
Ilja Preuss
author
Sheriff

Joined: Jul 11, 2001
Posts: 14112
Very good suggestions above!
Originally posted by Kiran More:
Will converting to native exe dramatically improve my performance?

Very unlikely - Java itself is unlikely to be your performance bottleneck here.


The soul is dyed the color of its thoughts. Think only on those things that are in line with your principles and can bear the light of day. The content of your character is your choice. Day by day, what you do is who you become. Your integrity is your destiny - it is the light that guides your way. - Heraclitus
Vikalp Singh
Ranch Hand

Joined: Dec 29, 2002
Posts: 50
Mr. More,
The advice above is good, but please explain your problem in more detail so that we can help you.
In some scenarios we have also used a JTable and got an OutOfMemoryError or a server crash.
Regards
Vikalp Singh
Yuriy Grechukhin
Ranch Hand

Joined: Jan 16, 2004
Posts: 41
TableModel is a very powerful concept. The view (JTable) only calls getValueAt() when it needs to redraw itself, such as when you scroll. So you might want to consider rewriting that class not to store all the data in Vectors, but rather to read the file as needed, if your requirements allow it.
I also like the idea of reading the contents of the file into a byte[]. I don't know the format of the file, but if you need to display one line of the file per line of the table, it might make sense to read the content into a two-dimensional byte array; then it would be easier to write getValueAt(): for the requested row you just ask for array[row], and then parse the resulting byte array.
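In code, that could look something like this (sketch only; it still uses readLine() to split rows, so the up-front conversion isn't avoided here - a byte-level splitter would fix that):

import java.io.*;
import java.util.ArrayList;
import java.util.List;
import javax.swing.table.AbstractTableModel;

public class ByteRowTableModel extends AbstractTableModel {
    private final List rows = new ArrayList(); // each element is a byte[]

    public ByteRowTableModel(File f) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(f));
        String line;
        while ((line = in.readLine()) != null) {
            rows.add(line.getBytes()); // store cheap raw bytes, not Strings
        }
        in.close();
    }

    public int getRowCount()    { return rows.size(); }
    public int getColumnCount() { return 1; } // parse columns as needed

    public Object getValueAt(int row, int col) {
        return new String((byte[]) rows.get(row)); // decode on demand
    }
}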


The sword of destiny has two blades, one of them is you.
Kiran More
Greenhorn

Joined: Dec 29, 2003
Posts: 10
Thank you guys. This site is great. The kind of suggestions you have given are excellent.
RAM is not a problem for me. I have to read the entire data at once because I do Excel-type autofiltering operations on the table (for people who do not use Excel: basically, filtering based on keys).
I had thought I/O was the main bottleneck for performance, but from the suggestions you have given I now think I should use good threading techniques to optimize the performance.
Thank you once again. There is a lot of great Java talent here at JavaRanch.

 