Sorting by using Comparator

 
Angela
Ranch Hand
Posts: 469
Hi,
I am doing the following to sort an address book:
(1) Whenever a user adds a new contact, we save the first name, last name, company name & email address in a file (each contact gets its own separate file on the server's local disk).
(2) For example, if the user wants to sort by company name, in my Java class I do the following:
(i) I get the list of file names in a Vector. Then, in a for loop, I read each file, get the first name, last name, company name & email, and put them in a Hashtable.
(ii) Then, from the Hashtable, I get the first name, last name, company name & email address and put them in a SortEngine object (my Java class which uses a Comparator to sort things).
(iii) Then I put the SortEngine object in a List.
(iv) Then I use Collections.sort(List, Comparator) - see the sketch below.
(v) Finally, I get the first name, last name, company name & email from the List, put them in a Vector, and finally put that into an HTML table.
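The sort step (iv) boils down to roughly the following (a simplified sketch - this SortEngine is just a stand-in for my real class, not the actual code):

    import java.util.*;

    // Simplified stand-in for the real SortEngine class: it just holds
    // the four fields read from each contact file.
    class SortEngine {
        String firstName, lastName, companyName, email;

        SortEngine(String first, String last, String company, String mail) {
            firstName = first; lastName = last; companyName = company; email = mail;
        }
    }

    public class SortByCompany {
        public static void main(String[] args) {
            List contacts = new ArrayList();
            contacts.add(new SortEngine("John", "Smith", "Acme", "john@acme.com"));
            contacts.add(new SortEngine("Jane", "Doe", "Beta", "jane@beta.com"));

            // Step (iv): sort the list by company name using a Comparator.
            Collections.sort(contacts, new Comparator() {
                public int compare(Object a, Object b) {
                    return ((SortEngine) a).companyName
                            .compareTo(((SortEngine) b).companyName);
                }
            });
        }
    }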
Now the problem is that even with only 100 addresses it takes 4+ minutes to sort. It is very slow. How can I make this faster???
Thanks,
Angela
 
Jim Yingst
Wanderer
Posts: 18671
Hard to say. There's no good reason why it should take that long, so probably there's something inefficient in your code. Which part of the process is taking most of the time? You can insert some timekeeping statements like:
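For example, something along these lines (just a sketch - the step label is a placeholder you would change each time):

    // Print a timestamp so the output shows how long each step took.
    System.out.println("after step (i): " + new java.util.Date());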

Insert these after every one of the steps i - v listed above. (And maybe more places inside your code.) Then by looking at the output you can get a better idea of where the problem is. You can also use a profiler like JProbe or OptimizeIt - though most of these cost money.
If indeed it's the sort() method that's taking most of the time, then probably your Comparator code is the problem. You can try posting it here so we can look at it.
By the way, what's the purpose of the Hashtable in your code? Can't you just build a SortEngine object immediately after reading the first name, last name, company name & email? Why put this info in a Hashtable and then immediately retrieve it? This sounds wasteful and confusing - but perhaps I'm misunderstanding something.
 
Ilja Preuss
author
Posts: 14112

Originally posted by Jim Yingst:
You can also use a profiler like JProbe or OptimizeIt - though most of these cost money.


The JDK also ships with a simple profiler: use the -Xrunhprof:cpu=samples,file=log.txt switch on the java command. You will find the profiling information in the log.txt file.
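For example (YourMainClass is just a placeholder for whatever class you actually launch):

    java -Xrunhprof:cpu=samples,file=log.txt YourMainClass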
 
raimondas zemaitis
Ranch Hand
Posts: 104
If I understood correctly, you use one file per contact, so for 100 addresses you would need 100 files and thus would have to open 100 streams to read everything (it also depends on what kind of InputStream you use). This is unbelievably inefficient. I would STRONGLY recommend changing this approach: try to use only one file, or a DB. With the current approach you will not be able to optimize your code - don't even try.
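For instance, all the contacts could live in a single file, one line per contact (just a sketch - the file name and field separator are only assumptions):

    import java.io.*;
    import java.util.*;

    public class ReadAllContacts {
        public static void main(String[] args) throws IOException {
            // One file, one contact per line: firstName;lastName;companyName;email
            List contacts = new ArrayList();
            BufferedReader in = new BufferedReader(new FileReader("contacts.txt"));
            String line;
            while ((line = in.readLine()) != null) {
                StringTokenizer fields = new StringTokenizer(line, ";");
                contacts.add(new String[] {
                    fields.nextToken(),   // first name
                    fields.nextToken(),   // last name
                    fields.nextToken(),   // company name
                    fields.nextToken()    // email
                });
            }
            in.close();
            // Every record is now in memory after opening just one stream.
        }
    }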
 
Ilja Preuss
author
Posts: 14112

Originally posted by raimondas zemaitis:
If I understood correctly, you use one file per contact, so for 100 addresses you would need 100 files and thus would have to open 100 streams to read everything (it also depends on what kind of InputStream you use). This is unbelievably inefficient. I would STRONGLY recommend changing this approach: try to use only one file, or a DB. With the current approach you will not be able to optimize your code - don't even try.


Are you sure? I don't think that opening 100 files should take anything near four minutes, though I might be wrong. Anyway, notice that DBs come with a performance (and complexity) overhead, too. For small-scale applications, direct file access might possibly be more efficient.
Use the profiler, Luke!
 
Ranch Hand
Posts: 125
If you are using a Hashtable for each record, that might be part of the problem. I don't know exactly how big a Hashtable's default capacity is, but it allocates more slots than the four entries you are using, and creating one per record adds overhead. You can specify the size of the Hashtable when you create it. Alternatively, I might suggest creating a new object type called Record that implements the Comparable interface and has a field for each of name, address, etc.
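A rough sketch of that kind of Record class (the exact fields are just a guess at what is needed):

    // A simple value object for one contact. Implementing Comparable gives it
    // a natural ordering, so a List of Records can be sorted with Collections.sort(list).
    public class Record implements Comparable {
        String firstName, lastName, companyName, email;

        public Record(String firstName, String lastName, String companyName, String email) {
            this.firstName = firstName;
            this.lastName = lastName;
            this.companyName = companyName;
            this.email = email;
        }

        // Natural ordering by company name; separate Comparators could still be
        // written for the other sort orders.
        public int compareTo(Object other) {
            return companyName.compareTo(((Record) other).companyName);
        }
    }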
 
Ranch Hand
Posts: 183
For your particular problem, where you have a separate file for each user, a more efficient method would be to sort during insertion by using several TreeMaps - one per sort key. Something like:
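(A rough sketch - Contact here is just a placeholder for whatever object ends up holding the four fields.)

    import java.util.*;

    // Placeholder for whatever object holds the four fields of one contact.
    class Contact {
        String firstName, lastName, companyName, email;

        Contact(String first, String last, String company, String mail) {
            firstName = first; lastName = last; companyName = company; email = mail;
        }
    }

    // One TreeMap per sort key: each map keeps its entries sorted as they are
    // added, so no Collections.sort() call is needed when displaying the book.
    public class AddressBook {
        private Map byCompany  = new TreeMap();
        private Map byLastName = new TreeMap();

        public void add(Contact c) {
            // Append the email to make each key unique, so two contacts
            // at the same company don't overwrite each other.
            byCompany.put(c.companyName + "|" + c.email, c);
            byLastName.put(c.lastName + "|" + c.email, c);
        }

        public Collection sortedByCompany()  { return byCompany.values(); }
        public Collection sortedByLastName() { return byLastName.values(); }
    }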

Or something like that :roll: Note: none of this code is tested, and it is bound to be buggy.
Hope it helps...
 