
Caching

 
Shreya Menon
Ranch Hand
Posts: 285
Hi,
In our application we need to build a cache for the data coming from the database. Right now we are implementing this cache with a StateMachine. For example, we fetch, say, 20 records at a time from the database, but we only want to display 5 results at a time on our JSP, so inside the StateMachine we hold the 20 results and write code to hand out 5 at a time.
Is this a good approach? Is there a better way to do it? Or would it be better to fetch only 5 results at a time from the database?
All suggestions are welcome!
Thanks
Maya
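For reference, the last option (fetching only 5 at a time from the database) might look roughly like the plain-JDBC sketch below; the table and column names are just placeholders, and the LIMIT/OFFSET syntax varies by database (Oracle would use ROWNUM, for example).

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class PagedQuery {

    // Fetches just one page of results from the database instead of caching
    // 20 rows and slicing them in the StateMachine. Table and column names
    // are placeholders; LIMIT/OFFSET is not supported by every database.
    public static List<String> fetchPage(Connection con, int pageIndex, int pageSize)
            throws SQLException {
        String sql = "SELECT name FROM customer ORDER BY name LIMIT ? OFFSET ?";
        PreparedStatement ps = con.prepareStatement(sql);
        try {
            ps.setInt(1, pageSize);
            ps.setInt(2, pageIndex * pageSize);
            ResultSet rs = ps.executeQuery();
            List<String> page = new ArrayList<String>();
            while (rs.next()) {
                page.add(rs.getString("name"));
            }
            return page;
        } finally {
            ps.close();
        }
    }
}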
 
Junilu Lacar
Bartender
Posts: 7465
Maya, have a look at the Value List Handler pattern in the Core J2EE Patterns catalog; it addresses exactly this kind of paging and caching problem.

Shreya Menon
Ranch Hand
Posts: 285
Thanks, Junilu!
Maya
 
Shreya Menon
Ranch Hand
Posts: 285
Junilu,
I was going through the link and found it really interesting.
We were thinking of using only entities and finder methods to solve this. But the article mentions in a couple of places that the Value List Handler works more effectively than EJB finder methods. I searched for an answer to why that is and couldn't find much.
Could you please tell me why a Value List Handler is more effective, or point me to any other links?
Thanks
Maya
 
Junilu Lacar
Bartender
Posts: 7465
Maya,
In the first paragraph: "In particular, finder methods do not provide caching, scrolling, and random access to result sets, and have limited selection capabilities." That rather implies that the Value List Handler does give you those things.
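Not the code from the article, just a bare-bones sketch of the idea (the class and method names here are made up): the results are fetched once, cached inside the handler, and the client can then scroll or jump to any page without another query or finder call.

import java.util.Collections;
import java.util.List;

// Holds the results of one query (obtained once, e.g. through a DAO) and
// lets the client page through them without going back to the database.
public class ValueListHandler<T> {

    private final List<T> results;
    private final int pageSize;
    private int currentPage = 0;

    public ValueListHandler(List<T> resultsFromDao, int pageSize) {
        this.results = resultsFromDao;
        this.pageSize = pageSize;
    }

    public int size() {
        return results.size();
    }

    // Random access: jump straight to any page of the cached results.
    public List<T> getPage(int pageIndex) {
        int from = pageIndex * pageSize;
        if (from < 0 || from >= results.size()) {
            return Collections.emptyList();
        }
        return results.subList(from, Math.min(from + pageSize, results.size()));
    }

    // Scrolling: move forward or backward relative to the current position.
    public List<T> nextPage() {
        if ((currentPage + 1) * pageSize < results.size()) {
            currentPage++;
        }
        return getPage(currentPage);
    }

    public List<T> previousPage() {
        if (currentPage > 0) {
            currentPage--;
        }
        return getPage(currentPage);
    }
}

A finder method, by contrast, just hands back a collection of entity references and leaves all of that to you.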
 
Shreya Menon
Ranch Hand
Posts: 285
Junilu,
Thanks for the reply.
I agree!!
My director's argument is this: suppose we search for something and the query returns 1000 results; those 1000 results will then sit in the cache the whole time, and that will hurt performance.
What's your opinion here?
Thanks
 
Junilu Lacar
Bartender
Posts: 7465
Instinctively I'd probably want to say something like "well, I think we really should use a profiler instead of our gut to decide whether holding 1000 rows really is a big deal." But that could get you in trouble, so I'd go over the Value List Handler pattern again with your boss and show him the part that says "Using a Data Access Object maximizes flexibility." If it turns out that holding 1000 rows at a time in the DAO takes up too many resources, you can swap in a more efficient DAO. The point is: don't pre-optimize.
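To make the "maximizes flexibility" part concrete (the names here are mine, not from the pattern catalog): if the handler and the JSP layer only talk to a small DAO interface, then "hold everything in memory" is just one implementation that you can replace later without touching the callers.

import java.util.List;

// The Value List Handler / JSP layer depends only on this interface, so the
// caching strategy behind it can change without touching the callers.
public interface CustomerSearchDao {

    // Returns one page of matching rows for the given search criteria.
    List<String> findPage(String criteria, int pageIndex, int pageSize);
}

// Two possible implementations behind the same interface:
//   - run the query once and slice the cached result list in memory
//   - if a profiler shows that costs too much, run a paged query per call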
 
Matthew X. Brown
Ranch Hand
Posts: 165
Maya, a big reason why storing all 1000 rows can be a really bad idea is the changes that might occur to that data after the subset has been read into a cached collection object. I've done both: going back to the database for each subgroup, and reading everything into a Vector. The danger with caching all of the results is that you end up with old data in the collection (especially since it can be a rather large collection). In either case you need a version or timestamp strategy to check whether the version on the server is the one the client is holding (a.k.a. the Deja Vu pattern). I just feel that the version-exception handling will happen more often if you're caching all of the data.
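A minimal sketch of that version check (the table, column, and error message are made up): each cached row carries the version it was read with, and an update only succeeds if that version is still current in the database.

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class OptimisticUpdate {

    // Applies an update only if the row still has the version number that was
    // read into the cache; zero rows updated means someone else changed (or
    // deleted) it in the meantime, i.e. the cached copy is stale.
    public static void updateName(Connection con, long customerId, String newName,
                                  int cachedVersion) throws SQLException {
        String sql = "UPDATE customer SET name = ?, version = version + 1 "
                   + "WHERE id = ? AND version = ?";
        PreparedStatement ps = con.prepareStatement(sql);
        try {
            ps.setString(1, newName);
            ps.setLong(2, customerId);
            ps.setInt(3, cachedVersion);
            if (ps.executeUpdate() == 0) {
                throw new SQLException("Stale data: customer " + customerId
                        + " was modified after it was cached");
            }
        } finally {
            ps.close();
        }
    }
}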
 
Sudharsan Govindarajan
Ranch Hand
Posts: 319
This all depends on the values you need. Sometimes you need the data as it was at the particular time the query was made, i.e. changes made after the initial query should not be reflected. If that is the case, then caching is best. Otherwise, going back to the database would be more useful.
 
James Ashton
Greenhorn
Posts: 2
The right solution also depends on other factors. I work heavily with custom Java-based Content Management Systems, and have also worked on a variety of other web platforms.
In my experience, multiple approaches can be appropriate for a single site.
The front end may involve searches returning many rows, and, depending on the popularity of the site, many concurrent searches may be taking place. In that case it definitely isn't suitable to cache all rows on a session-by-session basis. It would probably be best to take an approach similar to Maya's. Alternatively, an application-scope cache could be put in place for the most commonly executed searches (a rough sketch follows below).
However, a radically different approach might be taken on the back end of the same site. Often only a small number of users actually maintain the site, so heavy use of caching to maximise performance for those users (who may be doing their job in response to some external action) can be highly appropriate.
Intelligent caching strategies need to be employed to ensure up-to-date data coupled with maximum performance. Cache refreshes triggered by changes in the data give you both live data and good performance.
Whether your data comes from some kind of feed or is manually administered via a web interface, choosing the right caching strategy is one of the most direct controls you'll have over the performance of your site.
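By an application-scope cache I mean something along these lines (the class and the flush-everything policy are only illustrative; a real cache would be more selective about which entries it drops): one cache shared by every session, keyed by the search, and cleared whenever the underlying data changes.

import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One cache for the whole application (e.g. stored in the ServletContext),
// shared by every session. Results are keyed by the search criteria; any
// change to the underlying data clears the cache so the next request
// re-reads live data from the database.
public class SearchResultCache {

    private final Map<String, List<String>> cache =
            Collections.synchronizedMap(new HashMap<String, List<String>>());

    // Returns the cached results for this search, or null if we have to go
    // to the database.
    public List<String> get(String searchKey) {
        return cache.get(searchKey);
    }

    public void put(String searchKey, List<String> results) {
        cache.put(searchKey, results);
    }

    // Called by whatever writes to the database (or handles the feed), so
    // cached searches never keep serving stale rows.
    public void invalidateAll() {
        cache.clear();
    }
}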
/James
 