
Fetching large number of records

 
Ranch Hand
Posts: 1907
Hi,
I am creating a Swing application which connects to a log database table, calculates statistics over those records, and shows them to the user. The number of records fetched depends on a start time and an end time. The query is simple:
"SELECT X,Y,X,W,P from Table where Entrytimestamp between StartTime and EndTime"
This query sometimes returns 300,000 records or even more. I am setting the fetch size (setFetchSize) to 1000, and I'm running the jar file with -Xms512M -Xmx1024M. Still, for 300,000 records it takes more than 15-20 minutes.
Is there any way I can reduce this time?
 
Ranch Hand
Posts: 277
Do your columns have indexes?
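If Entrytimestamp isn't indexed yet, that range filter is the first thing to check. Something along these lines would cover the BETWEEN predicate (the index name is just a suggestion; table and column names are taken from the query above):

```sql
-- supports the range scan on Entrytimestamp used by the BETWEEN filter
CREATE INDEX idx_table_entrytimestamp ON Table (Entrytimestamp);
```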
 
Sheriff
Posts: 3837
Are you fetching all these records just to calculate the stats? If that is the case, calculate the stats in the database instead. It's the fastest way to go about it.
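For example, assuming one of the statistics is an average of a numeric column, a single aggregate query returns one summary row instead of 300,000 records (column names are the ones from the original query; the exact statistics are guesswork):

```sql
-- one summary row instead of shipping every record to the client
SELECT COUNT(*), AVG(X), MIN(X), MAX(X)
FROM Table
WHERE Entrytimestamp BETWEEN StartTime AND EndTime;
```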
 
Arjun Shastry
Ranch Hand
Posts: 1907
Two columns have indexes. Currently we are doing the stats on the DB side using SQL queries (the AVG function, etc.), but for each set of records we have to run 14 queries. The logs are of different types, so for each type we have to run the query separately. To minimize this, I am planning to do the stats in Java.
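Before moving the work into Java, it may be worth checking whether the per-type queries can be collapsed into one. Assuming there is a column holding the log type (called LogType here purely for illustration), a GROUP BY computes the per-type statistics in a single round trip:

```sql
-- one query instead of one per log type; LogType is a hypothetical column name
SELECT LogType, COUNT(*), AVG(X)
FROM Table
WHERE Entrytimestamp BETWEEN StartTime AND EndTime
GROUP BY LogType;
```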
 
Martin Vashko
Sheriff
Posts: 3837
I think this is a step in the wrong direction, performance-wise. Fetching huge amounts of data just to do some statistics on them puts a big strain on resources (especially the network).

Which database are you using? Some databases have really powerful tools for doing all kinds of aggregations directly in SQL. Even if yours doesn't, you might be better off with a stored procedure.

Fourteen queries to get the statistics doesn't sound that bad to me. Keep in mind that databases usually cache recently accessed data. Anyway, do the source rows of these statistics "overlap" (i.e., are some rows processed by more than one query)? If they don't, it might well be the optimal solution. If they do, perhaps you could compute partial aggregations in the database, fetch these, and combine them in Java to produce the final statistics. It all depends on what exactly you need to compute, of course.
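To illustrate that last point: if each query returns a partial (count, sum) pair per log type, combining them into overall averages in Java is cheap. A minimal sketch, assuming a made-up Partial record for what the database hands back (the type names and numbers are invented for the example):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StatsCombiner {

    /** One partial aggregate fetched from the database: row count and value sum for a log type. */
    record Partial(String type, long count, double sum) {}

    /** Merge partial aggregates into an overall average per log type (sum of sums / sum of counts). */
    static Map<String, Double> averages(List<Partial> parts) {
        return parts.stream().collect(Collectors.groupingBy(
                Partial::type,
                Collectors.teeing(
                        Collectors.summingLong(Partial::count),
                        Collectors.summingDouble(Partial::sum),
                        (n, s) -> s / n)));
    }

    public static void main(String[] args) {
        List<Partial> parts = List.of(
                new Partial("ERROR", 2, 10.0),
                new Partial("ERROR", 3, 20.0),
                new Partial("INFO", 5, 25.0));
        // ERROR: (10 + 20) / (2 + 3) = 6.0, INFO: 25 / 5 = 5.0
        System.out.println(averages(parts));
    }
}
```

Note that the merge uses the raw counts and sums, not the per-query averages; averaging averages would weight every query equally regardless of how many rows it covered.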
 