Performance issue - selecting data from a large table

Ranch Hand
Posts: 51

I have two tables in my Oracle database: ONHOLD and VALID.

In my application, all of the data from the ONHOLD table is selected, some conditions are checked against every record, and the records that meet those conditions are saved in the VALID table. The records that do not meet the conditions remain in the ONHOLD table.

My problem is that the ONHOLD table has more than 10 lakh (1 million) records, so processing everything at once takes a lot of memory. How can I make sure that data from the ONHOLD table is selected in small chunks?

I cannot select the data on the basis of ROWCOUNT or ROWNUM, because that results in checking some records more than once. For example, if I select the first 100 records from ONHOLD and 40 of them are validated, those 40 records move to VALID while the 60 invalid ones remain in ONHOLD. If I then select the next 100 records, the 60 records that were already checked in the first step get checked again.

Can anyone suggest a way to solve this problem?
Thanks in advance!
Ranch Hand
Posts: 1228
Don't you have a numeric primary key in the source table? If you do, you can group the rows by that column and validate them in batches.

Fetch the first 1000 records, process them, then fetch the next 1000, and keep going until all the records have been processed.

For example: SELECT * FROM Source WHERE pkyColumn BETWEEN 1 AND 10000
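To make the batching idea concrete, here is a minimal in-memory sketch of the loop. Instead of fixed BETWEEN ranges (which can be sparse if keys have gaps), it remembers the last primary key seen and always asks for rows *after* it, so the invalid rows left behind in ONHOLD are never re-read. The Row record, the validateInChunks name, and the chunk size are illustrative assumptions, not the poster's actual schema; in a real application each chunk fetch would be a JDBC query like SELECT * FROM ONHOLD WHERE pk > :lastPk ORDER BY pk FETCH FIRST 1000 ROWS ONLY.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

public class ChunkedValidation {

    // Stand-in for a row of the ONHOLD table (hypothetical schema).
    record Row(long pk, boolean valid) {}

    // Moves valid rows from onHold to valid, reading in pk-ordered chunks.
    // Tracking lastPk means already-checked invalid rows are never revisited.
    static void validateInChunks(NavigableMap<Long, Row> onHold,
                                 List<Row> valid, int chunkSize) {
        long lastPk = Long.MIN_VALUE;
        while (true) {
            // In-memory equivalent of:
            //   SELECT * FROM ONHOLD WHERE pk > :lastPk
            //   ORDER BY pk FETCH FIRST :chunkSize ROWS ONLY
            List<Row> chunk = new ArrayList<>();
            for (Row r : onHold.tailMap(lastPk, false).values()) {
                chunk.add(r);
                if (chunk.size() == chunkSize) break;
            }
            if (chunk.isEmpty()) break;          // no rows left to examine

            for (Row r : chunk) {
                if (r.valid()) {                 // the validation condition
                    valid.add(r);                // INSERT INTO VALID ...
                    onHold.remove(r.pk());       // DELETE FROM ONHOLD ...
                }
            }
            // Advance the cursor past everything we just examined,
            // whether it moved to VALID or stayed in ONHOLD.
            lastPk = chunk.get(chunk.size() - 1).pk();
        }
    }

    public static void main(String[] args) {
        NavigableMap<Long, Row> onHold = new TreeMap<>();
        for (long i = 1; i <= 10_000; i++) {
            onHold.put(i, new Row(i, i % 3 == 0)); // every 3rd row passes
        }
        List<Row> valid = new ArrayList<>();
        validateInChunks(onHold, valid, 1000);
        System.out.println(valid.size() + " moved, " + onHold.size() + " remain");
    }
}
```

The key design point is that the "cursor" is the last primary key examined, not a row offset, so each chunk is disjoint from every previous one even though invalid rows stay in ONHOLD.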