In my present case I am trying to update around 3,000 records. In my code, after every 1,000 records I execute the batch, but it is taking a huge amount of time: nothing comes back, it just hangs. All it prints is this:
------------------->before addbatch-from if->1000
In the DB there is a primary key constraint on (ITEMCODE + SERIALNUMBER), and there is an index on CHANNELCODE.
Please suggest changes. Is this the right way to update a large amount of data in a table, or is there a better way?
The CHANNELCODE index does not help this query; it only adds cost:
- You don't use CHANNELCODE in the WHERE clause, so the index is not relevant to speeding up the query.
- You are updating CHANNELCODE, so the index itself has to be updated every time you execute the UPDATE.
-- So much for CHANNELCODE.
Next, ITEMCODE and SERIALNUMBER:
I don't know whether these columns have an index.
But even if they do, it will probably not be used, because you wrap them in UPPER().
-- So much for ITEMCODE and SERIALNUMBER.
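If you are on Oracle, one common way around the UPPER() problem is a function-based index, so the index expression matches the expression in the WHERE clause. This is only a sketch: the table name ITEM_TABLE is an assumption, since your actual table name was not posted.

```sql
-- Hypothetical table name; substitute your own.
-- A function-based index on UPPER(...) lets the optimizer use the index
-- even when the query filters on UPPER(ITEMCODE) and UPPER(SERIALNUMBER).
CREATE INDEX item_upper_idx
  ON ITEM_TABLE (UPPER(ITEMCODE), UPPER(SERIALNUMBER));
```

Alternatively, if the stored data is already uppercase, drop the UPPER() calls from the WHERE clause so the existing primary key index can be used directly.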
Is your rollback tablespace big enough to deal with 1,000 changes to your database in a single transaction?
Try your code with a much smaller commit interval first (say, 2 records).
If that works, you have confirmed that your logic is OK, and you can then increase the interval to a workable amount.
If possible, commit after each batch: this will free up your rollback segment.
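Putting the advice above together, the batching-and-committing pattern can be sketched as below. This is a sketch under assumptions: the table and column names (ITEM_TABLE, CHANNELCODE, ITEMCODE, SERIALNUMBER) and the batch size of 500 are illustrative, and `updateInBatches` needs a live JDBC `Connection`, so `main` only demonstrates the flush arithmetic.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class BatchUpdateSketch {

    // Assumed batch size; tune this once the smaller commit interval works.
    static final int BATCH_SIZE = 500;

    // rows[i] = { newChannelCode, itemCode, serialNumber } -- hypothetical shape.
    static void updateInBatches(Connection conn, String[][] rows) throws SQLException {
        String sql = "UPDATE ITEM_TABLE SET CHANNELCODE = ? "
                   + "WHERE ITEMCODE = ? AND SERIALNUMBER = ?";
        conn.setAutoCommit(false); // commit manually, once per batch
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.setString(3, row[2]);
                ps.addBatch();
                if (++pending % BATCH_SIZE == 0) {
                    ps.executeBatch();
                    conn.commit(); // frees the rollback segment after each batch
                }
            }
            ps.executeBatch(); // flush any remainder
            conn.commit();
        }
    }

    public static void main(String[] args) {
        // No database here, so just show how many executeBatch() flushes
        // 3,000 records need at a batch size of 500.
        int records = 3000;
        int flushes = (records + BATCH_SIZE - 1) / BATCH_SIZE;
        System.out.println(flushes); // prints 6
    }
}
```

Committing per batch keeps each transaction small, so the rollback segment never has to hold all 3,000 changes at once.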
Thanks for your valuable response. I will try removing the indexes on the columns. Thanks a lot!