The scenario: I have a big XML file containing millions of records. I have parsed it and extracted its elements. Now I want to insert the data into the database. Which of the following is the best approach, and why?
1) insert the data one row at a time 2) insert the data in batches 3) insert all the data at once
What do you mean by the third option? If the number of records is huge, you can use batch updates; they perform better than inserting rows one by one. [ March 07, 2008: Message edited by: Nitesh Kant ]
Many databases actually come with a tool to import data from XML. You might want to take a look at that possibility.
<<That's fine, but why does inserting records in batches improve performance? Is there a particular reason?>> There is probably no single answer for all backends and JDBC drivers. I have tested this with Sybase as the backend. In general, batches mean less work and less network I/O.
For example, if you commit (even implicitly) after each row, the server must do its commit overhead and communicate the result back to the client once per row. If that work is done once per batch instead of once per row, it is less work for the server, the client, and the network. The performance difference can be significant.
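A minimal sketch of the batch approach using standard JDBC, assuming a hypothetical `records` table with `name` and `value` columns (the table, column names, and batch size are illustrative, not from the thread). Auto-commit is turned off so the commit happens once per batch rather than once per row:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class BatchInsertSketch {

    // Hypothetical table and columns, for illustration only.
    private static final String INSERT_SQL =
        "INSERT INTO records (name, value) VALUES (?, ?)";

    // Insert rows in batches of batchSize, committing once per batch
    // instead of once per row.
    static void insertInBatches(Connection conn, List<String[]> rows,
                                int batchSize) throws SQLException {
        conn.setAutoCommit(false);  // avoid an implicit commit per row
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            int count = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++count % batchSize == 0) {
                    ps.executeBatch();  // one round trip for the whole batch
                    conn.commit();      // one commit per batch
                }
            }
            ps.executeBatch();          // flush the final partial batch
            conn.commit();
        }
    }

    // Pure helper: how many batches (and commits) a run will need.
    static int batchCount(int totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }
}
```

With a batch size of, say, 500, a million rows needs 2,000 commits instead of 1,000,000, which is where the savings in server overhead and network round trips come from. Tune the batch size by measuring against your own backend and driver.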
I would suggest you do a test in your environment and measure to see if it makes a difference for you. Post the results too.