Hi all,

We are intending to create a new application, a prerequisite of which will be writing a lot of old data to the database from CSV files. At the moment I am testing writing this data using stored procedures via CallableStatement rather than plain JDBC Statements. I have set up a record type and an array type that holds a number of these records on the database.

I am trying to find the optimal size of the array for writing my data to the database as quickly as possible. E.g. if I want to write 100 rows, I put each row into a single record, put 30 records into an array, and execute the callable statement. This means that to insert the 100 rows I would call the stored procedure 4 times.

Does anyone know what determines the optimal number of records to put in the array, or where I might find this out? I have not come across anything in the Oracle documentation. So far my tests suggest that around 50 elements in the array is most efficient. I know that the results will depend on the size of the record, but I am not sure of the underlying drivers.

Any ideas appreciated.
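The chunking described above can be sketched roughly as follows. This is a hedged sketch, not code from the thread: the procedure name `bulk_insert`, the SQL collection type `T_ROW_TAB`, and the use of `createArrayOf` are all assumptions (older Oracle drivers would use `oracle.sql.ARRAY` and `ArrayDescriptor` instead).

```java
import java.sql.Array;
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class ChunkedInsert {

    // Split rows into chunks of at most batchSize; one stored-procedure
    // call is made per chunk, so 100 rows with batchSize 30 means 4 calls.
    static <T> List<List<T>> chunk(List<T> rows, int batchSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += batchSize) {
            chunks.add(rows.subList(i, Math.min(i + batchSize, rows.size())));
        }
        return chunks;
    }

    // Sketch only: binds each chunk as a SQL array and calls the procedure.
    // "T_ROW_TAB" and "bulk_insert" are assumed names, not from the thread.
    static void insertAll(Connection conn, List<Object> rows, int batchSize)
            throws SQLException {
        for (List<Object> batch : chunk(rows, batchSize)) {
            Array sqlArray = conn.createArrayOf("T_ROW_TAB", batch.toArray());
            try (CallableStatement cs =
                    conn.prepareCall("{ call bulk_insert(?) }")) {
                cs.setArray(1, sqlArray);
                cs.execute();
            } finally {
                sqlArray.free();
            }
        }
    }
}
```

The trade-off being measured in the thread is then just the loop count in `insertAll`: a bigger array means fewer round trips but a larger payload per call.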
I second Mark's hint. For a one-time batch transfer of data to Oracle, SQL*Loader will almost certainly be your most efficient option.
I took a look at SQL*Loader, and it seemed like it wasn't that easy to insert complicated data. We have to insert some fields into multiple tables, and we also have a lot of integrity constraints, so some of the inserts are conditional. I thought this might be easier if we wrote some stored procedures.

Another reason is that the only way I could find of executing SQL*Loader from Java is Runtime.exec(). I think this means that your server code would have to be on the same box as the database to be able to run the SQL*Loader executable, which makes the application less portable.
I'm glad you elaborated a bit more about the scenario -- it helps with diagnosis. What about using Java to prepare SQL*Loader input files, which you can then use with the sqlldr executable? I think you could probably still capture the conditional inserts you need. (Alternatively, SQL*Loader itself supports conditional loads, though the expressivity it provides may not be enough.)

To capture the table constraints you need, you could generate a SQL schema file (either manually or with Java), and then use SQL*Loader to populate the tables you create. However, if bulk-insert performance is your goal, and you know that your data already obeys the constraints you need, it will likely be more efficient to bulk load the data first (with no constraints) and add the constraints afterwards. Generally, things like constraints and indexes will slow down the shoveling process.
Hi, thanks Greg. We have already decided that the expressivity we need cannot be provided by the Loader control files. As well as inserting some fields into multiple tables and enforcing the integrity constraints, we also need to use some fields in the CSV file simply to update some existing rows in the table. All this complex logic is much easier to write in Java.

If anyone does have any ideas about the size of the array, I would appreciate it. I have since conducted further tests, and my results have become even more confusing. My best result so far for inserting 400 rows was to put all 400 records in the array and send them to the stored procedure in one go. That means only one call to the database, so maybe that is why the performance improves.

I am also now considering writing my stored procedures in Java rather than PL/SQL. If anyone has any tips on this, I would appreciate them. I am just hopeful that Java stored procedures have the same level of performance as PL/SQL; I have read in one article that they can be 1.5 times slower than PL/SQL for database-centric routines. Hopefully that is not the case.
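Since the measured results above are confusing, a small timing harness can at least make the batch-size comparison repeatable. This is a generic sketch (the `Loader` callback is a hypothetical stand-in for the real stored-procedure round trips, which are not shown here); wall-clock timing is noisy, so each batch size should really be run several times and averaged.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchTimer {

    // The callback where the real CallableStatement calls would go;
    // left abstract here because it needs a live database connection.
    interface Loader {
        void load(int totalRows, int batchSize);
    }

    // Time one run of some work (e.g. loading totalRows rows) in ms.
    static long timeMillis(Runnable work) {
        long start = System.nanoTime();
        work.run();
        return (System.nanoTime() - start) / 1_000_000;
    }

    // Try each candidate batch size and record {batchSize, elapsedMillis}
    // so the results can be compared side by side.
    static List<long[]> compare(int totalRows, int[] batchSizes, Loader loader) {
        List<long[]> results = new ArrayList<>();
        for (int size : batchSizes) {
            long elapsed = timeMillis(() -> loader.load(totalRows, size));
            results.add(new long[] { size, elapsed });
        }
        return results;
    }
}
```

With something like this, batch sizes of e.g. 30, 50, 100, and 400 could be compared on the same data set rather than from ad-hoc runs.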
You know, in 9i you can expose a SQL*Loader file as an external table, then do all your conditional logic and transformation from that table into your other tables with plain old SQL or PL/SQL. But if you don't have 9i, then you can't.

Mark
If you are using the same insert statement over and over in a loop, look into PreparedStatements. Their performance may surprise you when used correctly. For performance comparisons and tips, check out Chapter 19 of "Java Programming with Oracle JDBC" by Donald Bales.

Jamie
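A minimal sketch of the PreparedStatement approach, using the standard `addBatch`/`executeBatch` API. The table and column names (`legacy_data`, `id`, `name`) are assumptions for illustration, not from the thread; the small `flushCount` helper just predicts how many batch round trips a given batch size produces.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class PreparedBatchInsert {

    // Number of executeBatch() round trips needed for totalRows rows:
    // ceil(totalRows / batchSize).
    static int flushCount(int totalRows, int batchSize) {
        return (totalRows + batchSize - 1) / batchSize;
    }

    // One PreparedStatement is parsed once and reused for every row;
    // executeBatch() sends the buffered rows in groups of batchSize.
    static void insert(Connection conn, List<Object[]> rows, int batchSize)
            throws SQLException {
        String sql = "INSERT INTO legacy_data (id, name) VALUES (?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            for (Object[] row : rows) {
                ps.setObject(1, row[0]);
                ps.setObject(2, row[1]);
                ps.addBatch();
                if (++pending == batchSize) {
                    ps.executeBatch();
                    pending = 0;
                }
            }
            if (pending > 0) {
                ps.executeBatch(); // flush the final partial batch
            }
        }
    }
}
```

The same batch-size question applies here as with the stored-procedure arrays: a larger batch means fewer round trips per executeBatch() call, at the cost of a bigger buffer per trip.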
subject: Optimal number of records in an Array passed to a Stored Procedure