JavaRanch » Java Forums » Databases » JDBC

Importing large number of rows

Anonymous
Ranch Hand

Joined: Nov 22, 2008
Posts: 18944
Hi there experts...
Currently we're importing a large number of rows into our Sybase db server using a tool called bcp (bulk copy program). This requires the client to have the Sybase Client package installed.
We would, however, like to make a Java tool for this, to fit into our portfolio of applications. Is there anything in the JDBC API that lets us do this in an efficient way (we're talking 500K+ rows with 5-6 columns)?
Best regards,
Søren Berg Glasius
Jamie Robertson
Ranch Hand

Joined: Jul 09, 2001
Posts: 1879

Where are you importing from?
text file (delimited or block?) --> database
database1 --> database2
table1 --> table2 within the same database
Anonymous
Ranch Hand

Joined: Nov 22, 2008
Posts: 18944
It is a text file delimited by tab or semicolon.
Avi Abrami
Ranch Hand

Joined: Oct 11, 2000
Posts: 1135

Hi Soren (sorry, I don't know how to produce the second letter of your name on my keyboard),
I'm probably stating the obvious here, but have you considered something like:
  • Read the input file (line by line)
  • Use "StreamTokenizer" (or "StringTokenizer") to parse the input record
  • Use (JDBC) batch inserts to insert a number of rows at a time
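A rough sketch of the line-by-line parsing step, using StringTokenizer as suggested (the class name, sample data, and delimiters are made up for illustration):

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class LineParser {

    // Split one input record on tab or semicolon.
    static List<String> parse(String line) {
        List<String> cols = new ArrayList<>();
        StringTokenizer tok = new StringTokenizer(line, "\t;");
        while (tok.hasMoreTokens()) {
            cols.add(tok.nextToken());
        }
        return cols;
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for the real input file; a FileReader works the same way.
        BufferedReader in = new BufferedReader(
                new StringReader("1;Alice;42\n2\tBob\t17\n"));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(parse(line));
        }
    }
}
```

Each parsed record would then feed the batch-insert step described below.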

Good Luck,
Avi.
Jamie Robertson
Ranch Hand

Joined: Jul 09, 2001
Posts: 1879

Soren,
to add to the above:
1. Create a PreparedStatement to use as your INSERT statement (it is reusable).
2. Loop through each line in the text file.
3. For each line, use the String.split method with your delimiter (tab or semicolon) to split the input line into columns.
4. Assign each String[] value to a PreparedStatement parameter using the PreparedStatement.setString( 1, cols[0] ) method.
5. Add the statement to the batch ( PreparedStatement.addBatch() ).
6. After all the records are added to the batch, use the PreparedStatement.executeBatch() method to execute all the inserts at one time.
7. Commit the inserts.
You'll find that by batching the inserts, you'll noticeably speed up your bulk insert.
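The steps above might look something like this; the connection URL, credentials, file name, table, and column names are all placeholders, and the periodic flush every 1000 rows is an extra precaution so the batch doesn't grow unbounded on a 500K-row file:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BulkInsert {
    public static void main(String[] args) throws Exception {
        // Placeholder jConnect URL and credentials.
        try (Connection con = DriverManager.getConnection(
                     "jdbc:sybase:Tds:host:5000/mydb", "user", "pass");
             // Step 1: one reusable INSERT statement.
             PreparedStatement ps = con.prepareStatement(
                     "INSERT INTO mytable (col1, col2, col3) VALUES (?, ?, ?)");
             BufferedReader in = new BufferedReader(new FileReader("import.txt"))) {

            con.setAutoCommit(false);          // commit once, at the end
            int pending = 0;
            String line;
            // Step 2: loop through each line of the file.
            while ((line = in.readLine()) != null) {
                // Step 3: split the line into columns (";" or "\t").
                String[] cols = line.split(";");
                // Step 4: bind each column to a ? parameter.
                ps.setString(1, cols[0]);
                ps.setString(2, cols[1]);
                ps.setString(3, cols[2]);
                // Step 5: queue the row in the batch.
                ps.addBatch();
                // Flush periodically so the batch stays a manageable size.
                if (++pending % 1000 == 0) {
                    ps.executeBatch();
                }
            }
            ps.executeBatch();                 // Step 6: execute remaining rows
            con.commit();                      // Step 7: commit the inserts
        }
    }
}
```

This is only a sketch: production code would also want error handling around executeBatch (e.g. inspecting the returned update counts) and a rollback on failure.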
Jamie
     