
Importing large number of rows

 
Anonymous
Ranch Hand
Posts: 18944
Hi there experts...
Currently we're importing a large number of rows into our Sybase db server using a tool called bcp (bulk copy program). This requires the client to have the Sybase Client package installed.
We would, however, like to make a Java tool for this - to fit into our portfolio of applications. But is there anything in the JDBC API that lets us do this in an efficient way (we're talking 500K+ rows with 5-6 columns)?
Best regards,
Søren Berg Glasius
 
Jamie Robertson
Ranch Hand
Posts: 1879
MySQL Database Suse
Where are you importing from?
text file (delimited or block?) --> database
database1 --> database2
table1 --> table2 within the same database
 
Anonymous
Ranch Hand
Posts: 18944
It is a text file delimited by tab or semicolon.
 
Avi Abrami
Ranch Hand
Posts: 1141
Java Oracle
Hi Soren (sorry, I don't know how to produce the second letter of your name on my keyboard),
I'm probably stating the obvious here, but have you considered something like:
  • Read the input file (line by line)
  • Use "StreamTokenizer" (or "StringTokenizer") to parse the input record
  • Use (JDBC) batch inserts to insert a number of rows at a time
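The tokenizing step above could be sketched like this; the class and method names are mine, for illustration only:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class RecordParser {
    // Split one input record on tab or semicolon, the two delimiters
    // mentioned earlier in this thread.
    public static List<String> parse(String line) {
        List<String> columns = new ArrayList<String>();
        StringTokenizer st = new StringTokenizer(line, "\t;");
        while (st.hasMoreTokens()) {
            columns.add(st.nextToken());
        }
        return columns;
    }
}
```

One caveat: StringTokenizer treats runs of delimiters as one, so empty columns are silently skipped; if your file can contain empty fields, String.split is the safer choice.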

Good Luck,
Avi.
 
Jamie Robertson
Ranch Hand
Posts: 1879
MySQL Database Suse
Soren,
to add to the above,
1. create a PreparedStatement to use as your INSERT statement (reusable)
2. loop through each line in the text file
3. for each line, use the String.split( "[\t;]" ) method to split your input line into columns (tab or semicolon, per your file format)
4. assign each String[] value to a prepared statement ? placeholder using the PreparedStatement.setString( 1, values[0] ) method
5. add the statement to the batch ( PreparedStatement.addBatch() )
6. after all the records are added to the batch, use the PreparedStatement.executeBatch() method to execute all the inserts at one time
7. commit the inserts
You'll find that by batching the inserts, you'll noticeably speed up your bulk insert.
Jamie
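The steps above could be sketched as follows; the table name, column count, and helper names are assumptions for illustration, and error handling is omitted:

```java
import java.io.BufferedReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class BulkLoader {

    // Build an INSERT statement with one '?' placeholder per column,
    // e.g. "INSERT INTO import_data VALUES (?, ?, ?)".
    public static String buildInsertSql(String table, int columnCount) {
        StringBuilder sql = new StringBuilder("INSERT INTO " + table + " VALUES (");
        for (int i = 0; i < columnCount; i++) {
            sql.append(i == 0 ? "?" : ", ?");
        }
        return sql.append(")").toString();
    }

    // Steps 1-7: one reusable PreparedStatement, each row added to the
    // batch, then a single executeBatch() and commit at the end.
    public static void load(Connection conn, BufferedReader in,
                            String table, int columnCount) throws Exception {
        conn.setAutoCommit(false);                               // step 7 is an explicit commit
        PreparedStatement ps =
            conn.prepareStatement(buildInsertSql(table, columnCount)); // step 1
        String line;
        while ((line = in.readLine()) != null) {                 // step 2
            String[] cols = line.split("[\t;]");                 // step 3: tab or semicolon
            for (int i = 0; i < columnCount; i++) {
                ps.setString(i + 1, cols[i]);                    // step 4
            }
            ps.addBatch();                                       // step 5
        }
        ps.executeBatch();                                       // step 6
        conn.commit();                                           // step 7
        ps.close();
    }
}
```

For 500K+ rows, a single batch may exhaust client memory; it is common to call executeBatch() every few thousand rows inside the loop instead of once at the end.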
     