Importing a large number of rows

 
Ranch Hand
Posts: 18944
Hi there experts...
Currently we're importing a large number of rows into our Sybase db server using a tool called bcp (bulk copy). This requires the client to have the Sybase Client package installed.
We would, however, like to make a Java tool for this - to fit into our portfolio of applications. But is there anything in the JDBC API that lets us do this in an efficient way (we're talking 500K+ rows with 5-6 columns)?
Best regards,
Søren Berg Glasius
 
Ranch Hand
Posts: 1879
MySQL Database Suse
Where are you importing from?
text file (delimited or block?) --> database
database1 --> database2
table1 --> table2 within the same database
 
Anonymous
Ranch Hand
Posts: 18944
It is a text file delimited by tab or semicolon.
 
Ranch Hand
Posts: 1143
Eclipse IDE Oracle Java
Hi Soren (sorry, I don't know how to produce the second letter of your name on my keyboard),
I'm probably stating the obvious here, but have you considered something like:
  • Read the input file (line by line)
  • Use "StreamTokenizer" (or "StringTokenizer") to parse the input record
  • Use (JDBC) batch inserts to insert a number of rows at a time

Good Luck,
Avi.
     
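The parsing step above can be sketched like this - a minimal example, assuming the tab/semicolon delimiters mentioned earlier (note that StringTokenizer silently skips empty fields, so String.split is often safer when columns may be blank):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;

public class LineParser {

    // Split one tab- or semicolon-delimited record into column values.
    static String[] parseLine(String line) {
        List<String> cols = new ArrayList<>();
        StringTokenizer tok = new StringTokenizer(line, "\t;");
        while (tok.hasMoreTokens()) {
            cols.add(tok.nextToken());
        }
        return cols.toArray(new String[0]);
    }

    public static void main(String[] args) {
        String[] cols = parseLine("1;Alice;42");
        for (String c : cols) {
            System.out.println(c);
        }
    }
}
```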
Jamie Robertson
Ranch Hand
Posts: 1879
MySQL Database Suse
Soren,
to add to the above,
1. create a PreparedStatement to use as your INSERT statement (reusable)
2. loop through each line in the text file
3. for each line, use the String.split( "\t" ) method (or ";" for your semicolon-delimited files) to split your input line into columns
4. assign each column value to a prepared statement parameter using the PreparedStatement.setString( 1, cols[0] ) method
5. add the statement to the batch ( PreparedStatement.addBatch() )
6. after all the records are added to the batch, use the PreparedStatement.executeBatch() method to execute all the inserts at one time
7. commit the inserts
You'll find that by batching the inserts, you'll noticeably speed up your bulk insert.
Jamie
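Putting those steps together might look like the sketch below. The table name, column names, and tab delimiter are placeholders - substitute your own. Flushing the batch every few thousand rows (rather than accumulating all 500K+ at once) keeps memory bounded; exact behavior depends on your JDBC driver:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class BulkImporter {

    static final int BATCH_SIZE = 1000;

    // Read a delimited text file and batch-insert its rows.
    // Table and column names here are hypothetical.
    public static void importFile(Connection conn, String path) throws Exception {
        conn.setAutoCommit(false); // commit once at the end (step 7)
        String sql = "INSERT INTO my_table (id, name, amount) VALUES (?, ?, ?)";
        try (PreparedStatement ps = conn.prepareStatement(sql);          // step 1
             BufferedReader in = new BufferedReader(new FileReader(path))) {
            String line;
            int count = 0;
            while ((line = in.readLine()) != null) {                     // step 2
                String[] cols = line.split("\t");                        // step 3
                ps.setString(1, cols[0]);                                // step 4
                ps.setString(2, cols[1]);
                ps.setString(3, cols[2]);
                ps.addBatch();                                           // step 5
                if (++count % BATCH_SIZE == 0) {
                    ps.executeBatch(); // flush periodically to bound memory
                }
            }
            ps.executeBatch();                                           // step 6
            conn.commit();                                               // step 7
        }
    }
}
```

With auto-commit off, a failure partway through leaves nothing committed, so the import can simply be retried from the start.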
     