Hi there experts... Currently we're importing a large number of rows to our Sybase db server using a tool called bcp (bulk copy). This requires the client to have the Sybase client package installed. We would, however, like to build a Java tool for this, to fit into our portfolio of applications. Is there anything in the JDBC API that lets us do this in an efficient way (we're talking 500K+ rows with 5-6 columns)? Best regards, Søren Berg Glasius
Soren, to add to the above:
1. Create a PreparedStatement to use as your INSERT statement (it is reusable).
2. Loop through each line in the text file.
3. For each line, use the String.split(",") method to split the input line into columns.
4. Assign each String value to a prepared-statement ? placeholder using the PreparedStatement.setString(int, String) method.
5. Add the statement to the batch with PreparedStatement.addBatch().
6. After all the records are added to the batch, use the PreparedStatement.executeBatch() method to execute all the inserts at one time.
7. Commit the inserts.
You'll find that by batching the inserts, you'll noticeably speed up your bulk insert.
Jamie
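The steps above can be sketched roughly as follows. The table name mytable, its column names, and the batch size are placeholders, not part of the original posts; you would substitute your own schema and a Sybase JDBC connection. One extra refinement over the list above: with 500K+ rows it is safer to flush the batch every few thousand rows rather than accumulate everything in one executeBatch() call, which bounds the driver's memory use.

```java
import java.io.BufferedReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.PreparedStatement;

public class CsvBatchInsert {

    // Step 3: split one CSV line into columns (naive split -- no quoted fields)
    static String[] parseLine(String line) {
        return line.split(",");
    }

    // Steps 1-7 for a hypothetical three-column table "mytable".
    // The caller supplies an open Connection (e.g. from jConnect for Sybase)
    // and a Reader over the text file.
    static void bulkInsert(Connection conn, Reader in) throws Exception {
        conn.setAutoCommit(false); // step 7: commit once at the end, not per row

        // Step 1: one reusable INSERT statement with ? placeholders
        PreparedStatement ps = conn.prepareStatement(
                "INSERT INTO mytable (col1, col2, col3) VALUES (?, ?, ?)");
        BufferedReader reader = new BufferedReader(in);

        final int BATCH_SIZE = 1000; // flush periodically to bound memory
        int count = 0;
        String line;
        while ((line = reader.readLine()) != null) { // step 2: loop over lines
            String[] cols = parseLine(line);
            for (int i = 0; i < cols.length; i++) {
                ps.setString(i + 1, cols[i]); // step 4: JDBC parameters are 1-based
            }
            ps.addBatch(); // step 5
            if (++count % BATCH_SIZE == 0) {
                ps.executeBatch(); // step 6: send accumulated inserts to the server
            }
        }
        ps.executeBatch(); // flush any remainder
        conn.commit();     // step 7
        ps.close();
    }
}
```

A usage sketch would be bulkInsert(DriverManager.getConnection(yourSybaseUrl, user, pass), new FileReader("data.txt")); the URL format depends on your driver.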