I am working on a project where I have to import the data from a large (50 MB) Excel file into JavaDB.
First I converted the .xls file to a semicolon-delimited .csv file.
Originally I was using a BufferedReader to read the file line by line, each line being a row.
Like this:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public String[] loadSpreadSheet(String filename) {
    try {
        String currentLine;
        String[] temp = new String[1024];
        BufferedReader buffReader = new BufferedReader(new FileReader(filename));
        while ((currentLine = buffReader.readLine()) != null) {
            temp = currentLine.split(";"); // the .csv is semicolon delimited
            /* debug */
            for (int i = 0; i < temp.length; i++) {
                System.out.println(temp[i]); // to check that the file was getting read correctly, it was.
            }
            /* /debug */
        }
        buffReader.close();
        return temp; // note: by this point temp only holds the fields of the last line read
    } catch (IOException e) {
        // etc.
        return null;
    }
}
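(Writing this out, I notice that temp only ever holds the last line read by the time it returns. If I wanted the whole sheet in memory at once, I suppose I would collect the rows into a list instead, roughly like this:)

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

// Rough sketch: same reading loop, but keeping every row instead of
// overwriting the array on each pass.
public List<String[]> loadAllRows(String filename) throws IOException {
    List<String[]> rows = new ArrayList<String[]>();
    BufferedReader buffReader = new BufferedReader(new FileReader(filename));
    String currentLine;
    while ((currentLine = buffReader.readLine()) != null) {
        rows.add(currentLine.split(";"));
    }
    buffReader.close();
    return rows;
}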
So the function returns a string array, which I would then parse through in another function to fill in
the fields of a JavaDB (Derby) database. This would be done row by row until the whole file is read in.
In the function that called this, I would use the array to put the text field data into an SQL query string.
I realize this will take some time to do.
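(To give an idea of what I mean, here is a sketch using a PreparedStatement rather than concatenating the values into the query string by hand; MYTABLE, COL1 and COL2 are made-up names, since the real sheet has far more columns:)

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

// Sketch only: inserts one row of the spreadsheet, with made-up
// table and column names standing in for the real ones.
public void insertRow(Connection conn, String[] fields) throws SQLException {
    PreparedStatement ps = conn.prepareStatement(
            "INSERT INTO MYTABLE (COL1, COL2) VALUES (?, ?)");
    ps.setString(1, fields[0]);
    ps.setString(2, fields[1]);
    ps.executeUpdate();
    ps.close();
}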
I was given the hint to read it in page by page.
So I wondered if there was a built-in function that would do this instead, and came up with this:
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public void importData(Connection conn, String filename)
{
    try
    {
        String[] dbString = this.loadSpreadSheet(filename); // leftover from the row-by-row version
        Statement stmt = conn.createStatement(
                ResultSet.TYPE_SCROLL_SENSITIVE,
                ResultSet.CONCUR_UPDATABLE);
        // pass in the filename parameter rather than the literal text 'filename'
        String importTableQuery = "CALL SYSCS_UTIL.SYSCS_IMPORT_TABLE" +
                "(null, 'EXCELSPREADSHEET2007', '" + filename + "', ';', ',', null, 0)";
        stmt.executeUpdate(importTableQuery);
    }
    catch (SQLException e)
    {
        // etc.
    }
}
Anyhow, this is throwing all kinds of exceptions, so I am moving back to my original idea of row by row.
However, the rows in the .csv file contain mixed types of data: a row could be 100 integers, or a bunch of VARCHARs interspersed with INTEGERs.
I know I am missing something obvious.
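(The only approach I can think of is to test each field before binding it, something like the sketch below, and then call ps.setInt(...) when it returns true and ps.setString(...) otherwise. Whether that is the obvious thing, I don't know.)

// Rough sketch: guess whether a field holds an integer or text,
// so the insert code can choose the matching setter.
private static boolean looksLikeInteger(String field) {
    try {
        Integer.parseInt(field.trim());
        return true;
    } catch (NumberFormatException e) {
        return false;
    }
}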