Large data file read

Jon Trautman
Greenhorn

Joined: Mar 12, 2004
Posts: 1
Hello,
I'm trying to read a file containing very long records on a Unix box. I have set up the following PL/SQL block:
DECLARE
  file_id  UTL_FILE.FILE_TYPE;
  linedata VARCHAR2(28000);
BEGIN
  file_id := UTL_FILE.FOPEN('/tmp', 'largefile.txt', 'r');
  LOOP
    UTL_FILE.GET_LINE(file_id, linedata);
    DBMS_OUTPUT.PUT_LINE(linedata);
  END LOOP;
  UTL_FILE.FCLOSE(file_id);
EXCEPTION
  WHEN UTL_FILE.READ_ERROR THEN
    RAISE_APPLICATION_ERROR(-20001, 'utl_file.read_error');
END;
I get utl_file.read_error (ORA-06512) when GET_LINE hits a line of roughly 6000 bytes; shorter lines read fine. Is there a line-length limit, or a workaround? I'm using Oracle 8i.
Any help please?
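One thing I'm wondering about (I haven't been able to confirm this): FOPEN is documented to default to a fairly small maximum line size when no max_linesize argument is supplied, so perhaps passing an explicit max_linesize is the workaround. Here is a rough sketch of what I mean, assuming the four-argument FOPEN overload is available on my 8i release:

DECLARE
  file_id  UTL_FILE.FILE_TYPE;
  linedata VARCHAR2(28000);
BEGIN
  -- Assumption: FOPEN accepts a fourth max_linesize argument on this release;
  -- 32767 is the documented upper bound for that parameter.
  file_id := UTL_FILE.FOPEN('/tmp', 'largefile.txt', 'r', 32767);
  LOOP
    UTL_FILE.GET_LINE(file_id, linedata);
    -- DBMS_OUTPUT has its own per-line limit on older releases,
    -- so only the first 255 bytes are echoed here.
    DBMS_OUTPUT.PUT_LINE(SUBSTR(linedata, 1, 255));
  END LOOP;
EXCEPTION
  WHEN NO_DATA_FOUND THEN
    -- End of file: close the handle instead of looping forever.
    UTL_FILE.FCLOSE(file_id);
  WHEN UTL_FILE.READ_ERROR THEN
    UTL_FILE.FCLOSE(file_id);
    RAISE_APPLICATION_ERROR(-20001, 'utl_file.read_error');
END;
/

Does that look like the right direction, or is the limit somewhere else?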
Thank you
Jon
 