Only if the machine on which the database runs can "see" the directory using standard filesystem/network protocols. You need to create a DIRECTORY object, and the path for that directory must be accessible to the database. So in a specific setup, yes, but generally, no.
You can use SQL*Loader to upload data that only the client can see. You could also upload the data as a temporary BLOB or CLOB to the database and process it there with PL/SQL, though I'm not sure I'd choose that path.
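For the SQL*Loader route, a minimal sketch of a control file for tab-separated data might look like this (the table name, column list, and file name are made-up placeholders; adjust them to your schema):

```sql
-- load.ctl: hypothetical control file for a tab-delimited text file
LOAD DATA
INFILE 'data.txt'
APPEND
INTO TABLE target_table
FIELDS TERMINATED BY X'09'   -- X'09' is the tab character
TRAILING NULLCOLS
(col1, col2, col3)
```

It would then be run from the client machine with something like `sqlldr userid=your_user/your_password control=load.ctl` — the files never need to reach the server's filesystem.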
But if you can put the data into a directory the Oracle database can access, you have better options than PL/SQL, namely external tables. That way you may be able to process the data in pure SQL, which would probably be faster, less error-prone, and less laborious than the PL/SQL route.
The FAQ page for this forum contains links to the Oracle JDBC Developer's Guide, which is, in my opinion, the best resource for this task. Pick the edition that corresponds to the version of the JDBC driver you'll use — and you should probably use the latest version, as discussed on the FAQ page too.
However, it looks like a pretty complicated design. If you tell us what you need to achieve, e.g. how you will process the data in PL/SQL, someone might offer a different approach.
If your data files are in a sensible format (e.g. CSV or fixed-format character records) and you are using an Oracle (server-side) directory, then I can recommend using external tables. If you're trying to load things like spreadsheets with multiple tabs, then you'll need to do some pre-processing, e.g. export the spreadsheet data as CSV first.
Oracle external tables allow you to treat external files as if they were normal database tables, i.e. you can run SQL SELECTs against them, join them with other tables, and so on. This makes it very easy to import data from an external table into a target table using SQL, and you can perform any extra processing with PL/SQL if needed.
If you specify the Oracle Loader access driver (instead of Oracle Datapump) when defining your external table, then you can use similar syntax to SQL*Loader to specify the detailed formats etc for loading/transforming/ignoring your individual data fields/records. External tables are fast, flexible and easier to maintain than arbitrary SQL*Loader scripts etc.
The external table is defined in advance via the CREATE TABLE statement, which will include the filename and record/field specifications, but it only actually tries to load data when you do a SELECT on the table. This means that any errors in your data formats may only be spotted when you actually try to read the data, not when the external table is created.
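As a rough sketch of what that CREATE TABLE looks like for a tab-separated file (directory path, table, and column names here are all illustrative, not from the original question):

```sql
-- Directory object the database can read from (path is a placeholder)
CREATE OR REPLACE DIRECTORY data_dir AS '/u01/app/loads';

-- External table over a tab-separated file, using the Oracle Loader driver
CREATE TABLE ext_monthly_data (
  col1  VARCHAR2(50),
  col2  VARCHAR2(50),
  col3  NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY data_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY X'09'     -- tab-separated
    MISSING FIELD VALUES ARE NULL
  )
  LOCATION ('data.txt')
)
REJECT LIMIT UNLIMITED;

-- Loading is then plain SQL; format errors in the file surface here,
-- at SELECT time, not at CREATE TABLE time
INSERT INTO target_table (col1, col2, col3)
SELECT col1, col2, col3 FROM ext_monthly_data;
```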
Also, you cannot write to an external table that was defined using the Oracle Loader access driver, e.g. to write extra CSV records out to the file, so you would have to use another approach for this (UTL_FILE, Java, reporting tools, Oracle APEX etc). You can write to an external table created using the Oracle Datapump access driver, but this is a proprietary Oracle export/import format, so it may be less useful to you.
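If you do need to write delimited records out to a file on the server, a small UTL_FILE sketch looks like this (directory object, file name, and query are placeholders):

```sql
-- Writing CSV records with UTL_FILE; DATA_DIR must be a DIRECTORY object
-- the database user has WRITE access to
DECLARE
  f UTL_FILE.FILE_TYPE;
BEGIN
  f := UTL_FILE.FOPEN('DATA_DIR', 'out.csv', 'w');
  FOR r IN (SELECT col1, col2 FROM target_table) LOOP
    UTL_FILE.PUT_LINE(f, r.col1 || ',' || r.col2);
  END LOOP;
  UTL_FILE.FCLOSE(f);
END;
/
```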
My requirement: every month the user receives a new zip file containing text files with tab-separated data (these text files are database data). When the user imports the zip file in the client-side application, the text files are unzipped into the program's installation directory, and when he tries to save them, the data in those text files has to be saved into database tables. We have written a PL/SQL procedure to upload the data into the database when the files are in a server-side Oracle directory, but I need to load the data from the client side (in my case, from the software's installation directory).
On the front end we are using the PowerBuilder tool: we developed a window, and in this window we programmed the unzipping of the selected file.
I understand you have a thick client. I'd say no sane administrator would allow your workstations to access the database server's filesystem directly (unless your environment is very specific), so you have to find another option. I can envision these possibilities (sorted from the one I like best to the one I like least):
1) Unless the amount of data is really, really large, ditch the PL/SQL code and do it all from Java. Using PreparedStatements and update batching with the latest JDBC driver will give you really good performance (an insert-as-select using external tables would quite certainly be faster, but not necessarily once you account for the need to transfer the flat files onto the database server).
... a long, long gap here ...
2) If you use some kind of application server (generally possible even with thick clients), perhaps you could set up some directory accessible by both the application server and database server, upload the flat files there and process them using external tables.
3) If the PL/SQL doesn't actually need to access files/external tables, perhaps you could just upload the data as a temporary BLOB/CLOB (or into a temporary table) and process it without ever storing it as an external file.
4) If you already have a directory set up on the database side, you might use a temporary BLOB/CLOB to get the flat file to the database and then use PL/SQL file utilities to unload it to the directory, where it can be processed by PL/SQL or external tables. This has lots of overhead from shuffling the data around and definitely looks ugly.
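Option 1 above can be sketched roughly as follows. The table name, column list, and connection details are placeholders (not from the original question); the point is the PreparedStatement reuse and periodic executeBatch() calls:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class TabFileLoader {
    static final int BATCH_SIZE = 500; // send rows to the DB every 500 records

    // Split one tab-separated record into its fields;
    // the -1 limit keeps trailing empty fields instead of dropping them.
    static String[] parseLine(String line) {
        return line.split("\t", -1);
    }

    static void load(Connection conn, Path file) throws IOException, SQLException {
        // target_table and its columns are hypothetical
        String sql = "INSERT INTO target_table (col1, col2, col3) VALUES (?, ?, ?)";
        conn.setAutoCommit(false);
        try (BufferedReader in = Files.newBufferedReader(file);
             PreparedStatement ps = conn.prepareStatement(sql)) {
            int pending = 0;
            String line;
            while ((line = in.readLine()) != null) {
                String[] f = parseLine(line);
                ps.setString(1, f[0]);
                ps.setString(2, f[1]);
                ps.setString(3, f[2]);
                ps.addBatch();
                if (++pending % BATCH_SIZE == 0) {
                    ps.executeBatch();   // flush a full batch to the server
                }
            }
            ps.executeBatch();           // flush the remainder
            conn.commit();
        }
    }

    public static void main(String[] args) throws Exception {
        // args: JDBC URL (e.g. jdbc:oracle:thin:@host:1521/service), user, password, file
        try (Connection conn = DriverManager.getConnection(args[0], args[1], args[2])) {
            load(conn, Path.of(args[3]));
        }
    }
}
```

With the Oracle thin driver there is no client-side Oracle installation to manage, which fits the thick-client deployment described above.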
I’ve looked at a lot of different solutions, and in my humble opinion Aspose is the way to go. Here’s the link: http://aspose.com