I need to write a program that:
(1) Joins a few huge tables (and also maybe a few small static configuration tables), producing a VERY LARGE result set, and
(2) Iterates through the result set to write an extract file.
These tables are used intensively by thousands of users for online transaction processing.
However, the rows I am selecting are no longer being actively updated.
I know how to use JDBC to select my data and to iterate through the result set.
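To make the question concrete, here is a rough sketch of the kind of code I have in mind (the connection string, table names, and fetch size are made up for illustration):

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ExtractWriter {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details -- not a real database.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "pass")) {
            conn.setReadOnly(true);     // hint that this session only reads
            conn.setAutoCommit(false);

            // Hypothetical join across the large tables.
            String sql = "SELECT a.id, a.detail, b.route_code "
                       + "FROM big_table_a a "
                       + "JOIN big_table_b b ON b.a_id = a.id "
                       + "WHERE a.status = 'CLOSED'";

            try (PreparedStatement ps = conn.prepareStatement(
                    sql, ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
                ps.setFetchSize(5000);  // stream rows rather than buffering the whole set
                try (ResultSet rs = ps.executeQuery();
                     BufferedWriter out = new BufferedWriter(
                             new FileWriter("extract.txt"))) {
                    while (rs.next()) {
                        out.write(rs.getString("id") + "|" + rs.getString("route_code"));
                        out.newLine();
                    }
                }
            }
        }
    }
}
```

The forward-only, read-only result set plus an explicit fetch size is what I currently plan to use for the iteration; my question is whether anything beyond this is needed on the locking side.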
Is there anything special I need to do to ensure that the tables themselves are not locked while my program runs -- that my extract does not interfere with ongoing transaction processing of the more current rows?
Or is this automatically taken care of by the design of Oracle's locking mechanisms?