The first thing to do is find out where the time is actually going. Is it validation? Populating the EJBs? Writing them back to the database?
Probably the #1 performance killer in a bulk database load is indexes. It's usually advisable to load the tables flat first and build the indexes on them afterwards; otherwise the load spends an inordinate amount of time rebalancing the indexes as each row goes in.
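A minimal sketch of the "load flat, then index" pattern, using Python's sqlite3 as a stand-in for Oracle (the table and index names here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")

# 1. Bulk-load the flat table first -- no index to rebalance on every insert.
rows = [(i, f"row-{i}") for i in range(200_000)]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)
conn.commit()

# 2. Build the index once, after all the data is in place.
conn.execute("CREATE INDEX idx_staging_id ON staging (id)")
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 200000
```

In Oracle the same idea is usually expressed by dropping (or disabling) the indexes before the load and rebuilding them afterwards.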
Unless there's some compelling reason to do otherwise, I'd skip Java for this kind of operation and use the native database utilities. In Oracle you can load one table from another with a CREATE TABLE ... AS SELECT statement, although that may not be sufficient if your validation process is complex. For the more complex cases, I'd generally either build a set of the keys of the good records and select against that, or copy everything and then tag or delete the bad records, depending on what works best for the situation.
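As a sketch of the CREATE-TABLE-from-SELECT idea, again with sqlite3 standing in for Oracle: the WHERE clause below is a placeholder for whatever validation rules apply, and the table and column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_load (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO raw_load VALUES (?, ?)",
                 [(1, 10.0), (2, -5.0), (3, 7.5), (None, 1.0)])

# Copy only the records that pass validation into the clean table;
# the bad rows simply stay behind in raw_load.
conn.execute("""
    CREATE TABLE clean AS
    SELECT id, amount
    FROM raw_load
    WHERE id IS NOT NULL AND amount >= 0
""")
conn.commit()

clean_count = conn.execute("SELECT COUNT(*) FROM clean").fetchone()[0]
print(clean_count)  # 2
```

The database does the copy entirely on its own side, so no rows ever cross the JDBC wire.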
Bulk loader utilities often split the difference. Setting up and committing a transaction is a fair amount of overhead, but as Indumuni mentioned, so is keeping a large work-in-progress/rollback queue.
So it's common for the bulk loaders to run in something like 1000 records per transaction.
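The batching pattern itself is straightforward; here is a sketch with sqlite3 standing in for Oracle (the table name and the 5,500-row workload are made up, and the batch size is the 1,000 records per transaction mentioned above):

```python
import sqlite3

BATCH_SIZE = 1000  # records per transaction, as bulk loaders commonly do

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, payload TEXT)")

rows = ((i, f"row-{i}") for i in range(5500))
batch = []
commits = 0
for row in rows:
    batch.append(row)
    if len(batch) == BATCH_SIZE:
        conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
        conn.commit()   # keeps the rollback/undo work per transaction bounded
        commits += 1
        batch.clear()
if batch:               # flush the final partial batch
    conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
    conn.commit()
    commits += 1

print(commits)  # 6 commits for 5500 rows at 1000 per batch
```

In JDBC the equivalent would be `setAutoCommit(false)` plus `addBatch()`/`executeBatch()` and a `commit()` every thousand rows.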
subject: reading and writing large data(200000) from oracle