We were running a performance test and bombarding the application server with requests. Soon we ran into a "Network adapter could not establish the connection" error. After checking, I found that no connection pool had been configured at all, which meant the whole application was running without one!
I was using Oracle Application Server with a native data source. Someone said pooling just makes things faster and wouldn't necessarily cause us to run out of connections. So what happens if there is no connection pooling, just a plain old data source? Is the number of available connections (severely) limited, or unlimited?
To me it seemed straightforward that this was caused by the missing connection pooling, but I didn't know how to answer that question. Could anyone lend a hand?
As I understand it, the problem was not caused by "not using connection pooling". If you've exhausted your connections, it is almost certainly because the application is not releasing them by closing them properly.
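To illustrate the leak pattern without needing a real database, here is a minimal sketch using a hypothetical `FakeConnection` class as a stand-in for `java.sql.Connection`. The same thing happens with real JDBC code: if an exception is thrown before `close()`, the connection is never returned to the server (try-with-resources, or a `finally` block in older Java, guarantees the close).

```java
// Stand-in for java.sql.Connection, purely to demonstrate the leak
// pattern without a database. (Hypothetical class for this sketch.)
class FakeConnection implements AutoCloseable {
    static int openCount = 0;           // how many connections are open
    FakeConnection() { openCount++; }
    public void close() { openCount--; }
}

public class LeakDemo {
    // Leaky style: if doWork() throws, close() is never reached
    // and the connection stays open on the server side.
    static void leaky() {
        FakeConnection c = new FakeConnection();
        doWork();      // may throw
        c.close();     // skipped on exception -> connection leaked
    }

    // Safe style: try-with-resources (or try/finally before Java 7)
    // closes the connection even when an exception is thrown.
    static void safe() {
        try (FakeConnection c = new FakeConnection()) {
            doWork();
        }
    }

    static void doWork() { throw new RuntimeException("query failed"); }

    public static void main(String[] args) {
        try { leaky(); } catch (RuntimeException ignored) {}
        // one connection leaked
        System.out.println("open after leaky: " + FakeConnection.openCount);

        try { safe(); } catch (RuntimeException ignored) {}
        // safe() leaked nothing; count is unchanged
        System.out.println("open after safe:  " + FakeConnection.openCount);
    }
}
```

Under load, every call to the leaky version permanently consumes one of the DB server's connection slots, which is exactly how a load test exhausts them.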
You are right that if you had been using connection pooling, this problem would likely not have occurred. That is because the pool manager on the application server takes care of closing a connection even if your program doesn't. You can see this in action in JBoss, which prints a stack trace every time it closes a connection for you (I don't know whether OAS does the same).
The maximum number of connections is not unlimited; it is set as a parameter in the DB server's configuration. However, that limit is completely independent of whether you use connection pooling. You must understand that a connection pool is nothing but the same JDBC code with a wrapper around it that manages the connections more efficiently, freeing you from the headache of doing it yourself (which can be error-prone, as in this case).
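To make the "just a wrapper" point concrete, here is a minimal, hypothetical pool sketch (not how OAS or any real pool is implemented, just the core idea): acquiring reuses an idle connection or opens a new one up to a fixed maximum, and releasing returns the connection to the pool instead of physically closing it.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Supplier;

// Minimal illustrative pool: a wrapper that hands out and takes back
// connection objects, capped at a fixed maximum. Real pools do the
// same thing with java.sql.Connection, plus timeouts, validation, etc.
class SimplePool<T> {
    private final Deque<T> idle = new ArrayDeque<>();
    private final Supplier<T> factory;  // how to open a new connection
    private final int max;              // hard cap, like the DB server's limit
    private int inUse = 0;

    SimplePool(int max, Supplier<T> factory) {
        this.max = max;
        this.factory = factory;
    }

    synchronized T acquire() {
        if (!idle.isEmpty()) {          // reuse an idle connection
            inUse++;
            return idle.pop();
        }
        if (inUse >= max)               // cap reached: refuse, don't open more
            throw new IllegalStateException("pool exhausted");
        inUse++;
        return factory.get();           // open a brand-new connection
    }

    synchronized void release(T conn) {
        inUse--;
        idle.push(conn);                // returned to the pool, not closed
    }
}
```

Usage: with `max = 2`, two `acquire()` calls succeed, a third fails with "pool exhausted", and after a `release()` the next `acquire()` hands back the same object rather than opening a new one. That is the whole trick: the pool enforces the limit and recycles connections, but the underlying JDBC calls are unchanged.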