Sorry if this isn't the right subforum.
I need some direction on an approach for a design.
I have 2 Tomcat servers hosting 2 instances of a user-facing app. This app communicates with multiple (3 or 4) Tomcat instances of a web-services app hosted on other servers, using SOAP and URLConnection (file stream) for data. Requests to the Tomcat instances are distributed by a load balancer.
One servlet in the user-facing app is used very frequently. When it executes, it opens a URLConnection to a service hosted on the web-services app to get the stream of data. That particular service hits the DB to get the data, using a pool of around 20 DB connections.
Due to the heavy load, the server sometimes crashes with OutOfMemoryError (DB connections becoming unavailable, etc.). I am planning to restrict the number of concurrent requests going through this particular 'servlet app => URLConnection => web-services app' path and queue up requests to an extent.
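To make the idea concrete, this is roughly what I mean by restricting and queuing, sketched with a Semaphore (just one possibility; the class and numbers are made up, not from my actual code):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Rough sketch: cap concurrent calls into the backend service and make
// extra requests wait briefly for a slot instead of piling up until OOM.
public class ServiceThrottle {
    private final Semaphore permits;
    private final long waitMillis;

    public ServiceThrottle(int maxConcurrent, long waitMillis) {
        // fair = true so waiting requests are served in arrival order
        this.permits = new Semaphore(maxConcurrent, true);
        this.waitMillis = waitMillis;
    }

    /** Returns false if no slot frees up within the wait window. */
    public boolean tryAcquire() throws InterruptedException {
        return permits.tryAcquire(waitMillis, TimeUnit.MILLISECONDS);
    }

    public void release() {
        permits.release();
    }
}
```

In the servlet's doGet, the call would be wrapped like: if tryAcquire() fails, respond with 503 (or an "app is busy" page); otherwise open the URLConnection in a try block and call release() in finally. But since each Tomcat instance would get its own Semaphore, the cap is per-instance, not global, which is part of what I'm asking about.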
I need advice on the best way to achieve this. What kind of mechanism could be implemented? Which app should enforce the restriction, considering the load balancer and multiple instances?
Is ThreadPoolExecutor (Java 5) a good choice for this design?
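If ThreadPoolExecutor is the way to go, I imagine it would look something like this: a fixed number of workers for the backend call, a bounded queue for waiting requests, and a rejection policy for when the queue is full (class name and sizes here are placeholders, not from real code):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.Callable;
import java.util.concurrent.Future;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch: bounded pool + bounded queue so excess requests are rejected
// fast instead of exhausting memory or DB connections.
public class BackendCallPool {
    private final ThreadPoolExecutor pool;

    public BackendCallPool(int workers, int queueSize) {
        pool = new ThreadPoolExecutor(
                workers, workers,                          // fixed pool size
                0L, TimeUnit.MILLISECONDS,                 // no idle timeout
                new ArrayBlockingQueue<Runnable>(queueSize),
                new ThreadPoolExecutor.AbortPolicy());     // reject when full
    }

    /** Submits the backend call; throws RejectedExecutionException when saturated. */
    public <T> Future<T> submit(Callable<T> backendCall) {
        return pool.submit(backendCall);
    }

    public void shutdown() {
        pool.shutdown();
    }
}
```

The servlet would submit the URLConnection work as a Callable and block on the Future (with a timeout), catching RejectedExecutionException to return a "busy" response. Again, this would throttle per Tomcat instance only.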
Thanks in Advance
Subject: Managing concurrent processing of a particular service in a webapp