I currently have a web service deployed on an Oracle 10.1.2.0.2 application server, and I have set the session-timeout to 3 (i.e. 3 minutes) in web.xml.
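For reference, this is the relevant web.xml fragment (a minimal sketch of the standard servlet configuration):

    <!-- web.xml: expire idle sessions after 3 minutes -->
    <session-config>
        <session-timeout>3</session-timeout>
    </session-config>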
I have a client that makes 2 web service requests every second. Looking at the application debug logs (log4j), the sessions all seem to be destroyed by the server at the same time. So if the client has been calling for 10 minutes, that equates to 2 * 60 * 10 = 1,200 sessions created on the app server, and the server then attempts to destroy all 1,200 of them at the same time. Why?
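This is roughly how I observe the lifecycle in the logs (a minimal sketch of an HttpSessionListener; the class name is just an example, and it would be registered via a <listener> element in web.xml):

    import javax.servlet.http.HttpSessionEvent;
    import javax.servlet.http.HttpSessionListener;
    import org.apache.log4j.Logger;

    // Logs each session's creation and destruction so the
    // timestamps show up in the application debug log.
    public class SessionLifecycleLogger implements HttpSessionListener {

        private static final Logger log =
                Logger.getLogger(SessionLifecycleLogger.class);

        public void sessionCreated(HttpSessionEvent event) {
            log.debug("Session created: " + event.getSession().getId());
        }

        public void sessionDestroyed(HttpSessionEvent event) {
            log.debug("Session destroyed: " + event.getSession().getId());
        }
    }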
I thought that the session lifecycle for a given user starts with that user's first request and ends when the session terminates (for example, when the user quits the application or the session times out).