posted 14 years ago
It is impossible to give you a good answer without knowing the intimate details of your deployment environment.
My employer has an application that exhibits this kind of behavior. The developers blame it on load balancing: until a certain load threshold is reached, the resource pools (cache, database connection pool, etc.) are not fully initialized, so performance suffers. Which raises the question: why not configure a minimum idle threshold rather than let all those resources expire, only to reinitialize them for the next user?
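To illustrate the idea of a minimum idle threshold, here is a hypothetical sketch (the class and method names are invented, not from any real pooling library): the idle-timeout sweep evicts surplus resources but never drains the pool below a configured floor, so a request arriving after a quiet period does not pay the full initialization cost.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical resource pool sketch. Real pooling libraries expose the
// same idea through settings like a minimum idle size and an idle timeout.
class WarmPool {
    private final Deque<String> idle = new ArrayDeque<>();
    private final int minIdle;   // floor kept warm across idle periods
    private int created = 0;

    WarmPool(int minIdle) { this.minIdle = minIdle; }

    // Stands in for an expensive initialization, e.g. opening a DB connection.
    private String create() { return "conn-" + (++created); }

    String acquire() {
        String r = idle.pollFirst();
        // Cold start only when the pool has been drained below demand.
        return (r != null) ? r : create();
    }

    void release(String r) { idle.addFirst(r); }

    // Idle-timeout sweep: evict surplus resources, but never below minIdle.
    void evictIdle() {
        while (idle.size() > minIdle) {
            idle.pollLast();  // a real pool would close the resource here
        }
    }

    int idleCount() { return idle.size(); }
    int totalCreated() { return created; }
}
```

With a floor of two, a burst of four requests creates four resources; after the sweep, two stay warm, so the next two requests are served without reinitialization.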