This is my first post here, so first I'll just say "hi"!
I am working on a webapp project, and my manager wants me to come up with a model that can predict the scalability of the system, and then create some test scenarios to validate that model.
Our webapp/servlet is a little non-traditional in that it frequently receives sizable uploads to the server (100-200 KB) and serves rather large downloads from the server (1-25 MB). The client for the servlet is a desktop application, and all communication is over HTTP.
It seems as though the bottleneck in this system will be the communication between the clients and the servlet. But at work the whole system sits on fast Ethernet, so I never actually experience this bottleneck. So, I need to create an environment that simulates the clients being off-site.
So, my thought is that in the servlet's doGet and doPost methods, I just need to add a kind of "delay" so that the client does not get the response so quickly. The delay would attempt to compensate for the difference between the upload/download speeds over the Ethernet and those of off-site clients.
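If you do go the in-servlet route, here is one way it might look, just as a sketch in plain Java (the class name `ThrottledOutputStream` and the chosen rate are mine, not anything from your project): wrap the response's `OutputStream` so that each write sleeps long enough to hold the effective throughput to a target bandwidth, rather than inserting one fixed delay per request.

```java
import java.io.ByteArrayOutputStream;
import java.io.FilterOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Wraps any OutputStream and sleeps between chunks so the effective
// throughput never exceeds bytesPerSecond. In a servlet you would wrap
// response.getOutputStream() with this before streaming the download.
public class ThrottledOutputStream extends FilterOutputStream {
    private final long bytesPerSecond;
    private final long start = System.nanoTime();
    private long bytesWritten = 0;

    public ThrottledOutputStream(OutputStream out, long bytesPerSecond) {
        super(out);
        this.bytesPerSecond = bytesPerSecond;
    }

    @Override
    public void write(int b) throws IOException {
        write(new byte[] { (byte) b }, 0, 1);
    }

    @Override
    public void write(byte[] b, int off, int len) throws IOException {
        out.write(b, off, len);
        bytesWritten += len;
        // How long the transfer *should* have taken so far at the target rate.
        long expectedNanos = bytesWritten * 1_000_000_000L / bytesPerSecond;
        long elapsedNanos = System.nanoTime() - start;
        long sleepMillis = (expectedNanos - elapsedNanos) / 1_000_000L;
        if (sleepMillis > 0) {
            try {
                Thread.sleep(sleepMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new IOException("throttled write interrupted", e);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        // Simulate a ~50 KB/s link: writing 10 KB should take roughly 0.2 s.
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long t0 = System.currentTimeMillis();
        try (ThrottledOutputStream slow = new ThrottledOutputStream(sink, 50_000)) {
            byte[] chunk = new byte[1_000];
            for (int i = 0; i < 10; i++) {
                slow.write(chunk, 0, chunk.length);
            }
        }
        long elapsed = System.currentTimeMillis() - t0;
        System.out.println("wrote " + sink.size() + " bytes in " + elapsed + " ms");
    }
}
```

The advantage of throttling per-write rather than sleeping once per request is that a 25 MB download and a 100 KB upload are slowed proportionally, which is closer to what a slow link actually does.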
Do you think this will work? If not, what would you suggest? Are there any gotchas to this?
As an alternative to meddling with the server, you could create a proxy, sort of like TCPMON, which slows down all transfers by a controllable amount and also gathers statistics on total bytes sent, etc.
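A bare-bones version of such a proxy might look like this (a sketch only; the class name `ThrottlingProxy` and the pacing scheme are mine). It listens on a local port, forwards each connection to the real server, sleeps after every chunk so the rate stays near a target, and counts total bytes moved in either direction. Pointing the desktop client at the proxy's port instead of the server's leaves the servlet untouched:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.atomic.AtomicLong;

// Minimal throttling TCP proxy: accepts connections on a local port,
// forwards them to targetHost:targetPort, paces each direction to roughly
// bytesPerSecond, and keeps a running count of bytes transferred.
public class ThrottlingProxy implements Runnable {
    private final ServerSocket listener;
    private final String targetHost;
    private final int targetPort;
    private final long bytesPerSecond;
    public final AtomicLong totalBytes = new AtomicLong();

    public ThrottlingProxy(int listenPort, String targetHost, int targetPort,
                           long bytesPerSecond) throws IOException {
        this.listener = new ServerSocket(listenPort); // port 0 = pick any free port
        this.targetHost = targetHost;
        this.targetPort = targetPort;
        this.bytesPerSecond = bytesPerSecond;
    }

    public int port() { return listener.getLocalPort(); }

    @Override
    public void run() {
        try {
            while (true) {
                Socket client = listener.accept();
                Socket server = new Socket(targetHost, targetPort);
                // One pump thread per direction; both apply the same throttle.
                pump(client.getInputStream(), server.getOutputStream());
                pump(server.getInputStream(), client.getOutputStream());
            }
        } catch (IOException e) {
            // listener closed: shut down quietly
        }
    }

    private void pump(InputStream in, OutputStream out) {
        new Thread(() -> {
            byte[] buf = new byte[4096];
            try {
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();
                    totalBytes.addAndGet(n);
                    // Crude pacing: make this chunk take n / bytesPerSecond
                    // seconds end to end.
                    Thread.sleep(n * 1000L / bytesPerSecond);
                }
                // End of one direction also tears down the socket; fine for
                // simple request/response traffic, crude for keep-alive.
                out.close();
            } catch (Exception e) {
                // connection dropped; let the thread die
            }
        }).start();
    }
}
```

Because the delay lives entirely in the proxy, you can vary the simulated link speed per test run without redeploying the servlet, and `totalBytes` gives you the traffic statistics for free.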