If I simultaneously create 1000 client sockets, I get connection refused exceptions on a few of them, since the server can only handle a certain number of connections. My understanding is that every time the server accepts, it creates a new socket to communicate with that client, and my guess is that if the server runs out of ports, the client gets connection refused. So I guess the number of connections the server can grant depends on the number of available ports and on how long the server takes to finish a job (so that the unavailable socket becomes available again to accept a new client connection). Is there a way to determine the optimal number of connections a socket server can handle at a given time?
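For what it's worth, the refusals usually come from the server's listen backlog filling up faster than the server can accept. Here is a minimal sketch (in Python, just for brevity; the thread doesn't name a language) that provokes the situation on purpose: the server listens with a tiny backlog and never accepts, so excess clients fail. Exactly how they fail is OS-dependent — some systems refuse the connection outright, Linux tends to drop the SYN so the client times out instead, which is why the sketch counts both as failures:

```python
import socket

# Server side: listen with a deliberately tiny backlog and never accept,
# so the pending-connection queue fills almost immediately.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
server.listen(1)                # tiny backlog, for demonstration only
port = server.getsockname()[1]

ok, failed = 0, 0
clients = []
for _ in range(12):
    c = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    c.settimeout(0.3)           # short timeout so dropped SYNs fail fast
    try:
        c.connect(("127.0.0.1", port))
        clients.append(c)       # keep it open so the queue stays full
        ok += 1
    except OSError:             # ConnectionRefusedError or a timeout
        failed += 1
        c.close()

print(f"connected={ok} failed={failed}")
for c in clients:
    c.close()
server.close()
```

With a backlog of 1 only the first couple of connects land in the queue; the rest fail, which mirrors the 1000-client symptom on a smaller scale.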
Trial and error is about all that comes to mind. Of course, we call that Stress & Performance Testing when talking to the boss. Your OS may document some maximum, or even let you configure it. But there will be enough variables - other things running on the machine, how many threads the OS can manage before it thrashes, what other resources you use while servicing a request - that coming up with a single number will be hard.
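A trial-and-error harness can be as simple as ramping up the number of simultaneous connections until some start to fail, and reporting the last level that fully succeeded. The sketch below (Python, with illustrative names like serve_echo and try_level - not any standard API) runs a toy accept loop in a background thread; a real test would point the client side at the actual server and do real work per request:

```python
import socket
import threading

def serve_echo(server):
    # Toy accept loop: a real server does actual work per connection,
    # which is exactly what slows it down under load.
    while True:
        try:
            conn, _ = server.accept()
        except OSError:          # server socket closed; stop serving
            return
        try:
            conn.sendall(b"ok")
        except OSError:
            pass
        conn.close()

def try_level(port, n):
    """Open n connections at once; return how many succeeded."""
    ok, socks = 0, []
    for _ in range(n):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.settimeout(1.0)
        try:
            s.connect(("127.0.0.1", port))
            socks.append(s)      # hold it open so connections overlap
            ok += 1
        except OSError:
            s.close()
    for s in socks:
        s.close()
    return ok

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(128)
port = server.getsockname()[1]
threading.Thread(target=serve_echo, args=(server,), daemon=True).start()

best = 0
for n in (10, 50, 100):          # ramp up the load level by level
    if try_level(port, n) == n:
        best = n
print("highest fully successful level:", best)
server.close()
```

The numbers you get are only meaningful on the machine and workload you measured, which is the point of the exercise: measure, don't guess.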
A good question is never answered. It is not a bolt to be tightened into place but a seed to be planted and to bear more seed toward the hope of greening the landscape of the idea. -- John Ciardi
subject: How to determine the OPTIMAL number of connections accepted by the server socket