Application Crashing with Error - java.net.SocketException: Too many open files

Vikram Chelar
Greenhorn

Joined: Aug 25, 2012
Posts: 2
Dear All,

Over the last few days, our production retail application has started crashing with the error 'ERROR [org.apache.tomcat.util.net.JIoEndpoint] (ajp-0.0.0.0-8009-Acceptor-0) Socket accept failed: java.net.SocketException: Too many open files'. We urgently need help resolving this tricky production issue. To give more context, the application is outlined below.

Background:
The application is web based and is hosted on three separate servers, structured as follows:
Web server running Apache for Internet access (connected to the app server via AJP port 8009)
App server running Apache and JBoss for intranet access (Apache connects to JBoss via HTTP port 8080)
DB server running Oracle 10g for the database

Below is the software environment of all three servers, taken from the JBoss startup log:

09:35:24,985 INFO [AbstractServer] Starting: JBossAS [6.0.0.Final "Neo"]
09:35:26,393 INFO [ServerInfo] Java version: 1.6.0_24,Sun Microsystems Inc.
09:35:26,394 INFO [ServerInfo] Java Runtime: Java(TM) SE Runtime Environment (build 1.6.0_24-b07)
09:35:26,394 INFO [ServerInfo] Java VM: Java HotSpot(TM) 64-Bit Server VM 19.1-b02,Sun Microsystems Inc.
09:35:26,394 INFO [ServerInfo] OS-System: Linux 2.6.18-194.el5,amd64

The application has been hosted and available since 10-Oct-2011 and worked fine with a consistent load of about 50 users until 20-Aug-2012. On 21-Aug-2012 the application was no longer accessible via the Internet or intranet. When we checked the JBoss server log files under /app/jboss/jboss-6.0.0.Final/server/default/log, all five log files contained the exception below:

2012-08-26 02:54:16,871 ERROR [org.apache.tomcat.util.net.JIoEndpoint] (ajp-0.0.0.0-8009-Acceptor-0) Socket accept failed: java.net.SocketException: Too many open files
at java.net.PlainSocketImpl.socketAccept(Native Method) [:1.6.0_24]
at java.net.PlainSocketImpl.accept(PlainSocketImpl.java:408) [:1.6.0_24]
at java.net.ServerSocket.implAccept(ServerSocket.java:462) [:1.6.0_24]
at java.net.ServerSocket.accept(ServerSocket.java:430) [:1.6.0_24]
at org.apache.tomcat.util.net.DefaultServerSocketFactory.acceptSocket(DefaultServerSocketFactory.java:61) [:6.0.0.Final]
at org.apache.tomcat.util.net.JIoEndpoint$Acceptor.run(JIoEndpoint.java:343) [:6.0.0.Final]
at java.lang.Thread.run(Thread.java:662) [:1.6.0_24]

Attached is one of the server log files for your reference. We also checked the open file counts through PuTTY; the output is below:

[tfouser@abc ~]$ /usr/sbin/lsof -p 26844 -l | wc -l
5
[tfouser@abc ~]$ /usr/sbin/lsof -l | wc -l
1455
[tfouser@abc ~]$ cat /proc/sys/fs/file-max
3238931
[tfouser@abc ~]$ cat /proc/sys/fs/file-nr
2550 0 3238931
[tfouser@abc~]$ ulimit -n
1024
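
For completeness, the per-process descriptor count can also be read straight from /proc. The commands below are only a sketch of that approach; 26844 is just the PID used above, and pgrep may or may not be installed on this box:

# locate the JBoss java process (assumes pgrep is available)
pgrep -f jboss

# count the file descriptors currently held by that process
ls /proc/26844/fd | wc -l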

We restarted the JBoss server multiple times and also rebooted the OS, but the problem persists.

Please help us identify what and where the issue could be. Any insight into this highly critical production issue would be greatly appreciated.

Thanks!

Henry Wong
author
Sheriff

Joined: Sep 28, 2004
Posts: 18532

Vikram Chelar wrote:
[tfouser@abc~]$ ulimit -n
1024

We restarted the JBoss server multiple times and also rebooted the OS, but the problem persists.

Please help us identify what and where the issue could be. Any insight into this highly critical production issue would be greatly appreciated.


Well, if this is a highly critical production system, then why is it allowed only 1024 file descriptors? Depending on what the services are doing, that many file descriptors can easily be used up by a few hundred users, or even a few dozen.

How about increasing it?
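
A minimal sketch of one common way to do that on Linux follows; the user name "jboss" and the numbers are only placeholders, so adjust them for however your JBoss process actually runs:

# /etc/security/limits.conf -- raise the per-user open-file limit
jboss    soft    nofile    8192
jboss    hard    nofile    16384

# log in again as that user (so the new limit is picked up), restart JBoss, then verify
ulimit -n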

Henry

Books: Java Threads, 3rd Edition, Jini in a Nutshell, and Java Gems (contributor)
Vikram Chelar
Greenhorn

Joined: Aug 25, 2012
Posts: 2
Henry Wong wrote:
Well, if this is a highly critical production system, then why is it allowed only 1024 file descriptors? Depending on what the services are doing, that many file descriptors can easily be used up by a few hundred users, or even a few dozen.

How about increasing it?

Henry


Hi Henry,

Thanks for the reply. Increasing the ulimit was one of my options, but the customer's systems department wants statistics showing that all 1024 file descriptors were actually used up at the time of the crash.
Please let me know how to track, through PuTTY, whether all 1024 are being used.
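
Would something along the lines of the loop below be a reasonable way to capture that evidence? The PID, interval, and output file are just placeholders:

# sample the JBoss process's open-descriptor count once a minute
while true; do
    echo "$(date) $(ls /proc/26844/fd | wc -l)" >> /tmp/jboss-fd-count.log
    sleep 60
done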

Thanks!
 