[Sussex] PANIC: Too many files open error
Dominic Clay
dominic.clay at europrospectus.com
Tue Dec 17 10:09:03 UTC 2002
Hi all,
I am having a nightmare with some Java code I am working on, and people
are starting to get uneasy with me...
It is giving me a 'Too many files open' error.
The prog is basically making lots and lots of JDBC calls.
Now the bit I am unsure of is this piece of advice I found on the net:
http://www.openldap.org/lists/openldap-software/200203/msg00482.html
<snip>
You're exceeding the default 8192 (if this is Linux) file handle limit.
This is a pretty low limit; raise it with
"sysctl -w fs.file-max=32767" as root. If you're on a RH box, put
"fs.file-max = 32767" in /etc/sysctl.conf so it gets redone next time
you boot.
</snip>
Is this a sensible thing to do? Something tells me I am just hiding from
the real problem, which could be that my connection pooling is not really
up to scratch!
That said, how much harm would I be doing by changing this setting? If it
worked, it would buy me time to fix things properly.
What exactly is a 'file handle'? Why would I be exceeding the limit? Is it
something to do with TCP connections?
_My_Research_
I have done a 'cat file-nr' in /proc/sys/fs during the process, and I have
noticed the middle number (free handles?) reaching almost zero just before
the failure!
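(In case it is useful, here is a rough sketch of how I could poll those
numbers from inside the JVM instead of catting the file by hand; the path
is the standard Linux one, the class name is just made up:)

import java.io.BufferedReader;
import java.io.FileReader;

// Rough sketch only: poll /proc/sys/fs/file-nr and print its three fields
// (allocated handles, free handles, system-wide maximum) every few seconds.
public class FileHandleWatcher {
    public static void main(String[] args) throws Exception {
        while (true) {
            BufferedReader in = new BufferedReader(
                    new FileReader("/proc/sys/fs/file-nr"));
            try {
                System.out.println("file-nr: " + in.readLine());
            } finally {
                in.close(); // don't leak a handle while watching handles
            }
            Thread.sleep(5000);
        }
    }
}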
I have also tried 'netstat -vat', which shows a crazy number of
'ESTABLISHED' connections to the DB, even though I have been meticulous
about setting references to null whenever I can (all I can do in Java, as
far as I know).
I have tried forcing Java garbage collection during the process, but all
this does is slow things down, so it just takes longer before it crashes
with the same error.
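What I am starting to suspect is that I should be calling close() on the
JDBC objects explicitly rather than just nulling them and hoping the
garbage collector catches up. Something along these lines is what I have
in mind (only a rough sketch; the class name and query are made up):

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class JdbcCleanupSketch {
    // Illustrative only: run one query and always close the ResultSet and
    // Statement, even if the query throws, so the underlying socket/file
    // handle gets released straight away instead of waiting for GC.
    static void runQuery(Connection conn) throws SQLException {
        Statement stmt = null;
        ResultSet rs = null;
        try {
            stmt = conn.createStatement();
            rs = stmt.executeQuery("SELECT 1"); // made-up query
            while (rs.next()) {
                // ... process the row ...
            }
        } finally {
            if (rs != null)   try { rs.close(); }   catch (SQLException e) { /* ignore */ }
            if (stmt != null) try { stmt.close(); } catch (SQLException e) { /* ignore */ }
            // the Connection itself gets closed (or returned to the pool)
            // by the caller, in its own finally block
        }
    }
}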
Any thoughts about these 'file handle' things, and what setting I can
safely use in /proc/sys/fs/file-max?
Cheers muchly,
Dominic