This Bugzilla instance is a read-only archive of historic NetBeans bug reports. To report a bug in NetBeans please follow the project's instructions for reporting issues.
040719 and 040720. I have periodically gotten IOException subclasses (from various code) saying that I have too many open files. (Linux, 1.5.0 b58) Looking in the Linux memory map for the java process doesn't show any obviously incorrect open mapped files; I don't think this shows open file handles used for streams (and I am not sure how to trace those anyway). At first I thought this might be a problem with the newly integrated Ant 1.6.2, but I have also gotten this in IDE sessions where I have not run Ant at all; so I am guessing that the problem arose due to some unclosed streams used during parsing or scanning. I have also observed OOMEs or persistently full heap during routine Find Usages calls, which might be related.
Can't find any applicable recent changes in mdr, refactoring, or java modules...
OK, after running for a while with a patch to show file open/close events, doesn't seem to have to do with javacore scanning. But I don't see any reason for it either.
> I don't think this shows open file handles used for streams
> (and I am not sure how to trace those anyway).

cd /proc/$PID/fd; you should find all open file descriptors for the given process there. Although the java process spans several OS PIDs, all of them should report the same set of fd's in the proc fs.
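The same /proc inspection can be done programmatically. A minimal Java sketch (the class name FdCount is my own, not anything from NetBeans) that counts a process's open descriptors on Linux, returning -1 where /proc is not available:

```java
import java.io.File;

public class FdCount {
    // Count open file descriptors of a process by listing /proc/<pid>/fd
    // (Linux only). "self" refers to the current process.
    static int countOpenFds(String pid) {
        File fdDir = new File("/proc/" + pid + "/fd");
        String[] entries = fdDir.list();
        return entries == null ? -1 : entries.length; // -1: /proc unavailable
    }

    public static void main(String[] args) {
        int n = countOpenFds(args.length > 0 ? args[0] : "self");
        System.out.println(n < 0 ? "/proc not available" : "open fds: " + n);
    }
}
```

Watching this number grow across IDE operations would show whether descriptors are being leaked rather than merely used.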
Created attachment 16395 [details] GZip'd log of session when I ran out of handles; attempted logging to find file open/close events
Has not happened to me again. If it does, will check /proc. By the way Martin: I did check /proc for a process that had not run out of file handles (yet?) and I saw that nearly all file handles were used by mdrstorage random access files. Maybe these could be consolidated into a smaller (bounded) number of files? I think it is very unwise to keep open a potentially unlimited number of RandomAccessFiles (proportional to the number of source trees you have encountered, 3x I think).
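The bounded-handle idea suggested above could be sketched as an LRU map that closes the least-recently-used RandomAccessFile once a cap is exceeded. All names here are hypothetical illustrations, not the actual mdrstorage code:

```java
import java.io.File;
import java.io.IOException;
import java.io.RandomAccessFile;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: at most 'max' RandomAccessFiles are kept open;
// the least-recently-used handle is closed when the cap is exceeded.
public class BoundedRafCache {
    private final int max;
    private final LinkedHashMap<File, RandomAccessFile> cache;

    public BoundedRafCache(int max) {
        this.max = max;
        // accessOrder = true makes iteration order least-recently-used first
        this.cache = new LinkedHashMap<File, RandomAccessFile>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<File, RandomAccessFile> eldest) {
                if (size() > BoundedRafCache.this.max) {
                    try {
                        eldest.getValue().close(); // release the fd before eviction
                    } catch (IOException ignored) {
                    }
                    return true;
                }
                return false;
            }
        };
    }

    // Return a cached handle, opening (and possibly evicting) as needed.
    public synchronized RandomAccessFile get(File f) throws IOException {
        RandomAccessFile raf = cache.get(f);
        if (raf == null) {
            raf = new RandomAccessFile(f, "rw");
            cache.put(f, raf);
        }
        return raf;
    }

    public synchronized int openCount() {
        return cache.size();
    }
}
```

The trade-off is reopening an evicted file on the next access, which costs a syscall but keeps descriptor usage constant regardless of how many source trees have been scanned.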
Saw this again (040724). Will attach contents of /proc. 237 out of 271 file descriptors are in mdrstorage.
Created attachment 16464 [details] List of open files
Seems to me that this is too much. I am not sure why sometimes I run out of file handles and sometimes I don't.
No particular reason to believe file handles are "leaking" as such.
Jesse, your list of opened files shows that the following files are open:

/space/src/jdk142src/j2se/src/share/classes/java/awt/AWTEvent.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Color.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Color.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/ComponentOrientation.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Container.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Container.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Container.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/event/KeyEvent.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/Font.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/font/TextLayout.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/geom/AffineTransform.java
/space/src/jdk142src/j2se/src/share/classes/java/awt/geom/Rectangle2D.java
/space/src/jdk142src/j2se/src/share/classes/java/io/DataOutputStream.java
/space/src/jdk142src/j2se/src/share/classes/java/lang/Package.java
/space/src/jdk142src/j2se/src/share/classes/javax/swing/JRootPane.java

Did you have these files opened in the Editor? Thanks
No, I did not. file:/space/src/jdk142src/j2se/src/share/classes/ is registered as the sources for the Java platform in file:/space/jdk1.4.2_04/ however. I was almost surely not doing anything particularly involving those classes; for example AffineTransform is not a class I have ever (explicitly) used in my life.
271 is not that much. Please check your ulimit (note that the fdesc limit is shared between all of your processes, so it may happen that some other process is eating the rest of your fdescs). OTOH my default ulimit is 1024, which means at most 3 instances of NB and nothing else. Open JDK sources? Maybe the parser is not closing them properly, so they are only closed, at random times, by the stream finalizers...
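The leak pattern suspected here, a stream left for the finalizer to close, versus the deterministic fix, might look like the following sketch (hypothetical code, not the actual parser):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

// Hypothetical illustration of the suspected bug: a parser that opens a
// stream and never closes it holds the descriptor until GC happens to
// finalize the stream, which may be arbitrarily late.
public class ParserExample {
    // Leaky version: the fd stays open until the finalizer runs.
    static int countBytesLeaky(File f) throws IOException {
        FileInputStream in = new FileInputStream(f);
        int n = 0;
        while (in.read() != -1) n++;
        return n; // 'in' is never closed here
    }

    // Correct version: the descriptor is released deterministically.
    static int countBytes(File f) throws IOException {
        FileInputStream in = new FileInputStream(f);
        try {
            int n = 0;
            while (in.read() != -1) n++;
            return n;
        } finally {
            in.close();
        }
    }
}
```

This would also explain why the failure is intermittent: a full GC sometimes reclaims enough descriptors before the limit is hit.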
Typing "ulimit" in a shell reports "unlimited". "ulimit -n" reports "1024". Currently (NB running) I have 891 entries in /proc/*/fd/*, incl. many sockets and pipes. Perhaps adding another 150 or so file descriptors from NB while parsing is putting it over the limit?
Closing the storage streams while they are in use would be problematic. Anyway, I think that having 96 classpath elements open (like Jesse) is not a very typical use case; in addition, I believe the limit is per process. I tested this on my machine (Fedora 1). My ulimit says the same thing as Jesse's (1024), so I ran 4 instances of NetBeans with almost all NetBeans sources open in each of them. Counting all fd's using "find /proc | fgrep '/fd/' | wc -l" gave me 21848. Then I dereferenced all the links and removed duplicates: "echo /proc/*/fd/* | xargs ls -l | awk '{ print $11 }' | sort | uniq | wc -l" returned 1426, still way over the limit. So I think the limit is per process, and thus 200 fd's should still not be a huge problem unless there is a leak.
Re. limit being per-process: perhaps so, I have no idea. Re. having 96 classpath elements open not being a typical use case: why not?? I have about ten NBM projects open. That's it. We should be able to comfortably support having all NB modules open at once, IMHO. So still no real progress in finding the cause of this, I guess. Any other reports of people getting "too many open files" errors? It continues to happen to me from time to time; not reproducibly.
Haven't seen this error recently, FWIW.
Closing as fixed. It was caused by issue 48858 and issue 48466.
Created attachment 22234 [details] Happened to me again opening nbbuild/misc using a dev build... so not fixed after all? Note the complete corruption of everything after the exception is thrown.