Posted to commits@netbeans.apache.org by "Shevek (JIRA)" <ji...@apache.org> on 2019/05/27 19:42:00 UTC
[jira] [Created] (NETBEANS-2606) NetBeans overallocating FileObject(s), jams up GC, hangs
Shevek created NETBEANS-2606:
--------------------------------
Summary: NetBeans overallocating FileObject(s), jams up GC, hangs
Key: NETBEANS-2606
URL: https://issues.apache.org/jira/browse/NETBEANS-2606
Project: NetBeans
Issue Type: Bug
Affects Versions: 11.0
Reporter: Shevek
Attachments: histo.txt, temp.txt
Ubuntu 19.04
JDK 1.8
JVM stuck in GC ergonomics, caused by too many FileObjects in memory.
10 threads sitting at 99% CPU, presumably mostly GC threads.
GUI not responding.
Note that this IDE JVM runs with a 2.5 GB heap and normally uses about 1.5 GB of it, so something has gone seriously wrong.
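For reference, a class histogram like the one below can be captured with the JDK's jmap tool. A minimal sketch, assuming a JDK with jmap on the PATH (the class name HistoDemo is just for illustration); it runs jmap against the current JVM and prints the top of the histogram:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class HistoDemo {
    public static void main(String[] args) throws Exception {
        // Pid of this JVM (Java 9+); for the IDE you would use its pid instead.
        long pid = ProcessHandle.current().pid();
        // -histo:live forces a full GC first, so only reachable objects are counted.
        Process p = new ProcessBuilder("jmap", "-histo:live", String.valueOf(pid))
                .redirectErrorStream(true)
                .start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            // Print just the header and the top entries.
            r.lines().limit(10).forEach(System.out::println);
        }
        p.waitFor();
    }
}
```

The histogram attached to this report was presumably captured the same way against the hung IDE process.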
num #instances #bytes class name
----------------------------------------------
1: 8839529 424297392 org.netbeans.modules.masterfs.filebasedfs.fileobjects.BaseFileObj$FileEventImpl
2: 12383724 297209376 java.util.concurrent.ConcurrentLinkedQueue$Node
3: 5520360 220814400 org.openide.util.WeakListenerImpl$ListenerReference
4: 7963905 191133720 org.openide.filesystems.FCLSupport$DispatchEventWrapperSingle
5: 5608373 179467936 java.lang.ref.WeakReference
6: 5404854 172955328 org.openide.util.WeakListenerImpl$ProxyListener
7: 123474 111203424 [B
8: 4419767 106074408 org.openide.filesystems.FCLSupport$DispatchEventWrapperMulti
9: 5378746 86059936 com.sun.proxy.$Proxy1
10: 1929065 77162600 org.netbeans.modules.masterfs.filebasedfs.fileobjects.FileObj
11: 1992178 74673680 [Ljava.lang.Object;
12: 576632 74309864 [C
13: 1403506 33684144 org.openide.filesystems.EventControl$AtomicActionLink
14: 1940678 31050848 javax.swing.event.EventListenerList
15: 1403503 22456048 org.netbeans.modules.masterfs.filebasedfs.fileobjects.FileObjectFactory$AsyncRefreshAtomicAction
At the point where there are over 21 million queued event objects (the FileEventImpl instances plus their ConcurrentLinkedQueue nodes above), something has gone VERY WRONG; see NETBEANS-2291. This particular project contains only about 200,000 files, so where do tens of millions of file-refresh events come from? Backpressure on the queue, anyone?
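To illustrate the backpressure point: ConcurrentLinkedQueue (the second-largest entry in the histogram) is unbounded, so a fast producer can enqueue events without limit while a slow consumer falls behind. A minimal sketch, not NetBeans code, contrasting it with a bounded queue that rejects offers once full (a real fix might instead block the producer or coalesce redundant refresh events):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class BackpressureDemo {
    public static void main(String[] args) {
        // Unbounded: every offer succeeds, so the queue grows until the heap fills.
        ConcurrentLinkedQueue<String> unbounded = new ConcurrentLinkedQueue<>();
        for (int i = 0; i < 1_000_000; i++) {
            unbounded.offer("event-" + i);
        }
        System.out.println("unbounded size: " + unbounded.size());

        // Bounded: offers past the capacity fail, capping memory use and
        // giving the producer a signal that the consumer cannot keep up.
        BlockingQueue<String> bounded = new LinkedBlockingQueue<>(10_000);
        int rejected = 0;
        for (int i = 0; i < 1_000_000; i++) {
            if (!bounded.offer("event-" + i)) {
                rejected++;
            }
        }
        System.out.println("bounded size: " + bounded.size() + ", rejected: " + rejected);
    }
}
```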
Log messages:
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@55b26cab
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread null
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@2768760d
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@63f25c86
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@34933167
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@75ba9419
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@1c699d1e
WARNING [org.netbeans.core.TimableEventQueue]: no snapshot taken
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread org.netbeans.modules.sampler.InternalSampler@12ae60e3
WARNING [org.netbeans.core.TimableEventQueue]: too much time in AWT thread null
jstack attached.
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)