Posted to notifications@skywalking.apache.org by GitBox <gi...@apache.org> on 2020/04/02 15:58:01 UTC

[GitHub] [skywalking] darcydai opened a new issue #4605: skywalking receiver prints lots of gRPC timeout logs

darcydai opened a new issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605
 
 
   Please answer these questions before submitting your issue.
   
   - Why do you submit this issue?
   - [X] Question or discussion
   - [ ] Bug
   - [ ] Requirement
   - [ ] Feature or performance improvement
   
   ___
   ### Question
   I am using SkyWalking in a production environment. There are 9 receivers and 9 aggregators in the SkyWalking cluster, and the Elasticsearch cluster has 12 nodes; each node has 24 CPUs and 30 GB of heap memory. For a long time, the keyword 'DEADLINE_EXCEEDED' has been appearing in the **receiver log** (note: the keyword appears on the receiver side and the class name is "GRPCRemoteClient"; the earlier issues about this keyword concern the agent log).
   
   My guess is that the aggregators are responding too slowly to the receivers. Can I solve this problem by scaling out the aggregators? Thanks so much!
   
   
   ___
   ### Requirement or improvement
   The log below is printed by `org.apache.skywalking.oap.server.core.remote.client.GRPCRemoteClient`:
   
   ```
   2020-04-02 23:21:27,986 - org.apache.skywalking.oap.server.core.remote.client.GRPCRemoteClient - 192 [grpc-default-executor-246] ERROR [] - DEADLINE_EXCEEDED: deadline exceeded after 19999965479ns
   io.grpc.StatusRuntimeException: DEADLINE_EXCEEDED: deadline exceeded after 19999965479ns
   	at io.grpc.Status.asRuntimeException(Status.java:526) ~[grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.stub.ClientCalls$StreamObserverToCallListenerAdapter.onClose(ClientCalls.java:434) [grpc-stub-1.15.1.jar:1.15.1]
   	at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.CensusStatsModule$StatsClientInterceptor$1$1.onClose(CensusStatsModule.java:678) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.PartialForwardingClientCallListener.onClose(PartialForwardingClientCallListener.java:39) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.ForwardingClientCallListener.onClose(ForwardingClientCallListener.java:23) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.ForwardingClientCallListener$SimpleForwardingClientCallListener.onClose(ForwardingClientCallListener.java:40) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.CensusTracingModule$TracingClientInterceptor$1$1.onClose(CensusTracingModule.java:403) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ClientCallImpl.closeObserver(ClientCallImpl.java:459) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ClientCallImpl.access$300(ClientCallImpl.java:63) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.close(ClientCallImpl.java:546) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl.access$600(ClientCallImpl.java:467) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ClientCallImpl$ClientStreamListenerImpl$1StreamClosed.runInContext(ClientCallImpl.java:584) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37) [grpc-core-1.15.1.jar:1.15.1]
   	at io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123) [grpc-core-1.15.1.jar:1.15.1]
   	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_101]
   	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_101]
   	at java.lang.Thread.run(Thread.java:745) [?:1.8.0_101]
   ```
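   
   The roughly 20,000,000,000 ns deadline in the log above corresponds to a 20-second gRPC deadline on the receiver-to-aggregator call. As a minimal sketch, assuming an OAP release whose core module exposes the `remoteTimeout` option (the option name and default may differ across SkyWalking versions), the deadline could be raised in `config/application.yml`:
   
   ```
   core:
     selector: ${SW_CORE:default}
     default:
       # Timeout, in seconds, for internal gRPC calls between OAP nodes
       # (receiver -> aggregator). The 20s default matches the
       # DEADLINE_EXCEEDED value in the log above.
       remoteTimeout: ${SW_CORE_REMOTE_TIMEOUT:20}
   ```
   
   Raising the timeout only buys headroom; if the aggregators or the Elasticsearch cluster cannot keep up, the backlog simply grows until the larger deadline is exceeded as well.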
   


[GitHub] [skywalking] darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605#issuecomment-607937190
 
 
   At the beginning, some Elasticsearch reject exceptions appeared in my cluster log. I expanded the Elasticsearch cluster and adjusted some parameters, and after that the reject exceptions no longer appeared.
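   
   For reference, rejections like these usually surface as `es_rejected_execution_exception` from the indexing thread pool. A quick check, assuming Elasticsearch 6.x or newer where bulk indexing goes through the `write` thread pool (5.x clusters use the `bulk` pool instead), reads the per-node rejection counters via the _cat API:
   
   ```
   # Per-node queue depth and rejected task count for the write thread pool.
   # A growing "rejected" column means the cluster cannot keep up with the
   # OAP bulk inserts. Replace <es-host> with any node of the cluster.
   curl -s 'http://<es-host>:9200/_cat/thread_pool/write?v&h=node_name,active,queue,rejected,completed'
   ```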
   
   


[GitHub] [skywalking] darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605#issuecomment-608192619
 
 
   OK, thanks so much.


[GitHub] [skywalking] darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
darcydai commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605#issuecomment-607949789
 
 
   Sorry, I do not understand what observability means here. Do you mean setting up telemetry and then using Prometheus to collect the metrics?
   


[GitHub] [skywalking] darcydai closed issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
darcydai closed issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605
 
 
   


[GitHub] [skywalking] wu-sheng commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
wu-sheng commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605#issuecomment-607939471
 
 
   Basically, your thoughts are right. The aggregators do not have enough capacity. This could be caused by Elasticsearch flush performance or OAP overload.
   
   Did you set up observability for the OAP servers? You could see the Elasticsearch flush time there.
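   
   As a minimal sketch of that setup, assuming a SkyWalking release where the telemetry provider is chosen via a `selector` in `config/application.yml` (the exact keys vary between 6.x and 7.x), the OAP can expose its own metrics, including persistence/flush timing, on an endpoint that Prometheus scrapes:
   
   ```
   telemetry:
     selector: ${SW_TELEMETRY:prometheus}
     prometheus:
       # The OAP publishes its self-observability metrics (JVM, persistence
       # latency, gRPC load) here for Prometheus to scrape.
       host: ${SW_TELEMETRY_PROMETHEUS_HOST:0.0.0.0}
       port: ${SW_TELEMETRY_PROMETHEUS_PORT:1234}
   ```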


[GitHub] [skywalking] wu-sheng commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs

Posted by GitBox <gi...@apache.org>.
wu-sheng commented on issue #4605: skywalking receiver prints lots of gRPC timeout logs
URL: https://github.com/apache/skywalking/issues/4605#issuecomment-608135477
 
 
   Yes.
