Posted to issues@spark.apache.org by "Mullaivendhan Ariaputhri (Jira)" <ji...@apache.org> on 2020/01/30 07:24:00 UTC

[jira] [Updated] (SPARK-30675) Spark Streaming Job stopped reading events from Queue upon Deregister Exception

     [ https://issues.apache.org/jira/browse/SPARK-30675?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Mullaivendhan Ariaputhri updated SPARK-30675:
---------------------------------------------
    Description: 
*+Spark/EMR+*

From the driver logs, it has been found that the driver deregistered the receiver for the stream.

*# Driver Logs*

2020-01-03 11:11:40 ERROR ReceiverTracker:70 - *{color:#de350b}Deregistered receiver for stream 0: Error while storing block into Spark - java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]{color}*

        at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)

        at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)

        at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)

        at org.apache.spark.streaming.receiver.{color:#de350b}*WriteAheadLogBasedBlockHandler.storeBlock*{color}(ReceivedBlockHandler.scala:210)

        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushAndReportBlock(ReceiverSupervisorImpl.scala:158)

        at org.apache.spark.streaming.receiver.ReceiverSupervisorImpl.pushArrayBuffer(ReceiverSupervisorImpl.scala:129)

        at org.apache.spark.streaming.receiver.Receiver.store(Receiver.scala:133)

        at org.apache.spark.streaming.kinesis.KinesisReceiver.org$apache$spark$streaming$kinesis$KinesisReceiver$$storeBlockWithRanges(KinesisReceiver.scala:293)

        at org.apache.spark.streaming.kinesis.KinesisReceiver$GeneratedBlockHandler.onPushBlock(KinesisReceiver.scala:344)

        at org.apache.spark.streaming.receiver.BlockGenerator.pushBlock(BlockGenerator.scala:297)

        at org.apache.spark.streaming.receiver.BlockGenerator.org$apache$spark$streaming$receiver$BlockGenerator$$keepPushingBlocks(BlockGenerator.scala:269)

        at org.apache.spark.streaming.receiver.BlockGenerator$$anon$1.run(BlockGenerator.scala:110)

        ...
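The `TimeoutException` in the trace above is raised while the receiver awaits the block-store (write-ahead-log) future. As a minimal illustration of the mechanism only (plain `java.util.concurrent`, no Spark; the never-completed future and the 100 ms bound are stand-ins for the stuck WAL write and the 30-second default):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

// Waiting on a future that never completes throws the same
// java.util.concurrent.TimeoutException that the receiver surfaces as
// "Futures timed out after [30 seconds]".
public class BlockStoreTimeoutDemo {
    // Returns true when the wait times out; the never-completed future
    // stands in for a stuck write-ahead-log write.
    static boolean timesOutWithin(long millis) {
        CompletableFuture<Void> stuckWrite = new CompletableFuture<>();
        try {
            stuckWrite.get(millis, TimeUnit.MILLISECONDS);
            return false;
        } catch (TimeoutException e) {
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("timedOut=" + timesOutWithin(100L));
    }
}
```

Once this exception propagates out of `storeBlock`, the supervisor stops the receiver, which matches the executor-side shutdown shown below.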

*Up to this point, no receiver had been started or registered again. From the executor logs (below), it has been observed that one of the executors was still running on the container.*

 

*# Executor Logs*

2020-01-03 11:11:30 INFO  BlockManager:54 - Removing RDD 2851002

2020-01-03 11:11:31 INFO  ReceiverSupervisorImpl:54 - {color:#de350b}*Stopping receiver with message: Error while storing block into Spark: java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]*{color}

2020-01-03 11:11:31 INFO  Worker:593 - Worker shutdown requested.

2020-01-03 11:11:31 INFO  LeaseCoordinator:298 - Worker ip-10-61-71-29.ap-southeast-2.compute.internal:a7567f14-16be-4aca-8f64-401b0b29aea2 has successfully stopped lease-tracking threads

2020-01-03 11:11:31 INFO  KinesisRecordProcessor:54 - Shutdown:  Shutting down workerId ip-10-61-71-29.ap-southeast-2.compute.internal:a7567f14-16be-4aca-8f64-401b0b29aea2 with reason ZOMBIE

2020-01-03 11:11:32 INFO  MemoryStore:54 - Block input-0-1575374565339 stored as bytes in memory (estimated size 7.3 KB, free 3.4 GB)

2020-01-03 11:11:33 INFO  Worker:634 - All record processors have been shut down successfully.
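The 30-second bound in these messages appears to be the receiver block-store timeout read in `ReceivedBlockHandler` (default 30, interpreted as seconds). As a hedged workaround sketch, assuming that key behaves this way in Spark 2.4.x, it could be raised when WAL writes are slow:

```
# spark-defaults.conf (or pass via --conf on spark-submit)
# Assumed key, read in ReceivedBlockHandler; value is in seconds
spark.streaming.receiver.blockStoreTimeout  60
```

This only widens the window for the write-ahead-log write to complete; it does not restart a receiver that has already been deregistered.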

> Spark Streaming Job stopped reading events from Queue upon Deregister Exception
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-30675
>                 URL: https://issues.apache.org/jira/browse/SPARK-30675
>             Project: Spark
>          Issue Type: Bug
>          Components: Block Manager, Spark Submit, Structured Streaming
>    Affects Versions: 2.4.3
>            Reporter: Mullaivendhan Ariaputhri
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
