Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:04:29 UTC

[jira] [Updated] (SPARK-24223) ExternalShuffleService loses registrations

     [ https://issues.apache.org/jira/browse/SPARK-24223?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-24223:
---------------------------------
    Labels: bulk-closed  (was: )

> ExternalShuffleService loses registrations
> ------------------------------------------
>
>                 Key: SPARK-24223
>                 URL: https://issues.apache.org/jira/browse/SPARK-24223
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos, Shuffle
>    Affects Versions: 2.1.0
>            Reporter: Igor Berman
>            Priority: Major
>              Labels: bulk-closed
>
> Hi, we have long-running services that use dynamic allocation on Mesos (1.1.0).
> These services run periodic Spark batch jobs with an open Spark context.
> However, there are sporadic failures caused by the error below.
> It seems that one executor disconnects (?) from some external shuffle service, and this fails everything; there is no way to recover or re-register the executor afterwards. Usually restarting the service helps (i.e. the external shuffle services are up and running on all nodes).
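> For reference, the setup described above is typically enabled with a configuration along these lines (a minimal sketch with default values; the exact timeouts and port in this deployment are assumptions, not taken from the report):
> {code}
> # Dynamic allocation requires the external shuffle service so that
> # shuffle files outlive the executors that wrote them.
> spark.dynamicAllocation.enabled        true
> spark.shuffle.service.enabled          true
> # Must match the port of the shuffle service launched on each Mesos agent
> # (7337 is the Spark default).
> spark.shuffle.service.port             7337
> spark.dynamicAllocation.executorIdleTimeout  60s
> {code}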
>  
> This might be related to https://issues.apache.org/jira/browse/SPARK-9439, but for the Mesos integration.
> Any ideas are welcome.
>  
> {code:java}
> org.apache.spark.SparkException: Job aborted due to stage failure: ShuffleMapStage 3820 () has failed the maximum allowable number of times: 4. Most recent failure reason: org.apache.spark.shuffle.FetchFailedException: Failure while fetching StreamChunkId{streamId=537714277840, chunkIndex=0}: java.lang.RuntimeException: Executor is not registered (appId=d265bc5e-cb4a-4f3e-88a2-aa79ca3a05fc-0067, execId=2943) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.getBlockData(ExternalShuffleBlockResolver.java:171) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler$1.next(ExternalShuffleBlockHandler.java:105) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler$1.next(ExternalShuffleBlockHandler.java:95) 
> at org.apache.spark.network.server.OneForOneStreamManager.getChunk(OneForOneStreamManager.java:89) 
> at org.apache.spark.network.server.TransportRequestHandler.processFetchRequest(TransportRequestHandler.java:125) 
> at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103) 
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) 
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) 
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) 
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) 
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
> at java.lang.Thread.run(Thread.java:748) 
> at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:442) 
> at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:418) 
> at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:59) 
> at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:434) 
> at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:440) 
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408) 
> at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:32) 
> at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37) 
> at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408) 
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.agg_doAggregateWithKeys$(Unknown Source) 
> at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source) 
> at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43) 
> at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$8$$anon$1.hasNext(WholeStageCodegenExec.scala:395) 
> at org.apache.spark.sql.execution.columnar.InMemoryRelation$$anonfun$1$$anon$1.hasNext(InMemoryRelation.scala:133) 
> at org.apache.spark.storage.memory.MemoryStore.putIteratorAsValues(MemoryStore.scala:215) 
> at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1038) 
> at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1029) 
> at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:969) 
> at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1029) 
> at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:760) 
> at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) 
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) 
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) 
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) 
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) 
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) 
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) 
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) 
> at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) 
> at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) 
> at org.apache.spark.rdd.RDD.iterator(RDD.scala:287) 
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:96)
> at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:53)
> at org.apache.spark.scheduler.Task.run(Task.scala:108) 
> at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:335) 
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
> at java.lang.Thread.run(Thread.java:748) 
> Caused by: org.apache.spark.network.client.ChunkFetchFailureException: Failure while fetching StreamChunkId{streamId=537714277840, chunkIndex=0}: java.lang.RuntimeException: Executor is not registered (appId=d265bc5e-cb4a-4f3e-88a2-aa79ca3a05fc-0067, execId=2943) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockResolver.getBlockData(ExternalShuffleBlockResolver.java:171) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler$1.next(ExternalShuffleBlockHandler.java:105) 
> at org.apache.spark.network.shuffle.ExternalShuffleBlockHandler$1.next(ExternalShuffleBlockHandler.java:95) 
> at org.apache.spark.network.server.OneForOneStreamManager.getChunk(OneForOneStreamManager.java:89) 
> at org.apache.spark.network.server.TransportRequestHandler.processFetchRequest(TransportRequestHandler.java:125) 
> at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:103) 
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) 
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) 
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) 
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) 
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) 
> at java.lang.Thread.run(Thread.java:748) 
> at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:182)
> at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:120) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:336) 
> at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:357) 
> at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:343) 
> at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911) 
> at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:643) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:566) 
> at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:480) 
> at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:442) 
> at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131) 
> at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144) ... 1 more{code}
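
[Editor's note] For readers triaging this trace: the "Executor is not registered" failure at ExternalShuffleBlockResolver.getBlockData is a lookup miss in the shuffle service's in-memory registration map, keyed by (appId, execId). The sketch below is illustrative only, not the actual Spark source; the class shape and field names are assumptions, and only the error message mirrors the log above. It shows why a shuffle service that lost (or never received) a registration cannot serve fetches for that executor:

```java
import java.util.Map;
import java.util.Objects;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative sketch (NOT Spark source) of the registration lookup
// behind the "Executor is not registered" error in the trace above.
class ShuffleRegistrySketch {
    // Composite key identifying one executor's registration.
    static final class AppExecId {
        final String appId;
        final String execId;
        AppExecId(String appId, String execId) {
            this.appId = appId;
            this.execId = execId;
        }
        @Override public boolean equals(Object o) {
            if (!(o instanceof AppExecId)) return false;
            AppExecId other = (AppExecId) o;
            return appId.equals(other.appId) && execId.equals(other.execId);
        }
        @Override public int hashCode() { return Objects.hash(appId, execId); }
    }

    // In-memory only: if the service restarts or drops an entry and the
    // executor never re-registers, every later fetch for it fails.
    private final Map<AppExecId, String> executors = new ConcurrentHashMap<>();

    void registerExecutor(String appId, String execId, String shuffleInfo) {
        executors.put(new AppExecId(appId, execId), shuffleInfo);
    }

    String getBlockData(String appId, String execId) {
        String info = executors.get(new AppExecId(appId, execId));
        if (info == null) {
            // Same failure mode as the log: a fetch for an unknown key.
            throw new RuntimeException(
                "Executor is not registered (appId=" + appId + ", execId=" + execId + ")");
        }
        return info;
    }

    public static void main(String[] args) {
        ShuffleRegistrySketch reg = new ShuffleRegistrySketch();
        reg.registerExecutor("app-1", "1", "shuffle-info");
        System.out.println(reg.getBlockData("app-1", "1"));
        // A fetch for an unregistered executor throws, as in the trace above:
        reg.getBlockData("app-1", "2943");
    }
}
```

The point of the sketch is that registration state lives only in the serving process, so recovery requires the executor side to re-register; the fetch path alone cannot repair it.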



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org