Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2022/01/09 15:09:00 UTC
[jira] [Commented] (SPARK-37847) PushBlockStreamCallback should check isTooLate first to avoid NPE
[ https://issues.apache.org/jira/browse/SPARK-37847?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17471371#comment-17471371 ]
Apache Spark commented on SPARK-37847:
--------------------------------------
User 'pan3793' has created a pull request for this issue:
https://github.com/apache/spark/pull/35146
> PushBlockStreamCallback should check isTooLate first to avoid NPE
> -----------------------------------------------------------------
>
> Key: SPARK-37847
> URL: https://issues.apache.org/jira/browse/SPARK-37847
> Project: Spark
> Issue Type: Sub-task
> Components: Shuffle, Spark Core
> Affects Versions: 3.2.1, 3.3.0
> Reporter: Cheng Pan
> Priority: Major
>
> {code:java}
> 2022-01-07 21:06:14,464 INFO shuffle.RemoteBlockPushResolver: shuffle partition application_1640143179334_0149_-1 102 6922, chunk_size=1, meta_length=138, data_length=112632
> 2022-01-07 21:06:14,615 ERROR shuffle.RemoteBlockPushResolver: Encountered issue when merging shufflePush_102_0_279_6922
> java.lang.NullPointerException
> at org.apache.spark.network.shuffle.RemoteBlockPushResolver$AppShuffleMergePartitionsInfo.access$200(RemoteBlockPushResolver.java:1017)
> at org.apache.spark.network.shuffle.RemoteBlockPushResolver$PushBlockStreamCallback.isStale(RemoteBlockPushResolver.java:806)
> at org.apache.spark.network.shuffle.RemoteBlockPushResolver$PushBlockStreamCallback.onData(RemoteBlockPushResolver.java:840)
> at org.apache.spark.network.server.TransportRequestHandler$3.onData(TransportRequestHandler.java:209)
> at org.apache.spark.network.client.StreamInterceptor.handle(StreamInterceptor.java:79)
> at org.apache.spark.network.util.TransportFrameDecoder.feedInterceptor(TransportFrameDecoder.java:263)
> at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:87)
> at org.sparkproject.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at org.sparkproject.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at org.sparkproject.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
> at org.sparkproject.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
> at org.sparkproject.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
> at org.sparkproject.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
> at org.sparkproject.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
> at org.sparkproject.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
> at org.sparkproject.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
> at org.sparkproject.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
> at org.sparkproject.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
> at org.sparkproject.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
> at org.sparkproject.io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
> at org.sparkproject.io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
> at org.sparkproject.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
> at java.lang.Thread.run(Thread.java:748)
> {code}
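> The fix the title describes is an ordering change: isTooLate() can be answered without touching the merge-partitions info (which is already null once the shuffle merge has been finalized), while isStale() dereferences that info and is what throws the NullPointerException in the trace above. Below is a minimal, self-contained sketch of that ordering; the method names mirror the stack trace, but the fields and bodies are simplified assumptions, not the actual RemoteBlockPushResolver implementation.
> {code:java}
> // Hedged sketch: isTooLate() must short-circuit before isStale() dereferences
> // the (possibly null) merge-partitions info. Names follow the stack trace;
> // the data model here is a simplified stand-in.
> public class PushOrderSketch {
>
>   /** Simplified stand-in for AppShuffleMergePartitionsInfo. */
>   static final class MergePartitionsInfo {
>     final int shuffleMergeId;
>     MergePartitionsInfo(int shuffleMergeId) { this.shuffleMergeId = shuffleMergeId; }
>   }
>
>   // Assumption for this sketch: becomes null once the merge is finalized.
>   private final MergePartitionsInfo info;
>   private final int expectedShuffleMergeId;
>
>   PushOrderSketch(MergePartitionsInfo info, int expectedShuffleMergeId) {
>     this.info = info;
>     this.expectedShuffleMergeId = expectedShuffleMergeId;
>   }
>
>   /** The pushed block arrived after finalization; the info is already gone. */
>   private boolean isTooLate() {
>     return info == null;
>   }
>
>   /** Dereferences info, so calling it while info is null throws the NPE. */
>   private boolean isStale() {
>     return info.shuffleMergeId != expectedShuffleMergeId;
>   }
>
>   /** Fixed ordering: isTooLate() is checked first, so isStale() never NPEs. */
>   boolean shouldIgnorePushedData() {
>     return isTooLate() || isStale();
>   }
>
>   public static void main(String[] args) {
>     // A push arriving after finalization (info == null) is dropped without
>     // dereferencing info, instead of throwing NullPointerException.
>     PushOrderSketch lateBlock = new PushOrderSketch(null, 1);
>     System.out.println("ignore late block: " + lateBlock.shouldIgnorePushedData());
>
>     // A push carrying an outdated shuffleMergeId is recognized as stale.
>     PushOrderSketch staleBlock = new PushOrderSketch(new MergePartitionsInfo(2), 1);
>     System.out.println("ignore stale block: " + staleBlock.shouldIgnorePushedData());
>   }
> }
> {code}
> With this ordering, a block pushed after the merge is finalized is silently dropped by the isTooLate() check before isStale() ever runs, which is where the NullPointerException in the log above originates.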
--
This message was sent by Atlassian Jira
(v8.20.1#820001)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org