Posted to issues@spark.apache.org by "Emil Ejbyfeldt (Jira)" <ji...@apache.org> on 2023/04/16 13:15:00 UTC

[jira] [Commented] (SPARK-43138) ClassNotFoundException during RDD block replication/migration

    [ https://issues.apache.org/jira/browse/SPARK-43138?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17712781#comment-17712781 ] 

Emil Ejbyfeldt commented on SPARK-43138:
----------------------------------------

No. The class `com.class.from.user.jar.ClassName` is just a `case class` that is used inside an RDD. 
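
For illustration, a minimal sketch of the kind of class involved; the name and fields below are hypothetical stand-ins for the redacted `com.class.from.user.jar.ClassName`:

```scala
// Hypothetical stand-in for the redacted user-jar class: a plain case class
// (Serializable by default in Scala) used as the element type of an RDD.
case class UserRecord(id: Long, payload: String)
```

Nothing about the class itself is special; the point is only that it lives in the user jar, so the application class loader shown in the stack trace below (`jdk.internal.loader.ClassLoaders$AppClassLoader`) cannot resolve it.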

But I think I have a good idea of what is causing it, so I created this PR, which I hope solves it: https://github.com/apache/spark/pull/40808
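
For context, here is a minimal, hypothetical sketch of the kind of job that exercises this path. The class, app, and value names are invented, and the decommission configuration keys are assumed to be the standard Spark 3.1+ flags; the replicated `_2` storage level is what makes the block manager upload RDD blocks (and their metadata) to a peer executor, which is where the stack trace quoted below fails in `NettyBlockRpcServer.deserializeMetadata`:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.storage.StorageLevel

// Same hypothetical stand-in as above for the redacted user-jar class.
case class UserRecord(id: Long, payload: String)

object ReplicationRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("rdd-block-replication-repro")
      // Decommission-driven RDD block migration, the other path named in the
      // issue title; treat these flag names as assumptions.
      .config("spark.decommission.enabled", "true")
      .config("spark.storage.decommission.enabled", "true")
      .config("spark.storage.decommission.rddBlocks.enabled", "true")
      .getOrCreate()
    val sc = spark.sparkContext

    // A replicated storage level forces the block manager to upload the
    // serialized block, including metadata referencing UserRecord, to a peer.
    val rdd = sc.parallelize(1L to 100000L).map(i => UserRecord(i, s"payload-$i"))
    rdd.persist(StorageLevel.MEMORY_ONLY_2)
    rdd.count()

    spark.stop()
  }
}
```

Submitting it with the case class packaged in the application jar (e.g. `spark-submit --class ReplicationRepro app.jar`) keeps the class off the executors' system classpath, matching the situation described in the issue below.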

> ClassNotFoundException during RDD block replication/migration
> -------------------------------------------------------------
>
>                 Key: SPARK-43138
>                 URL: https://issues.apache.org/jira/browse/SPARK-43138
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.3.2, 3.4.0, 3.5.0
>            Reporter: Emil Ejbyfeldt
>            Priority: Major
>
> During RDD block migration while an executor is decommissioning, we are seeing `ClassNotFoundException` on the receiving executor. This seems to happen when the blocks contain classes from the user jars.
> ```
> 2023-04-08 04:15:11,791 ERROR server.TransportRequestHandler: Error while invoking RpcHandler#receive() on RPC id 6425687122551756860
> java.lang.ClassNotFoundException: com.class.from.user.jar.ClassName
>     at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:581)
>     at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:178)
>     at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
>     at java.base/java.lang.Class.forName0(Native Method)
>     at java.base/java.lang.Class.forName(Class.java:398)
>     at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:71)
>     at java.base/java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2003)
>     at java.base/java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1870)
>     at java.base/java.io.ObjectInputStream.readClass(ObjectInputStream.java:1833)
>     at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1658)
>     at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2496)
>     at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2390)
>     at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2228)
>     at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
>     at java.base/java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2496)
>     at java.base/java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2390)
>     at java.base/java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2228)
>     at java.base/java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1687)
>     at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:489)
>     at java.base/java.io.ObjectInputStream.readObject(ObjectInputStream.java:447)
>     at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:87)
>     at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:123)
>     at org.apache.spark.network.netty.NettyBlockRpcServer.deserializeMetadata(NettyBlockRpcServer.scala:180)
>     at org.apache.spark.network.netty.NettyBlockRpcServer.receive(NettyBlockRpcServer.scala:119)
>     at org.apache.spark.network.server.TransportRequestHandler.processRpcRequest(TransportRequestHandler.java:163)
>     at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:109)
>     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:140)
>     at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:53)
>     at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:99)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:102)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
>     at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
>     at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
>     at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
>     at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658)
>     at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584)
>     at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496)
>     at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986)
>     at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
>     at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
>     at java.base/java.lang.Thread.run(Thread.java:829)
> ```


