Posted to reviews@spark.apache.org by klion26 <gi...@git.apache.org> on 2017/10/26 10:05:52 UTC

[GitHub] spark issue #9282: [SPARK-10986][Mesos] Set the context class loader in the ...

Github user klion26 commented on the issue:

    https://github.com/apache/spark/pull/9282
  
    I received a ClassNotFound error in Yarn-Cluster mode (Spark 1.6.2), but _I can't reproduce the problem_.
    The error message is as follows:
    
    ```
    [2017-10-26 16:53:18,274] ERROR Error while invoking RpcHandler#receive() for one-way message. (org.apache.spark.network.server.TransportRequestHandler)
    java.lang.ClassNotFoundException: org.apache.spark.rpc.RpcAddress
    	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    	at java.security.AccessController.doPrivileged(Native Method)
    	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    	at java.lang.Class.forName0(Native Method)
    	at java.lang.Class.forName(Class.java:274)
    	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
    	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
    	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
    	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
    	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
    	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
    	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
    	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
    	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:109)
    	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:267)
    	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:319)
    	at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:266)
    	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
    	at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:265)
    	at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:597)
    	at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:586)
    	at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:176)
    	at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:92)
    	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:104)
    	at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    	at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:86)
    	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    	at java.lang.Thread.run(Thread.java:745)
    ```
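    
    If I read the code path above correctly, `JavaSerializerInstance` falls back to `Thread.currentThread.getContextClassLoader` when no explicit loader has been set, so class resolution fails when the thread handling the message has a context class loader that can't see the Spark classes. Below is a minimal sketch of the general pattern this PR's title refers to (setting the thread context class loader around the work); it is only an illustration, not the actual change in this PR, and the names `ContextClassLoaderSketch` and `sparkClassLoader` are made up for the example:
    
    ```scala
    object ContextClassLoaderSketch {
      def main(args: Array[String]): Unit = {
        // Illustrative sketch only, not the actual diff in this PR.
        // `sparkClassLoader` stands in for whatever loader actually
        // contains the Spark (and user) classes.
        val sparkClassLoader: ClassLoader = getClass.getClassLoader
    
        val previous = Thread.currentThread().getContextClassLoader
        Thread.currentThread().setContextClassLoader(sparkClassLoader)
        try {
          // the work that deserializes RPC messages (and therefore resolves
          // classes through the context class loader) would run here
          val resolved = Class.forName(
            "java.lang.String", false, Thread.currentThread().getContextClassLoader)
          println(s"resolved ${resolved.getName} via the context class loader")
        } finally {
          // restore the previous loader so the change does not leak to other work
          Thread.currentThread().setContextClassLoader(previous)
        }
      }
    }
    ```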

