Posted to dev@spark.apache.org by Gary Malouf <ma...@gmail.com> on 2014/07/25 20:27:56 UTC

Kryo Issue on Spark 1.0.1, Mesos 0.18.2

After upgrading to Spark 1.0.1 from 0.9.1, everything seemed to be going
well.  Looking at the Mesos slave logs, I noticed:

ERROR KryoSerializer: Failed to run spark.kryo.registrator
java.lang.ClassNotFoundException:
com/mediacrossing/verrazano/kryo/MxDataRegistrator

My spark-env.sh has the following when I run the Spark Shell:

export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so

export MASTER=mesos://zk://n-01:2181,n-02:2181,n-03:2181/masters

export ADD_JARS=/opt/spark/mx-lib/verrazano-assembly.jar


# -XX:+UseCompressedOops must be disabled to use more than 32GB RAM

SPARK_JAVA_OPTS="-Xss2m -XX:+UseCompressedOops
-Dspark.local.dir=/opt/mesos-tmp -Dspark.executor.memory=4g
 -Dspark.serializer=org.apache.spark.serializer.KryoSerializer
-Dspark.kryo.registrator=com.mediacrossing.verrazano.kryo.MxDataRegistrator
-Dspark.kryoserializer.buffer.mb=16 -Dspark.akka.askTimeout=30"


I was able to verify that our custom jar was being copied to each worker,
but for some reason it is not finding my registrator class.  Is anyone else
struggling with Kryo on the 1.0.x branch?
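
For reference, the class named by spark.kryo.registrator is just an
implementation of org.apache.spark.serializer.KryoRegistrator. The Scala
sketch below is roughly what MxDataRegistrator looks like; MxData is a
hypothetical stand-in for the actual domain types packaged in
verrazano-assembly.jar.

import com.esotericsoftware.kryo.Kryo
import org.apache.spark.serializer.KryoRegistrator

// Hypothetical domain type; the real classes ship in verrazano-assembly.jar.
case class MxData(id: Long, payload: Array[Byte])

class MxDataRegistrator extends KryoRegistrator {
  // Spark calls this on each executor when the KryoSerializer is initialized.
  override def registerClasses(kryo: Kryo): Unit = {
    kryo.register(classOf[MxData])
  }
}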

Re: Kryo Issue on Spark 1.0.1, Mesos 0.18.2

Posted by Gary Malouf <ma...@gmail.com>.
Maybe this is me misunderstanding Spark's system property behavior, but
I'm not clear on why the class being loaded ends up with '/' rather than '.'
in its fully qualified name.  When I tested this out locally, the '/'
characters prevented the class from being loaded.
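
That behavior is easy to reproduce in a plain Scala REPL: Class.forName
expects the dot-separated binary name, while the slash-separated form is the
JVM's internal name and throws exactly this ClassNotFoundException
(java.util.Date below is just a stand-in class).

// Binary name with dots: resolves fine.
Class.forName("java.util.Date")

// Slash-separated internal name: throws java.lang.ClassNotFoundException,
// the same failure we see in the executor logs.
Class.forName("java/util/Date")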


On Fri, Jul 25, 2014 at 2:27 PM, Gary Malouf <ma...@gmail.com> wrote:

> After upgrading to Spark 1.0.1 from 0.9.1 everything seemed to be going
> well.  Looking at the Mesos slave logs, I noticed:
>
> ERROR KryoSerializer: Failed to run spark.kryo.registrator
> java.lang.ClassNotFoundException:
> com/mediacrossing/verrazano/kryo/MxDataRegistrator
>
> My spark-env.sh has the following when I run the Spark Shell:
>
> export MESOS_NATIVE_LIBRARY=/usr/local/lib/libmesos.so
>
> export MASTER=mesos://zk://n-01:2181,n-02:2181,n-03:2181/masters
>
> export ADD_JARS=/opt/spark/mx-lib/verrazano-assembly.jar
>
>
> # -XX:+UseCompressedOops must be disabled to use more than 32GB RAM
>
> SPARK_JAVA_OPTS="-Xss2m -XX:+UseCompressedOops
> -Dspark.local.dir=/opt/mesos-tmp -Dspark.executor.memory=4g
>  -Dspark.serializer=org.apache.spark.serializer.KryoSerializer
> -Dspark.kryo.registrator=com.mediacrossing.verrazano.kryo.MxDataRegistrator
> -Dspark.kryoserializer.buffer.mb=16 -Dspark.akka.askTimeout=30"
>
>
> I was able to verify that our custom jar was being copied to each worker,
> but for some reason it is not finding my registrator class.  Is anyone else
> struggling with Kryo on 1.0.x branch?
>
