Posted to user@spark.apache.org by Gurvinder Singh <gu...@uninett.no> on 2014/07/01 09:28:23 UTC

issue with running example code

Hi,

I am having an issue running the Scala example code. I have tested the
Python example code and can run it successfully, but when I run the Scala
code I get this error:

java.lang.ClassCastException: cannot assign instance of
org.apache.spark.examples.SparkPi$$anonfun$1 to field
org.apache.spark.rdd.MappedRDD.f of type scala.Function1 in instance of
org.apache.spark.rdd.MappedRDD
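
For context: SparkPi$$anonfun$1 is the compiled form of the anonymous
function the example passes to map, and MappedRDD.f is the field that holds
that function on the executor after deserialization. Paraphrased from the
bundled example, the relevant code looks roughly like this:

  import scala.math.random
  import org.apache.spark.{SparkConf, SparkContext}

  object SparkPi {
    def main(args: Array[String]): Unit = {
      val spark = new SparkContext(new SparkConf().setAppName("Spark Pi"))
      val slices = if (args.length > 0) args(0).toInt else 2
      val n = 100000 * slices
      // The closure below compiles to SparkPi$$anonfun$1; map stores it
      // in MappedRDD.f, the field named in the exception.
      val count = spark.parallelize(1 to n, slices).map { i =>
        val x = random * 2 - 1            // random point in the unit square
        val y = random * 2 - 1
        if (x * x + y * y < 1) 1 else 0   // inside the unit circle?
      }.reduce(_ + _)
      println("Pi is roughly " + 4.0 * count / n)
      spark.stop()
    }
  }

So the error says the deserialized closure could not be assigned back to the
RDD's function field, which usually points at a classloader mismatch rather
than at the example code itself.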

I have compiled Spark directly from GitHub and am running it with the
following command:

spark-submit /usr/share/spark/lib/spark-examples_2.10-1.1.0-SNAPSHOT.jar
--class org.apache.spark.examples.SparkPi 5 --jars
/usr/share/spark/lib/spark-assembly-1.1.0-SNAPSHOT-hadoop2.h5.0.1.jar
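
A note in passing: spark-submit treats everything after the application jar
as arguments to the application itself, so flags like --class and --jars
conventionally go before the jar. The usual form of the command above would
be:

  spark-submit \
    --class org.apache.spark.examples.SparkPi \
    --jars /usr/share/spark/lib/spark-assembly-1.1.0-SNAPSHOT-hadoop2.h5.0.1.jar \
    /usr/share/spark/lib/spark-examples_2.10-1.1.0-SNAPSHOT.jar \
    5

with the trailing 5 passed through to SparkPi as the number of slices.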

Any suggestions will be helpful.

Thanks,
Gurvinder

Re: issue with running example code

Posted by Gurvinder Singh <gu...@uninett.no>.
In the end it turned out that the issue was caused by a config setting
in spark-defaults.conf. After removing this setting

spark.files.userClassPathFirst   true

things are back to normal. Just reporting in case someone else runs into
the same issue.
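
A plausible mechanism, for anyone hitting this later (my reading, not
verified against the Spark source): spark.files.userClassPathFirst makes
executors load classes from the user's jars in a child-first classloader,
and since the spark-assembly jar was also passed via --jars above, Spark's
own classes could end up defined a second time by that loader. The JVM
identifies a class by its name and its defining classloader, so an instance
created by one loader cannot be assigned to a field whose type was defined
by another, which is exactly the ClassCastException above. It would also
explain why unsetting SPARK_HOME appeared to help: without it, the
spark-defaults.conf carrying the setting was presumably never read. A
minimal, self-contained Scala sketch of the class-identity rule (the Marker
class here is purely illustrative):

  import java.net.URLClassLoader

  // A class with no Scala-library dependencies, so a loader with a null
  // parent (bootstrap only) can still define it from the class file.
  class Marker

  object ClassLoaderDemo {
    def main(args: Array[String]): Unit = {
      // Where Marker.class lives (a classpath directory or jar)
      val here = classOf[Marker].getProtectionDomain.getCodeSource.getLocation
      // Two independent loaders, neither delegating to the other
      val l1 = new URLClassLoader(Array(here), null)
      val l2 = new URLClassLoader(Array(here), null)
      val c1 = l1.loadClass("Marker")
      val c2 = l2.loadClass("Marker")
      println(c1 == c2)           // false: same name, different loaders
      val obj = c1.getDeclaredConstructor().newInstance()
      println(c2.isInstance(obj)) // false: a cast to c2's Marker would throw
    }
  }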

- Gurvinder


Re: issue with running example code

Posted by Gurvinder Singh <gu...@uninett.no>.
Just to provide more information on this issue. It seems that the
SPARK_HOME environment variable is involved. If I unset the variable in the
spark-class script and run in local mode, my code runs fine without the
exception, but if I run with SPARK_HOME set, I get the exception from my
original message. I could simply run without setting SPARK_HOME, but that
is not an option in a cluster setting, since the variable tells the cluster
where Spark lives on the worker nodes. E.g. we are using Mesos as the
cluster manager, so when the master is set to mesos we get an exception
because SPARK_HOME is not set.
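
For reference, a sketch of the two launch modes described above (the Mesos
master host and port are hypothetical placeholders):

  # local mode: with SPARK_HOME unset, the job runs without the exception
  unset SPARK_HOME
  ./bin/spark-submit --master local[4] \
    --class org.apache.spark.examples.SparkPi \
    /usr/share/spark/lib/spark-examples_2.10-1.1.0-SNAPSHOT.jar 5

  # Mesos mode: the workers must be able to find Spark, either through
  # SPARK_HOME exported in conf/spark-env.sh on each worker, or through a
  # Spark tarball pointed to by spark.executor.uri
  export SPARK_HOME=/usr/share/spark
  ./bin/spark-submit --master mesos://mesos-master:5050 \
    --class org.apache.spark.examples.SparkPi \
    /usr/share/spark/lib/spark-examples_2.10-1.1.0-SNAPSHOT.jar 5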

To mention it again: pyspark works fine, as does spark-shell. Only when we
run the compiled jar does SPARK_HOME seem to trigger some Java runtime
issue that leads to the class cast exception.

Thanks,
Gurvinder