Posted to user@spark.apache.org by Jacques Basaldúa <ja...@dybot.com> on 2014/03/24 00:48:45 UTC

Problem with SparkR

I am really interested in using Spark from R and have tried to use SparkR,
but always get the same error.

 

This is how I installed it:

 

 - I successfully installed Spark version 0.9.0 with Scala 2.10.3
   (OpenJDK 64-Bit Server VM, Java 1.7.0_45).

   I can run examples from spark-shell and Python.

 

 - I installed the R package devtools and installed SparkR using:

 

 - library(devtools)

 - install_github("amplab-extras/SparkR-pkg", subdir="pkg")

 

  This compiled the package successfully.

  

When I try to run the package, e.g.:

  library(SparkR)

  sc <- sparkR.init(master="local")   # so far the program runs fine

  rdd <- parallelize(sc, 1:10)        # this returns the following error:

  Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") :
  java.lang.IncompatibleClassChangeError: org/apache/spark/util/InnerClosureFinder

 

No matter how I try to use sc (I have tried all the examples), I always get
an error.

 

Any ideas?

 

Jacques.


Re: Problem with SparkR

Posted by Shivaram Venkataraman <sh...@eecs.berkeley.edu>.
Hi

Thanks for reporting this. It'll be great if you can check a couple of
things:

1. Are you trying to use this with Hadoop2 by any chance? There was an
incompatible ASM version bug that we fixed for Hadoop2
(https://github.com/amplab-extras/SparkR-pkg/issues/17) and verified the fix,
but I just want to check whether the same error is cropping up again. If it
does turn out to be Hadoop2, see the rebuild sketch after this list.

2. Is there a stack trace that follows the IncompatibleClassChangeError? If
so, could you attach it? The error message indicates an incompatibility
between class versions, and a more detailed stack trace would help us track
this down. There is also a sketch after this list for pulling the full Java
trace out of the R session.
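
For the Hadoop2 case, something along these lines is what I have in mind for
the rebuild. This is only a rough sketch: it assumes the SparkR build picks
up the SPARK_HADOOP_VERSION environment variable the same way Spark's own
build does, so please check the variable name and version against the
SparkR-pkg README before relying on it:

  # Rough sketch: rebuild SparkR against a Hadoop 2 client.
  # Assumption: the build honours SPARK_HADOOP_VERSION (as Spark's own
  # build does); set it to the Hadoop version of your cluster.
  Sys.setenv(SPARK_HADOOP_VERSION = "2.2.0")

  library(devtools)
  install_github("amplab-extras/SparkR-pkg", subdir = "pkg")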
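
For the stack trace, SparkR talks to the JVM through rJava (that is the
.jcall in your error), so a best-effort way to pull the full Java trace out
of the R session right after the failure is sketched below. Note that
.jgetEx may return NULL if rJava has already cleared the exception, so treat
this as best-effort only:

  # Best-effort sketch: fetch the pending Java exception via rJava and
  # print its full stack trace (the output goes to the Java side's stderr).
  library(rJava)
  ex <- .jgetEx(clear = TRUE)              # pending Java exception, or NULL
  if (!is.null(ex)) ex$printStackTrace()   # full trace for the error above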

Thanks
Shivaram




