Posted to dev@spark.apache.org by Devl Devel <de...@gmail.com> on 2015/09/19 21:30:16 UTC

SparkR installation not working

Hi All,

I've built Spark 1.5.0 with Hadoop 2.6 from a fresh download:

build/mvn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package

When I try to run SparkR, it launches plain R without the Spark add-ons:

./bin/sparkR --master local[*]
Picked up JAVA_TOOL_OPTIONS: -javaagent:/usr/share/java/jayatanaag.jar

R version 3.1.2 (2014-10-31) -- "Pumpkin Helmet"
Copyright (C) 2014 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)

R is free software and comes with ABSOLUTELY NO WARRANTY.
You are welcome to redistribute it under certain conditions.
Type 'license()' or 'licence()' for distribution details.

  Natural language support but running in an English locale

R is a collaborative project with many contributors.
Type 'contributors()' for more information and
'citation()' on how to cite R or R packages in publications.

Type 'demo()' for some demos, 'help()' for on-line help, or
'help.start()' for an HTML browser interface to help.
Type 'q()' to quit R.

>

There is no "Welcome to SparkR" banner. Also:

> sc <- sparkR.init()
Error: could not find function "sparkR.init"
> sqlContext <- sparkRSQL.init(sc)
Error: could not find function "sparkRSQL.init"
>

spark-shell and the other components work fine. I'm using Scala 2.10.6,
Java 1.8.0_45, and Ubuntu 15.04. Can anyone give me some pointers? Is there
a Spark Maven profile I need to enable?

Thanks
Devl

Re: SparkR installation not working

Posted by Ted Yu <yu...@gmail.com>.
Looks like you didn't specify the sparkr profile when building.
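
Something like this should do it (an untested sketch: your original flags
with -Psparkr added; the profile builds the SparkR package as part of the
Maven run, so R must be installed on the build machine):

build/mvn -Psparkr -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package

If you'd rather not redo the whole Maven build, I believe you can build just
the R package in place and relaunch:

./R/install-dev.sh
./bin/sparkR

Once the package is built, bin/sparkR should print the "Welcome to SparkR!"
banner, and you should also be able to attach it from a plain R session
(assuming SPARK_HOME points at your build directory):

library(SparkR, lib.loc = file.path(Sys.getenv("SPARK_HOME"), "R", "lib"))
sc <- sparkR.init(master = "local[*]")
sqlContext <- sparkRSQL.init(sc)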

Cheers
