Posted to user@spark.apache.org by unk1102 <um...@gmail.com> on 2015/07/21 16:57:40 UTC

SparkR sqlContext or sc not found in RStudio

Hi, I could successfully install the SparkR package into my RStudio, but I could
not execute anything against sc or sqlContext. I did the following:

Sys.setenv(SPARK_HOME="/path/to/spark-1.4.1")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"),"R","lib"),.libPaths()))
library(SparkR)

The above code sets up the package, and when I type the following I get Spark
references, which shows my installation is correct:

> sc
Java ref type org.apache.spark.api.java.JavaSparkContext id 0

> sparkRSQL.init(sc)
Java ref type org.apache.spark.sql.SQLContext id 3
But when I try to execute anything against sc or sqlContext it says the object
is not found. For example:

> df < createDataFrame(sqlContext,"faithful")
It fails saying sqlContext is not found. I don't know what is wrong with the
setup, please guide. Thanks in advance.





Re: SparkR sqlContext or sc not found in RStudio

Posted by harirajaram <ha...@gmail.com>.
Yep, I saw that in your previous post and I thought it was a typing mistake you
made while posting; I never imagined that it was actually typed that way in
RStudio. Glad it worked.
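
In other words, the assignments just needed the full <- operator in RStudio,
e.g. something like this (faithful is the built-in R data.frame, passed unquoted):

sc <- sparkR.init(master="local")            # note <- rather than <
sqlContext <- sparkRSQL.init(sc)
df <- createDataFrame(sqlContext, faithful)  # a data.frame, not the string "faithful"
head(df)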





Re: SparkR sqlContext or sc not found in RStudio

Posted by harirajaram <ha...@gmail.com>.
I'm sorry, I have no idea why it is failing on your side. I have been using
this for a while now and it works fine. All I can say is use version 1.4.0,
but I don't think it is going to make a big difference. This is the one
which I use; a/b are my directories.

# point R at the local Spark installation and put SparkR on the library path
Sys.setenv(SPARK_HOME="/a/b/spark-1.4.0")
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
# create the Spark context, then the SQL context from it
sc <- sparkR.init(master="local")
sqlContext <- sparkRSQL.init(sc)

Well, I'm going to ask another basic question: did you try some other version
before, from the amplab github etc.?
Can you remove the package with remove.packages("SparkR"), run install-dev.sh
from the R folder of your spark_home, and then try again to see if it works?
Hopefully it should work.
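
Roughly along these lines, assuming your spark_home is a source checkout (the
/a/b path is just my example again):

# in R: drop the currently installed SparkR package
remove.packages("SparkR")
# then in a shell: rebuild and reinstall SparkR from the Spark source tree
#   cd /a/b/spark-1.4.0/R
#   ./install-dev.sh
# restart R and redo library(SparkR) plus the init calls above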





Re: SparkR sqlContext or sc not found in RStudio

Posted by unk1102 <um...@gmail.com>.
Hi, thanks for the reply. I did download it from github and build it, and it is
working fine; I can use spark-submit etc. But when I use it in RStudio I don't
know why it is saying sqlContext is not found.

When I do the following

> sqlContext < sparkRSQL.init(sc)
Error: object sqlContext not found

If I do the following:

> sparkRSQL.init(sc)
Java ref type org.apache.spark.sql.SQLContext id 3

I don't know what's wrong here.





Re: SparkR sqlContext or sc not found in RStudio

Posted by harirajaram <ha...@gmail.com>.
I'm assuming you are building sparkR from github for apache/spark and not
github of amplab.
If that is correct,I don't see you intializing sqlcontext like this
sqlContext <- sparkRSQL.init(sc)..
If you have done both, then I don't have an idea as it is working fine for
me.


