Posted to user@spark.apache.org by Justin Lent <ju...@gmail.com> on 2014/01/29 17:51:55 UTC

SparkR dev preview package errors when other packages are loaded

Hey All,

I just joined this list, so apologies if I come off as a bit of a
newbie... I'm super excited to be playing around with the developer
preview of SparkR in 0.9. I've gotten it all working great, in fact
(in RStudio, even!). However, after I load some of my other favorite R
packages, when I go to compute a basic RDD using one of the
examples in the parallelize() help, I get the following error:

> rdd <- parallelize(sc, 1:10, 2)
Error in save(list = filteredVars, file = fileName, envir = closureEnv) :
  object 'NA' not found


This works just fine if I start up R and load only the SparkR package,
but like I said, after I load a bunch of other libraries I end up
getting the error I pasted above.
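For anyone hitting something similar: a quick way to check whether a base function such as `save` has been shadowed by another attached package is `find()`, which lists every environment on the search path that defines a given name. A small diagnostic sketch (not specific to SparkR):

```r
# List every environment on the search path that defines `save`.
# If anything other than "package:base" appears first in the result,
# that environment's version of save() masks base R's for
# unqualified calls.
find("save")

# conflicts() goes further and reports every name that is defined in
# more than one attached environment, i.e. all masked objects.
conflicts(detail = TRUE)
```

Running this before and after attaching a suspect package shows exactly which library introduced the mask.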

Thanks for your help!
-Justin

Re: SparkR dev preview package errors when other packages are loaded

Posted by Justin <ju...@gmail.com>.
So now this is really weird... I cannot reproduce this again -- even if I
load all those libraries and then create a SparkContext, it works just fine
now. Geesh, I hit it 10 times in a row earlier today, but now I can't
repro it. Oh well -- I'm happy it's working, though!

I guess we can "close" this issue for now.  



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkR-dev-preview-package-errors-when-other-packages-are-loaded-tp1020p1026.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

Re: SparkR dev preview package errors when other packages are loaded

Posted by Justin <ju...@gmail.com>.
Yep yep... base::save seems like the way to go to force it to use the right one.

thanks!

On Wed, Jan 29, 2014 at 12:36 PM, Shivaram Venkataraman-2 [via Apache
Spark User List] <ml...@n3.nabble.com> wrote:
> The problem is that some of the methods get masked by packages which
> are loaded. I think we can fix this by using 'base::save' instead of
> save in our package. I've filed this as
> https://github.com/amplab-extras/SparkR-pkg/issues/13
>
> Shivaram
>
> On Wed, Jan 29, 2014 at 11:30 AM, Justin <[hidden email]> wrote:
>
>> Thanks for the quick reply.
>>
>> Here are all the packages I'm trying to load in conjunction with SparkR --
>> I
>> have not yet attempted to step through them 1 by 1 to see which one is
>> causing the fail in the RDD computation/assignment/'save'...
>>
>>   require(quantmod)
>>   require(xts)
>>   require(ggplot2)
>>   require(ggthemes)
>>   require(PerformanceAnalytics)
>>   require(TTR)
>>   require(reshape2)
>>   require(RColorBrewer)
>>   require(gdata)
>>   require(urca)
>>   require(shiny)
>>   require(stringr)
>>   require(rbenchmark)
>>   require(timeDate)
>>   require(plyr)
>>   require(dplyr)
>>   require(gmodels)
>>   require(formula.tools)
>>   require(data.table)
>>   require(devtools)
>>
>>
>>
>>
>
>





Re: SparkR dev preview package errors when other packages are loaded

Posted by Shivaram Venkataraman <sh...@eecs.berkeley.edu>.
The problem is that some of SparkR's methods get masked by other
packages that are loaded. I think we can fix this by using 'base::save'
instead of 'save' in our package. I've filed this as
https://github.com/amplab-extras/SparkR-pkg/issues/13
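
A minimal sketch of the masking problem and the namespace-qualified fix. (This is a generic illustration of the mechanism, not SparkR's actual internals; the masking function here is invented for the demo.)

```r
# Define a function named `save` in the global environment, the same
# way an attached package can shadow base::save on the search path.
save <- function(...) stop("masked: this is not base::save")

x <- 1:10
f <- tempfile()

# An unqualified call now resolves to the masking definition and fails.
result <- tryCatch(save(x, file = f),
                   error = function(e) conditionMessage(e))

# A namespace-qualified call bypasses the search path entirely and
# always reaches base R's save(), regardless of what else is attached.
base::save(x, file = f)
file.exists(f)
```

This is why qualifying the call inside the package (rather than asking users to detach their libraries) is the robust fix.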

Shivaram

On Wed, Jan 29, 2014 at 11:30 AM, Justin <ju...@gmail.com> wrote:
> Thanks for the quick reply.
>
> Here are all the packages I'm trying to load in conjunction with SparkR -- I
> have not yet attempted to step through them 1 by 1 to see which one is
> causing the fail in the RDD computation/assignment/'save'...
>
>   require(quantmod)
>   require(xts)
>   require(ggplot2)
>   require(ggthemes)
>   require(PerformanceAnalytics)
>   require(TTR)
>   require(reshape2)
>   require(RColorBrewer)
>   require(gdata)
>   require(urca)
>   require(shiny)
>   require(stringr)
>   require(rbenchmark)
>   require(timeDate)
>   require(plyr)
>   require(dplyr)
>   require(gmodels)
>   require(formula.tools)
>   require(data.table)
>   require(devtools)
>
>
>
>

Re: SparkR dev preview package errors when other packages are loaded

Posted by Justin <ju...@gmail.com>.
Thanks for the quick reply. 

Here are all the packages I'm trying to load in conjunction with SparkR -- I
have not yet attempted to step through them one by one to see which one is
causing the failure in the RDD computation/assignment/'save'...

  require(quantmod)     
  require(xts)         
  require(ggplot2)      
  require(ggthemes)     
  require(PerformanceAnalytics)   
  require(TTR)         
  require(reshape2)     
  require(RColorBrewer) 
  require(gdata)        
  require(urca)       
  require(shiny)      
  require(stringr)    
  require(rbenchmark)  
  require(timeDate)   
  require(plyr)       
  require(dplyr)      
  require(gmodels)  
  require(formula.tools) 
  require(data.table)   
  require(devtools) 





Re: SparkR dev preview package errors when other packages are loaded

Posted by ConcreteVitamin <co...@gmail.com>.
Hi Justin,

Can you let us know what libraries you loaded?

Zongheng

On Wed, Jan 29, 2014 at 8:51 AM, Justin Lent <ju...@gmail.com> wrote:
> Hey All,
>
> I just joined this list, so apologies if I come off as a bit of a
> newbie... I'm super excited to be playing around with the developer
> preview of SparkR in 0.9. I've gotten it all working great, in fact
> (in RStudio, even!). However, after I load some of my other favorite R
> packages, when I go to compute a basic RDD using one of the
> examples in the parallelize() help, I get the following error:
>
>> rdd <- parallelize(sc, 1:10, 2)
> Error in save(list = filteredVars, file = fileName, envir = closureEnv) :
>   object 'NA' not found
>
>
> This works just fine if I start up R and load only the SparkR package,
> but like I said, after I load a bunch of other libraries I end up
> getting the error I pasted above.
>
> Thanks for your help!
> -Justin