Posted to issues@spark.apache.org by "Antonio Piccolboni (JIRA)" <ji...@apache.org> on 2015/04/11 01:26:12 UTC

[jira] [Created] (SPARK-6853) Contents of .globalenv in workers

Antonio Piccolboni created SPARK-6853:
-----------------------------------------

             Summary: Contents of .globalenv in workers
                 Key: SPARK-6853
                 URL: https://issues.apache.org/jira/browse/SPARK-6853
             Project: Spark
          Issue Type: Bug
          Components: SparkR
    Affects Versions: 1.3.0
            Reporter: Antonio Piccolboni


I am filing a number of bugs that may all be related, but I am not sure, and I don't want to lose the test case, so here it is:

In an R --vanilla session

library(SparkR)
sc=sparkR.init()
z=1                          # bind z in the driver's .GlobalEnv
eval(quote(z), .GlobalEnv)   # works locally, returns 1
collect(lapply(parallelize(sc, 1), function(i) eval(quote(z), .GlobalEnv)))

The collect call fails because z is not found on the workers.

Unfortunately, .GlobalEnv gets special treatment (the wrong one, in this case) when closures are serialized in R: it is replaced by whatever .GlobalEnv happens to be at deserialization time, so bindings made in the driver's global environment are not visible on the workers. The use case is not so far-fetched, since I ran into this while trying to make the foreach package work on Spark.
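
For the record, the same behaviour can be shown with plain R serialization, and a possible workaround is to capture the value in a non-global environment. This is only a sketch, assuming SparkR's closure serialization follows base R serialize() semantics (.GlobalEnv written by reference, other environments by value):

z <- 1
f <- function() z             # enclosing environment is .GlobalEnv
bytes <- serialize(f, NULL)   # .GlobalEnv is written as a reference; the binding of z is not included
g <- unserialize(bytes)       # in a fresh R process, g() fails because z is not found

# Hypothetical workaround: capture z in a function's own environment,
# which is serialized by value along with the closure.
make_task <- function(z) function(i) z
collect(lapply(parallelize(sc, 1), make_task(1)))

If the last line works, that would suggest non-global environments are carried over correctly and the problem is specific to .GlobalEnv.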



