Posted to issues@spark.apache.org by "Joe Near (JIRA)" <ji...@apache.org> on 2015/08/05 01:17:04 UTC

[jira] [Created] (SPARK-9621) Closure inside RDD doesn't properly close over environment

Joe Near created SPARK-9621:
-------------------------------

             Summary: Closure inside RDD doesn't properly close over environment
                 Key: SPARK-9621
                 URL: https://issues.apache.org/jira/browse/SPARK-9621
             Project: Spark
          Issue Type: Bug
    Affects Versions: 1.4.1
         Environment: Ubuntu 15.04, spark-1.4.1-bin-hadoop2.6 package
            Reporter: Joe Near


I expect the following:

case class MyTest(i: Int)
val tv = MyTest(1)
// Ship a closure that captures tv through an RDD, pull it back with
// first(), and apply it to the same tv instance.
val res = sc.parallelize(Array((t: MyTest) => t == tv)).first()(tv)

to be "true." It is "false," when I type this into spark-shell. It seems the closure is changed somehow when it's serialized and deserialized.


