Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/10/31 10:56:33 UTC

[jira] [Created] (SPARK-4170) Closure problems when running Scala app that "extends App"

Sean Owen created SPARK-4170:
--------------------------------

             Summary: Closure problems when running Scala app that "extends App"
                 Key: SPARK-4170
                 URL: https://issues.apache.org/jira/browse/SPARK-4170
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.1.0
            Reporter: Sean Owen
            Priority: Minor


Michael Albert noted this problem on the mailing list (http://apache-spark-user-list.1001560.n3.nabble.com/BUG-when-running-as-quot-extends-App-quot-closures-don-t-capture-variables-td17675.html):

{code}
object DemoBug extends App {
    val conf = new SparkConf()
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(List("A","B","C","D"))
    val str1 = "A"

    val rslt1 = rdd.filter(x => { x != "A" }).count
    val rslt2 = rdd.filter(x => { str1 != null && x != "A" }).count
    
    println("DemoBug: rslt1 = " + rslt1 + " rslt2 = " + rslt2)
}
{code}

This produces the output:

{code}
DemoBug: rslt1 = 3 rslt2 = 0
{code}

If instead there is a proper "main()", it works as expected.
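For reference, a minimal sketch of the workaround: the same logic as DemoBug, but with an explicit main() so the vals are locals of main() rather than deferred object fields (the object name DemoFixed and the setAppName call are illustrative, not from the original report; this needs a Spark runtime to execute):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object DemoFixed {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("DemoFixed")
    val sc = new SparkContext(conf)

    val rdd = sc.parallelize(List("A", "B", "C", "D"))
    // str1 is now a local variable of main(), captured by value when the
    // closure is serialized, instead of a field of a half-initialized object.
    val str1 = "A"

    val rslt1 = rdd.filter(x => x != "A").count
    val rslt2 = rdd.filter(x => str1 != null && x != "A").count

    // Both counts are 3 here, unlike the "extends App" version.
    println(s"DemoFixed: rslt1 = $rslt1 rslt2 = $rslt2")
    sc.stop()
  }
}
```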


Separately, I noticed this week that in a program which "extends App", some values were inexplicably null inside a closure. After switching to an explicit main(), it worked fine.

I assume there is a problem with variables not being captured correctly in the closure when main() doesn't appear in the standard way. In Scala 2.x, "extends App" mixes in DelayedInit, which defers the object body (including val initializers) until main() runs, so a field like str1 can still be null in the copy of the enclosing object that a deserialized closure sees on an executor.
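The deferral can be seen without Spark at all. A standalone sketch (Scala 2.x; the object names Holder and DelayedInitDemo are illustrative):

```scala
// scala.App (Scala 2.x) mixes in DelayedInit: the object body is not run
// in the constructor, but saved and executed only when main() is called.
// Until then, fields such as `s` hold their JVM default (null) -- exactly
// the state an executor observes when it deserializes a closure whose
// enclosing object's main() was never invoked on that JVM.
object Holder extends App {
  val s = "hello"
}

object DelayedInitDemo {
  def main(args: Array[String]): Unit = {
    // Touching Holder initializes the object, but does NOT run its body.
    println(Holder.s)         // null: the val has not been assigned yet
    Holder.main(Array.empty)  // now the deferred body runs
    println(Holder.s)         // hello
  }
}
```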



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org