Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2014/06/12 21:13:01 UTC

[jira] [Updated] (SPARK-1199) Type mismatch in Spark shell when using case class defined in shell

     [ https://issues.apache.org/jira/browse/SPARK-1199?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust updated SPARK-1199:
------------------------------------

    Priority: Blocker  (was: Critical)

> Type mismatch in Spark shell when using case class defined in shell
> -------------------------------------------------------------------
>
>                 Key: SPARK-1199
>                 URL: https://issues.apache.org/jira/browse/SPARK-1199
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 0.9.0
>            Reporter: Andrew Kerr
>            Priority: Blocker
>             Fix For: 1.1.0
>
>
> Define a case class in the shell:
> {code}
> case class TestClass(a:String)
> {code}
> and create an RDD of that class:
> {code}
> val data = sc.parallelize(Seq("a")).map(TestClass(_))
> {code}
> Then define a function on the class and map it over the RDD:
> {code}
> def itemFunc(a:TestClass):TestClass = a
> data.map(itemFunc)
> {code}
> Error:
> {code}
> <console>:19: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               data.map(itemFunc)
> {code}
> The same failure occurs with {{mapPartitions}}:
> {code}
> def partitionFunc(a:Iterator[TestClass]):Iterator[TestClass] = a
> data.mapPartitions(partitionFunc)
> {code}
> {code}
> <console>:19: error: type mismatch;
>  found   : Iterator[TestClass] => Iterator[TestClass]
>  required: Iterator[TestClass] => Iterator[?]
> Error occurred in an application involving default arguments.
>               data.mapPartitions(partitionFunc)
> {code}
> The behavior is the same whether in local mode or on a cluster.
> This isn't specific to RDDs. A Scala collection in the Spark shell has the same problem.
> {code}
> scala> Seq(TestClass("foo")).map(itemFunc)
> <console>:15: error: type mismatch;
>  found   : TestClass => TestClass
>  required: TestClass => ?
>               Seq(TestClass("foo")).map(itemFunc)
>                                         ^
> {code}
> When the same code is run in the plain Scala REPL (not the Spark shell), there are no type mismatch errors.
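> A workaround sometimes suggested for REPL-defined class problems of this kind (a sketch only, not verified against this exact Spark build; the object name {{Wrapper}} is hypothetical) is to define the case class and the function together inside one wrapper object, so that both refer to the same generated class rather than to per-line REPL wrappers:
> {code}
> // Hypothetical wrapper object; grouping the definitions keeps them in one
> // compilation unit instead of separate shell-generated outer classes.
> object Wrapper extends Serializable {
>   case class TestClass(a: String)
>   def itemFunc(a: TestClass): TestClass = a
> }
> import Wrapper._
>
> val data = sc.parallelize(Seq("a")).map(TestClass(_))
> data.map(itemFunc)  // should now type-check, assuming the workaround applies
> {code}
> Pasting related definitions in a single {{:paste}} block in the shell is reported to have a similar effect, for the same reason.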



--
This message was sent by Atlassian JIRA
(v6.2#6252)