Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/12/21 23:11:46 UTC
[jira] [Resolved] (SPARK-2331) SparkContext.emptyRDD should return RDD[T] not EmptyRDD[T]
[ https://issues.apache.org/jira/browse/SPARK-2331?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Andrew Or resolved SPARK-2331.
------------------------------
Resolution: Fixed
Fix Version/s: 2.0.0
> SparkContext.emptyRDD should return RDD[T] not EmptyRDD[T]
> ----------------------------------------------------------
>
> Key: SPARK-2331
> URL: https://issues.apache.org/jira/browse/SPARK-2331
> Project: Spark
> Issue Type: Sub-task
> Components: Spark Core
> Affects Versions: 1.0.0
> Reporter: Ian Hummel
> Assignee: Reynold Xin
> Priority: Minor
> Fix For: 2.0.0
>
>
> The return type for SparkContext.emptyRDD is EmptyRDD[T].
> It should be RDD[T]. As a result, you have to add extra type annotations to code like the following (which builds a union of RDDs over some subset of paths in a folder):
> {code}
> val rdds = Seq("a", "b", "c").foldLeft[RDD[String]](sc.emptyRDD[String]) { (rdd, path) =>
>   rdd.union(sc.textFile(path))
> }
> {code}
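The inference failure the report describes can be reproduced without Spark. A minimal sketch, using stand-in RDD/EmptyRDD classes (hypothetical, not Spark's own): foldLeft infers its accumulator type from the seed, so a seed typed EmptyRDD[String] forces the fold function to return EmptyRDD[String], which union's RDD[String] result does not satisfy.

```scala
// Stand-in types mirroring the shape of the Spark API in question
// (these are illustrative, not the real Spark classes).
class RDD[T] {
  def union(other: RDD[T]): RDD[T] = other
}
class EmptyRDD[T] extends RDD[T]

def emptyRDD[T]: EmptyRDD[T] = new EmptyRDD[T]

// Without the annotation, foldLeft infers the accumulator type as
// EmptyRDD[String] from the seed, so this line does not compile:
//   Seq("a", "b").foldLeft(emptyRDD[String]) { (rdd, p) => rdd.union(new RDD[String]) }
//
// The workaround from the report pins the accumulator type explicitly:
val merged: RDD[String] =
  Seq("a", "b", "c").foldLeft[RDD[String]](emptyRDD[String]) { (rdd, path) =>
    rdd.union(new RDD[String])
  }
```

Widening the declared return type of emptyRDD to RDD[T], as the fix does, removes the need for the explicit foldLeft[RDD[String]] annotation at every call site.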
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org