Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/12/24 12:19:49 UTC
[jira] [Assigned] (SPARK-12517) No default RDD name for ones created by sc.textFile
[ https://issues.apache.org/jira/browse/SPARK-12517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-12517:
------------------------------------
Assignee: Apache Spark
> No default RDD name for ones created by sc.textFile
> ----------------------------------------------------
>
> Key: SPARK-12517
> URL: https://issues.apache.org/jira/browse/SPARK-12517
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.2
> Reporter: yaron weinsberg
> Assignee: Apache Spark
> Priority: Minor
> Labels: easyfix
> Fix For: 1.3.1
>
> Original Estimate: 24h
> Remaining Estimate: 24h
>
> Having a default name for an RDD created from a file is very handy.
> The feature was first added in commit 7b877b2 but was later removed (probably by mistake) in commit fc8b581.
> This change sets the default name of RDDs created via sc.textFile(...) to the path argument.
> Here is the symptom:
> Using spark-1.5.2-bin-hadoop2.6:
> scala> sc.textFile("/home/root/.bashrc").name
> res5: String = null
> scala> sc.binaryFiles("/home/root/.bashrc").name
> res6: String = /home/root/.bashrc
> while using Spark 1.3.1:
> scala> sc.textFile("/home/root/.bashrc").name
> res0: String = /home/root/.bashrc
> scala> sc.binaryFiles("/home/root/.bashrc").name
> res1: String = /home/root/.bashrc
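Until the default is restored, the name can be set explicitly via RDD.setName, which names the RDD and returns it for chaining. A minimal sketch against the public Spark API (the helper name is illustrative, not part of Spark):

    import org.apache.spark.SparkContext
    import org.apache.spark.rdd.RDD

    object NamedTextFile {
      // Workaround sketch: mirror what the proposed fix would do inside
      // SparkContext.textFile by naming the RDD after its source path,
      // matching the behavior sc.binaryFiles already has.
      def namedTextFile(sc: SparkContext, path: String): RDD[String] =
        sc.textFile(path).setName(path)
    }

With this, NamedTextFile.namedTextFile(sc, "/home/root/.bashrc").name returns the path instead of null, which makes the RDD identifiable in the Spark UI's Storage and Stages tabs.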
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)