Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2016/06/29 21:12:10 UTC
[jira] [Updated] (SPARK-16044) input_file_name() returns empty strings in data sources based on NewHadoopRDD.
[ https://issues.apache.org/jira/browse/SPARK-16044?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin updated SPARK-16044:
--------------------------------
Fix Version/s: 1.6.3
> input_file_name() returns empty strings in data sources based on NewHadoopRDD.
> ------------------------------------------------------------------------------
>
> Key: SPARK-16044
> URL: https://issues.apache.org/jira/browse/SPARK-16044
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.0
> Reporter: Hyukjin Kwon
> Assignee: Hyukjin Kwon
> Fix For: 1.6.3, 2.0.0
>
>
> The issue is that the {{input_file_name()}} function returns empty strings instead of file paths when a data source is based on {{NewHadoopRDD}}. The function is currently only supported for {{FileScanRDD}} and {{HadoopRDD}}.
> To be clear, this does not affect Spark's internal data sources, because none of them currently use {{NewHadoopRDD}}.
> However, several external data sources do use it. For example:
>
> spark-redshift - [here|https://github.com/databricks/spark-redshift/blob/cba5eee1ab79ae8f0fa9e668373a54d2b5babf6b/src/main/scala/com/databricks/spark/redshift/RedshiftRelation.scala#L149]
> spark-xml - [here|https://github.com/databricks/spark-xml/blob/master/src/main/scala/com/databricks/spark/xml/util/XmlFile.scala#L39-L47]
> Currently, calling this function produces the output below:
> {code}
> +-----------------+
> |input_file_name()|
> +-----------------+
> | |
> | |
> | |
> | |
> | |
> | |
> | |
> | |
> | |
> | |
> | |
> +-----------------+
> {code}
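> A minimal sketch of how the symptom shows up (the file paths and session setup here are hypothetical, and the spark-xml source is assumed to read through {{NewHadoopRDD}} as described above):
> {code}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.input_file_name
>
> val spark = SparkSession.builder().appName("SPARK-16044-repro").getOrCreate()
>
> // A built-in source reads through FileScanRDD/HadoopRDD,
> // so input_file_name() is populated with the actual path.
> spark.read.json("/tmp/data.json").select(input_file_name()).show()
>
> // A source backed by NewHadoopRDD (e.g. spark-xml) hits this bug:
> // every row shows an empty string instead of the file path.
> spark.read.format("com.databricks.spark.xml")
>   .option("rowTag", "row")
>   .load("/tmp/data.xml")
>   .select(input_file_name())
>   .show()
> {code}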
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org