Posted to issues@spark.apache.org by "Tathagata Das (JIRA)" <ji...@apache.org> on 2016/06/17 02:25:05 UTC
[jira] [Updated] (SPARK-16006) Empty DataFrame created with spark.read.text() cannot be written as it has no fields
[ https://issues.apache.org/jira/browse/SPARK-16006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Tathagata Das updated SPARK-16006:
----------------------------------
Summary: Empty DataFrame created with spark.read.text() cannot be written as it has no fields (was: Empty DataFrame with spark.read.text() cannot be written as it has no fields)
> Empty DataFrame created with spark.read.text() cannot be written as it has no fields
> ------------------------------------------------------------------------------------
>
> Key: SPARK-16006
> URL: https://issues.apache.org/jira/browse/SPARK-16006
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Reporter: Tathagata Das
>
> Attempting to write an empty DataFrame created with {{spark.read.text()}} (i.e. {{spark.read.text().write.text("p")}}) fails with the following exception:
> {code}
> [info] - text API *** FAILED *** (218 milliseconds)
> [info] org.apache.spark.sql.AnalysisException: Cannot use all columns for partition columns;
> [info] at org.apache.spark.sql.execution.datasources.PartitioningUtils$.validatePartitionColumn(PartitioningUtils.scala:355)
> [info] at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:432)
> [info] at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:213)
> [info] at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:196)
> [info] at org.apache.spark.sql.DataFrameWriter.text(DataFrameWriter.scala:525)
> {code}
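> The failure reproduces directly in the shell (a minimal sketch; {{"p"}} is just a scratch output path):
> {code}
> // spark.read.text() with no input paths yields an empty DataFrame
> // whose schema has zero fields.
> val df = spark.read.text()
> df.schema.fields.length     // 0 -- no fields at all
> df.write.text("p")          // throws the AnalysisException above
> {code}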
> This is because the number of fields equals the number of partitioning columns (both are 0), which triggers the check in org.apache.spark.sql.execution.datasources.PartitioningUtils$.validatePartitionColumn(PartitioningUtils.scala:355).
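> The guard (paraphrased here; the exact source may differ by Spark version) rejects any write in which every field would be a partition column, which is vacuously true when the schema is empty:
> {code}
> // Paraphrase of the check in PartitioningUtils.validatePartitionColumn.
> // With an empty schema, partitionColumns.size == schema.fields.size == 0,
> // so the check fires even though no partitioning was requested.
> if (partitionColumns.size == schema.fields.size) {
>   throw new AnalysisException("Cannot use all columns for partition columns")
> }
> {code}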