Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/09/07 05:22:00 UTC

[jira] [Resolved] (SPARK-21912) ORC/Parquet table should not create invalid column names

     [ https://issues.apache.org/jira/browse/SPARK-21912?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li resolved SPARK-21912.
-----------------------------
       Resolution: Fixed
         Assignee: Dongjoon Hyun
    Fix Version/s: 2.3.0

> ORC/Parquet table should not create invalid column names
> --------------------------------------------------------
>
>                 Key: SPARK-21912
>                 URL: https://issues.apache.org/jira/browse/SPARK-21912
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>             Fix For: 2.3.0
>
>
> Currently, users hit job abortions when creating ORC data source tables with invalid column names. We should prevent this by raising an AnalysisException up front, as Parquet data source tables already do.
> {code}
> scala> sql("CREATE TABLE orc1 USING ORC AS SELECT 1 `a b`")
> 17/09/04 13:28:21 ERROR Utils: Aborting task
> java.lang.IllegalArgumentException: Error: : expected at the position 8 of 'struct<a b:int>' but ' ' is found.
> 	at org.apache.hadoop.hive.serde2.typeinfo.TypeInfoUtils$TypeInfoParser.expect(TypeInfoUtils.java:360)
> ...
> 17/09/04 13:28:21 WARN FileOutputCommitter: Could not delete file:/Users/dongjoon/spark-release/spark-master/spark-warehouse/orc1/_temporary/0/_temporary/attempt_20170904132821_0001_m_000000_0
> 17/09/04 13:28:21 ERROR FileFormatWriter: Job job_20170904132821_0001 aborted.
> 17/09/04 13:28:21 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 1)
> org.apache.spark.SparkException: Task failed while writing rows.
> {code}
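> For reference, a minimal standalone sketch of the kind of up-front field-name check that would fail fast here (an illustration only, not the actual patch; inside Spark the check would raise AnalysisException, which plain user code cannot construct, so this sketch throws IllegalArgumentException in its place):
> {code}
> // Sketch: reject field names that the ORC/Parquet writers cannot
> // represent, before any task starts writing output files.
> def checkFieldName(name: String): Unit = {
>   // Space, comma, and the characters ';{}()=' (plus tab/newline) break
>   // the struct<a b:int> type string that Hive's TypeInfoParser parses,
>   // which is what produces the stack trace above.
>   if (name.matches(".*[ ,;{}()\n\t=].*")) {
>     throw new IllegalArgumentException(
>       s"""Attribute name "$name" contains invalid character(s). Please use alias to rename it.""")
>   }
> }
>
> checkFieldName("a_b")    // passes
> // checkFieldName("a b") // fails fast, instead of aborting the write job
> {code}
> Until such a check is in place, aliasing the column (e.g. {{SELECT 1 AS `a_b`}}) avoids the failure.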



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org