Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:42:19 UTC
[jira] [Resolved] (SPARK-23837) Create table as select gives exception if the spark generated alias name contains comma
[ https://issues.apache.org/jira/browse/SPARK-23837?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-23837.
----------------------------------
Resolution: Incomplete
> Create table as select gives exception if the spark generated alias name contains comma
> ---------------------------------------------------------------------------------------
>
> Key: SPARK-23837
> URL: https://issues.apache.org/jira/browse/SPARK-23837
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.1, 2.3.0
> Reporter: shahid
> Priority: Minor
> Labels: bulk-closed
>
> If the Spark-generated alias name for a column contains a comma, the Hive metastore throws an exception when the table is created from the query (CTAS).
>
> 0: jdbc:hive2://ha-cluster/default> create table a (col1 decimal(18,3), col2 decimal(18,5));
> +---------+
> | Result  |
> +---------+
> +---------+
> No rows selected (0.171 seconds)
> 0: jdbc:hive2://ha-cluster/default> select col1*col2 from a;
> +-------------------------------------------------------------+
> | (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5))) |
> +-------------------------------------------------------------+
> +-------------------------------------------------------------+
> No rows selected (0.168 seconds)
> 0: jdbc:hive2://ha-cluster/default> create table b as select col1*col2 from a;
> Error: org.apache.spark.sql.AnalysisException: Cannot create a table having a column whose name contains commas in Hive metastore. Table: `default`.`b`; Column: (CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5))); (state=,code=0)
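The error arises because Hive's metastore serializes a table's column names as a comma-separated list, so a Hive-backed catalog must reject any column name containing a comma, and Spark's auto-generated alias for `col1*col2` embeds `DECIMAL(20,5)` casts whose type arguments contain commas. A minimal Python sketch of such a check (illustrative only, not Spark's actual implementation; the function name is made up):

```python
# Illustrative sketch (not Spark's actual code): a Hive-backed catalog
# must reject column names containing commas, because Hive's metastore
# stores the schema's column names as a comma-separated list.

def verify_column_names(table, columns):
    """Raise if any column name would corrupt Hive's comma-separated schema."""
    for name in columns:
        if "," in name:
            raise ValueError(
                "Cannot create a table having a column whose name contains "
                f"commas in Hive metastore. Table: {table}; Column: {name}"
            )

# The alias Spark generates for col1*col2 on decimal columns embeds the
# casts, so the DECIMAL(20,5) type arguments put commas into the name:
generated = "(CAST(col1 AS DECIMAL(20,5)) * CAST(col2 AS DECIMAL(20,5)))"

try:
    verify_column_names("`default`.`b`", [generated])
except ValueError as e:
    print(e)

# An explicitly aliased column name contains no comma and passes:
verify_column_names("`default`.`b`", ["product"])
```

The usual workaround on the user side is to give the projected expression an explicit alias so the stored column name contains no comma, e.g. `create table b as select col1*col2 as product from a;`.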
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org