Posted to issues@spark.apache.org by "Xin Wu (JIRA)" <ji...@apache.org> on 2016/08/04 20:01:20 UTC

[jira] [Commented] (SPARK-9761) Inconsistent metadata handling with ALTER TABLE

    [ https://issues.apache.org/jira/browse/SPARK-9761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15408405#comment-15408405 ] 

Xin Wu commented on SPARK-9761:
-------------------------------

[~drwinters] Spark 2.0 has support for DDL commands, which opens the opportunity to implement ALTER TABLE ADD/CHANGE COLUMNS, which is not yet supported in the currently released Spark 2.0. Spark 2.1 will also bring some changes to the native DDL infrastructure. I think once that is settled, it will be easier to support this. I am looking into this as well.

> Inconsistent metadata handling with ALTER TABLE
> -----------------------------------------------
>
>                 Key: SPARK-9761
>                 URL: https://issues.apache.org/jira/browse/SPARK-9761
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.1
>         Environment: Ubuntu on AWS
>            Reporter: Simeon Simeonov
>              Labels: hive, sql
>
> Schema changes made with {{ALTER TABLE}} are not shown in {{DESCRIBE TABLE}}. The table in question was created with {{HiveContext.read.json()}}.
> Steps:
> # {{alter table dimension_components add columns (z string);}} succeeds.
> # {{describe dimension_components;}} does not show the new column, even after restarting spark-sql.
> # A second {{alter table dimension_components add columns (z string);}} fails with ERROR exec.DDLTask: org.apache.hadoop.hive.ql.metadata.HiveException: Duplicate column name: z
> Full spark-sql output [here|https://gist.github.com/ssimeonov/d9af4b8bb76b9d7befde].
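> The expected semantics can be illustrated outside Spark. This is a minimal sketch using Python's stdlib sqlite3 (not Spark or Hive): after {{ALTER TABLE ... ADD COLUMN}}, the new column should immediately appear in the table's reported schema, which is exactly what the steps above show failing for a JSON-backed Hive table.

```python
# Illustration only, using sqlite3 instead of Spark SQL: ALTER TABLE
# ADD COLUMN should be visible in the schema right away.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dimension_components (a TEXT, b TEXT)")
conn.execute("ALTER TABLE dimension_components ADD COLUMN z TEXT")

# PRAGMA table_info plays the role of DESCRIBE here.
cols = [row[1] for row in conn.execute("PRAGMA table_info(dimension_components)")]
print(cols)  # the new column 'z' is listed alongside a and b
conn.close()
```

> In the report above, step 2 corresponds to the {{PRAGMA table_info}} check: Spark's {{describe}} kept returning the old schema while the metastore had accepted the change, which is why the second {{alter table}} then failed with a duplicate-column error.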



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org