Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/07/14 09:48:20 UTC

[jira] [Commented] (SPARK-16543) Rename the columns of `SHOW PARTITION/COLUMNS` commands

    [ https://issues.apache.org/jira/browse/SPARK-16543?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15376685#comment-15376685 ] 

Apache Spark commented on SPARK-16543:
--------------------------------------

User 'dongjoon-hyun' has created a pull request for this issue:
https://github.com/apache/spark/pull/14199

> Rename the columns of `SHOW PARTITION/COLUMNS` commands
> -------------------------------------------------------
>
>                 Key: SPARK-16543
>                 URL: https://issues.apache.org/jira/browse/SPARK-16543
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Dongjoon Hyun
>            Priority: Minor
>
> This issue changes the names of the columns returned by the `SHOW PARTITIONS` and `SHOW COLUMNS` commands. Currently, both commands use `result` as the column name.
> **Comparison**
> ||Command||Spark(Before)||Spark(After)||Hive||
> |SHOW PARTITIONS|result|partition|partition|
> |SHOW COLUMNS|result|col_name|field|
> Note that Spark/Hive show `col_name` in `DESC TABLES`. So, after this issue, Spark uses `col_name` consistently across commands.
> **Before**
> {code}
> scala> sql("show partitions p").show()
> +------+
> |result|
> +------+
> |   b=2|
> +------+
> scala> sql("show columns in p").show()
> +------+
> |result|
> +------+
> |     a|
> |     b|
> +------+
> {code}
> **After**
> {code}
> scala> sql("show partitions p").show
> +---------+
> |partition|
> +---------+
> |      b=2|
> +---------+
> scala> sql("show columns in p").show
> +--------+
> |col_name|
> +--------+
> |       a|
> |       b|
> +--------+
> {code}
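> For example (a minimal sketch reusing the example table `p` above), programmatic callers can then read the output through the stable column name instead of the generic `result`:
> {code}
> scala> // Collect partition specs via the renamed output column
> scala> val parts = sql("show partitions p").select("partition").collect().map(_.getString(0))
> {code}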


