Posted to issues@spark.apache.org by "Song Jun (JIRA)" <ji...@apache.org> on 2017/01/16 15:02:26 UTC

[jira] [Updated] (SPARK-19246) CatalogTable's partitionSchema should check that each column name in partitionColumnNames matches exactly one field in schema, and keep the same order as partitionColumnNames

     [ https://issues.apache.org/jira/browse/SPARK-19246?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Song Jun updated SPARK-19246:
-----------------------------
    Description: 
Getting CatalogTable's partitionSchema should check that each column name in partitionColumnNames matches exactly one field in schema; if not, an exception should be thrown.

In addition, CatalogTable's partitionSchema should keep the same order as partitionColumnNames.
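A minimal sketch of what such a check could look like (Scala). CatalogTableSketch and the exception message are illustrative assumptions for this sketch, not the actual Spark implementation of CatalogTable:

    import org.apache.spark.sql.types.{StructField, StructType}

    // Illustrative stand-in for CatalogTable; not the real Spark class.
    case class CatalogTableSketch(
        schema: StructType,
        partitionColumnNames: Seq[String]) {

      // Build partitionSchema in the order given by partitionColumnNames,
      // failing if a name matches zero or more than one field in schema.
      def partitionSchema: StructType = {
        val partitionFields = partitionColumnNames.map { name =>
          schema.fields.filter(_.name == name) match {
            case Array(field) => field
            case matches =>
              throw new IllegalArgumentException(
                s"Partition column '$name' matched ${matches.length} field(s) in the table schema; " +
                  "expected exactly one.")
          }
        }
        StructType(partitionFields)
      }
    }

For example, with a schema of fields (a, b, c) and partitionColumnNames Seq("c", "a"), this would return a StructType containing c then a, and would throw if partitionColumnNames referenced a name not present (or ambiguously present) in the schema.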

  was:Getting CatalogTable's partitionSchema should check that each column name in partitionColumnNames matches exactly one field in schema; if not, an exception should be thrown.


> CatalogTable's partitionSchema should check that each column name in partitionColumnNames matches exactly one field in schema, and keep the same order as partitionColumnNames
> ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19246
>                 URL: https://issues.apache.org/jira/browse/SPARK-19246
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Song Jun
>
> Getting CatalogTable's partitionSchema should check that each column name in partitionColumnNames matches exactly one field in schema; if not, an exception should be thrown.
> In addition, CatalogTable's partitionSchema should keep the same order as partitionColumnNames.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org