Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:37:38 UTC

[jira] [Resolved] (SPARK-16999) Convert Keyword `path` of Data Source Tables to Lower Case

     [ https://issues.apache.org/jira/browse/SPARK-16999?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-16999.
----------------------------------
    Resolution: Incomplete

> Convert Keyword `path` of Data Source Tables to Lower Case
> ----------------------------------------------------------
>
>                 Key: SPARK-16999
>                 URL: https://issues.apache.org/jira/browse/SPARK-16999
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Xiao Li
>            Priority: Major
>              Labels: bulk-closed
>
> The `path` option points to the location of a data source table. Data source option keywords are supposed to be case insensitive, but we do not follow this rule consistently. Below is a summary of the current case-sensitivity handling:
> **Case sensitive**
> - AlterTableSetLocationCommand
> - AlterTableRenameCommand
> - TruncateTableCommand
> - newHadoopConfWithOptions in SessionState
> **Case insensitive**
> - ShowCreateTableCommand
> - CreateDataSourceTableCommand
> - CreateDataSourceTableAsSelectCommand
> - getBatch in FileStreamSource
> - write in DataSource
> - inferFileFormatSchema in DataSource
> - sourceSchema in DataSource
> - createSource in DataSource
> - createSink in DataSource
> - resolveRelation in DataSource
> Accounting for the case sensitivity of `path` in every current and future implementation is not maintainable. We should convert it to lower case to simplify the handling.
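
A minimal Scala sketch of the kind of normalization proposed here, under the assumption that the options map is normalized once before it reaches the individual commands; `PathOptionNormalizer` and `normalizePathKey` are hypothetical names for illustration, not part of Spark:

    // Hypothetical sketch (not Spark's implementation): lower-case only the
    // `path` keyword so every command can look it up with a single spelling.
    object PathOptionNormalizer {
      def normalizePathKey(options: Map[String, String]): Map[String, String] =
        options.map { case (key, value) =>
          if (key.equalsIgnoreCase("path")) "path" -> value else key -> value
        }
    }

    // Example: a case-sensitive lookup of "path" misses "PATH" before
    // normalization and finds it afterwards.
    val options = Map("PATH" -> "/data/events", "header" -> "true")
    options.get("path")                                          // None
    PathOptionNormalizer.normalizePathKey(options).get("path")   // Some("/data/events")

With a single normalization point like this, the case-sensitive and case-insensitive code paths listed above would all see the same canonical `path` key.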



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org