Posted to issues@spark.apache.org by "Henryk Cesnolovic (JIRA)" <ji...@apache.org> on 2019/03/01 06:15:06 UTC

[jira] [Created] (SPARK-27017) Creating orc table with special symbols in column name via spark.sql

Henryk Cesnolovic created SPARK-27017:
-----------------------------------------

             Summary: Creating orc table with special symbols in column name via spark.sql
                 Key: SPARK-27017
                 URL: https://issues.apache.org/jira/browse/SPARK-27017
             Project: Spark
          Issue Type: Question
          Components: Spark Shell
    Affects Versions: 2.3.0
            Reporter: Henryk Cesnolovic


The issue is creating an ORC table with special symbols in a column name in Spark with Hive support. Example:

spark.sql("Create table abc_orc (`Column with speci@l symbo|s` string) stored as orc")

throws org.apache.spark.sql.AnalysisException: Column name "Column with speci@l symbo|s" contains invalid character(s). Please use alias to rename it.

It's interesting because in Hive we can create such a table, and afterwards in Spark we can select data from that table and it resolves the schema correctly.

My question is: is this the correct behaviour of Spark, and if so, what is the reason for it?
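For illustration only: the AnalysisException above is raised when Spark validates column names before writing an ORC schema. The sketch below is a hypothetical Python stand-in for that kind of check (it is not Spark's actual code, and the identifier pattern is an assumption), showing why "Column with speci@l symbo|s" would be rejected while an aliased name passes:

```python
import re

# Hypothetical validator mimicking the kind of column-name check that
# produces the error above. The allowed pattern here is an assumption,
# not the rule Spark/ORC actually applies.
VALID_ORC_NAME = re.compile(r"^[A-Za-z_][A-Za-z0-9_]*$")

def check_field_name(name: str) -> None:
    """Raise if the column name contains characters the writer rejects."""
    if not VALID_ORC_NAME.match(name):
        raise ValueError(
            f'Column name "{name}" contains invalid character(s). '
            "Please use alias to rename it."
        )

check_field_name("clean_name")  # accepted: plain identifier
try:
    check_field_name("Column with speci@l symbo|s")
except ValueError as e:
    print(e)  # rejected, mirroring the AnalysisException message
```

The suggested workaround in the error message follows the same idea: select the offending column with an alias (e.g. `` `Column with speci@l symbo|s` AS clean_name ``) so the written schema only contains plain identifiers.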

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org