Posted to issues@spark.apache.org by "Henryk Cesnolovic (JIRA)" <ji...@apache.org> on 2019/04/02 06:44:00 UTC
[jira] [Comment Edited] (SPARK-27017) Creating orc table with special symbols in column name via spark.sql
[ https://issues.apache.org/jira/browse/SPARK-27017?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16807438#comment-16807438 ]
Henryk Cesnolovic edited comment on SPARK-27017 at 4/2/19 6:43 AM:
-------------------------------------------------------------------
Hello, I have checked this one. With Parquet it is resolved. However, for version 2.3.0.2.6.5.0-292 the problem still exists when creating an ORC table with special symbols in a column name. Do you know in which version this was resolved for ORC as well?
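For context, the error in question comes from a column-name check Spark runs before writing the table. The sketch below is a hypothetical pure-Python model of that check (the rejected character set is an assumption inferred from Parquet's field-name restriction and the error message quoted in this issue; it is not copied from Spark's source):

```python
# Hypothetical model of Spark's column-name validation for Parquet/ORC writes.
# ASSUMPTION: the rejected characters are " ,;{}()\n\t=" (based on Parquet's
# documented field-name restriction); Spark's actual check may differ by version.
INVALID_CHARS = set(" ,;{}()\n\t=")

def check_field_name(name: str) -> None:
    """Raise if the column name contains a character the writer rejects."""
    if any(ch in INVALID_CHARS for ch in name):
        raise ValueError(
            f'Column name "{name}" contains invalid character(s). '
            "Please use alias to rename it."
        )

check_field_name("safe_column")  # passes silently
try:
    # The spaces in this name are what trip the check, not the @ or | symbols.
    check_field_name("Column with speci@l symbo|s")
except ValueError as e:
    print(e)
```

Note that in the reported case it is the spaces, not the "@" or "|", that fall inside the assumed invalid set.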
was (Author: unxe):
Hello, I have checked this one. Still for version 2.3.0.2.6.5.0-292 the problem exists in creating orc table with special symbols in column name. With parquet it is resolved.
> Creating orc table with special symbols in column name via spark.sql
> --------------------------------------------------------------------
>
> Key: SPARK-27017
> URL: https://issues.apache.org/jira/browse/SPARK-27017
> Project: Spark
> Issue Type: Question
> Components: Spark Shell
> Affects Versions: 2.3.0
> Reporter: Henryk Cesnolovic
> Priority: Major
>
> The issue is creating an ORC table with special symbols in the column name in Spark with Hive support. Example:
> _spark.sql("Create table abc_orc (`Column with speci@l symbo|s` string) stored as orc")_
> throws org.apache.spark.sql.AnalysisException: Column name "Column with speci@l symbo|s" contains invalid character(s). Please use alias to rename it.
> It's interesting, because in Hive we can create such a table, and after that, in Spark, we can select data from it and the schema is resolved correctly.
> My question is: is this the correct behaviour of Spark, and if so, what is the reason for it?
>
>
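The error message itself points at the workaround: rename the offending columns via aliases before writing. A minimal sketch of such a renaming helper is below; the `sanitize` name, the underscore replacement scheme, and the character class are all illustrative assumptions, and in Spark one would apply it along the lines of `df.select([col(c).alias(sanitize(c)) for c in df.columns])` before writing the ORC table:

```python
import re

# Hypothetical helper for the "use alias to rename it" workaround:
# replace each character the writer is assumed to reject with an underscore.
# ASSUMPTION: the character class mirrors Parquet's field-name restriction.
def sanitize(name: str) -> str:
    return re.sub(r"[ ,;{}()\n\t=]", "_", name)

print(sanitize("Column with speci@l symbo|s"))  # Column_with_speci@l_symbo|s
```

Only the spaces are rewritten here; "@" and "|" pass through unchanged under the assumed character class.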
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org