Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/09/14 16:24:00 UTC
[jira] [Assigned] (SPARK-21997) Spark shows different results on Hive char/varchar columns on Parquet
[ https://issues.apache.org/jira/browse/SPARK-21997?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-21997:
------------------------------------
Assignee: Apache Spark
> Spark shows different results on Hive char/varchar columns on Parquet
> ---------------------------------------------------------------------
>
> Key: SPARK-21997
> URL: https://issues.apache.org/jira/browse/SPARK-21997
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.2, 2.1.1, 2.2.0
> Reporter: Dongjoon Hyun
> Assignee: Apache Spark
>
> SPARK-19459 resolves CHAR/VARCHAR issues in general, but Spark still shows different results depending on the SQL configuration *spark.sql.hive.convertMetastoreParquet*. We should fix this. Since the default of `spark.sql.hive.convertMetastoreParquet` is true, the result is wrong by default.
> For ORC, the default of `spark.sql.hive.convertMetastoreOrc` is false, so SPARK-19459 did not cover that path; the same discrepancy appears if the configuration is set to `true` (see the sketch after the reproduction below).
> {code}
> hive> CREATE TABLE t_char(a CHAR(10), b VARCHAR(10)) STORED AS parquet;
> hive> INSERT INTO TABLE t_char SELECT 'a', 'b' FROM (SELECT 1) t;
> scala> sql("SELECT * FROM t_char").show
> +---+---+
> |  a|  b|
> +---+---+
> |  a|  b|
> +---+---+
> scala> sql("set spark.sql.hive.convertMetastoreParquet=false")
> scala> sql("SELECT * FROM t_char").show
> +----------+---+
> |         a|  b|
> +----------+---+
> |a         |  b|
> +----------+---+
> scala> spark.version
> res3: String = 2.2.0
> {code}
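> For reference, the analogous ORC reproduction would be a sketch along the following lines; the table name `t_char_orc` is hypothetical and the output is omitted because it has not been verified here:
> {code}
> hive> CREATE TABLE t_char_orc(a CHAR(10), b VARCHAR(10)) STORED AS orc;
> hive> INSERT INTO TABLE t_char_orc SELECT 'a', 'b' FROM (SELECT 1) t;
> scala> sql("set spark.sql.hive.convertMetastoreOrc=true")
> scala> sql("SELECT * FROM t_char_orc").show
> {code}
> With `spark.sql.hive.convertMetastoreOrc=true`, Spark reads the ORC files with its native reader instead of the Hive SerDe, so the same loss of CHAR(10) padding as in the Parquet case above would be expected.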
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org