Posted to issues@spark.apache.org by "liuxian (JIRA)" <ji...@apache.org> on 2019/03/22 03:22:00 UTC
[jira] [Updated] (SPARK-27238) In the same APP, maybe some Hive
parquet tables can't use the built-in Parquet reader and writer
[ https://issues.apache.org/jira/browse/SPARK-27238?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
liuxian updated SPARK-27238:
----------------------------
Description:
In the same application, TableA and TableB are both Hive parquet tables, but TableA can't use the built-in Parquet reader and writer.
In this situation, _spark.sql.hive.convertMetastoreParquet_ can't control this at per-table granularity, so I think we can add a fine-grained configuration to handle this case.
was:
In the same APP, TableA and TableB are both hive parquet tables, but TableA can't use the built-in Parquet reader and writer.
In this situation, spark.sql.hive.convertMetastoreParquet can't control this well, so I think we can add a fine-grained configuration to handle this case
> In the same APP, maybe some Hive parquet tables can't use the built-in Parquet reader and writer
> ------------------------------------------------------------------------------------------------
>
> Key: SPARK-27238
> URL: https://issues.apache.org/jira/browse/SPARK-27238
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: liuxian
> Priority: Minor
>
> In the same application, TableA and TableB are both Hive parquet tables, but TableA can't use the built-in Parquet reader and writer.
> In this situation, _spark.sql.hive.convertMetastoreParquet_ can't control this at per-table granularity, so I think we can add a fine-grained configuration to handle this case.
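To make the granularity problem concrete, here is a minimal Scala sketch, assuming an active SparkSession named `spark`. The first setting is the real Spark SQL config discussed in this issue; the per-table override in the trailing comment is purely hypothetical and does not exist in Spark — it only illustrates the kind of fine-grained configuration being proposed:

```scala
// Real, existing config: a single session-wide switch. Setting it to false
// makes EVERY Hive parquet table fall back to the Hive SerDe reader/writer,
// even if only TableA needs that fallback; TableB loses the built-in
// Parquet reader and writer too.
spark.conf.set("spark.sql.hive.convertMetastoreParquet", "false")

// Hypothetical fine-grained alternative (this config name is illustrative
// only, NOT an existing Spark config): exclude just TableA from conversion
// while TableB keeps using the built-in Parquet reader and writer.
// spark.conf.set("spark.sql.hive.convertMetastoreParquet.exclusions", "default.tableA")
```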
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org