Posted to issues@spark.apache.org by "angerszhu (Jira)" <ji...@apache.org> on 2020/09/08 07:32:00 UTC

[jira] [Updated] (SPARK-32818) Make hive metastore convert config can be changed session level

     [ https://issues.apache.org/jira/browse/SPARK-32818?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

angerszhu updated SPARK-32818:
------------------------------
    Description: 
There are cases where we have a historical Hive table that originally used the textfile serde and was later changed to the orc serde. During conversion, Spark uses the table-level serde to scan all partitions, so if some old partitions have a different file format the conversion fails, and we have to disable metastore conversion and restart the Spark program.

This is bad for an ad-hoc engine such as a long-running Spark Thrift Server, so I think it is necessary to make these two configs changeable at the session level.
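As a sketch of the intended usage: with session-level support, a user hitting the conversion failure could turn conversion off for just their own session instead of restarting the server. The config names below are assumptions based on context (the issue does not name the two configs explicitly; `spark.sql.hive.convertMetastoreOrc` and `spark.sql.hive.convertMetastoreParquet` are the existing Hive-conversion configs in Spark):

```sql
-- Hypothetical per-session override (today these take effect only at startup):
SET spark.sql.hive.convertMetastoreOrc=false;
-- Query the mixed-format partitioned table via the Hive serde path:
SELECT * FROM history_table WHERE dt = '2020-09-01';
```

Other sessions on the same Thrift Server would keep the default conversion behavior.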

> Make hive metastore convert config  can be changed session level
> ----------------------------------------------------------------
>
>                 Key: SPARK-32818
>                 URL: https://issues.apache.org/jira/browse/SPARK-32818
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: angerszhu
>            Priority: Major
>
> There are cases where we have a historical Hive table that originally used the textfile serde and was later changed to the orc serde. During conversion, Spark uses the table-level serde to scan all partitions, so if some old partitions have a different file format the conversion fails, and we have to disable metastore conversion and restart the Spark program.
> This is bad for an ad-hoc engine such as a long-running Spark Thrift Server, so I think it is necessary to make these two configs changeable at the session level.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org