Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:03:08 UTC

[jira] [Updated] (SPARK-17302) Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf

     [ https://issues.apache.org/jira/browse/SPARK-17302?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-17302:
---------------------------------
    Labels: bulk-closed  (was: )

> Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17302
>                 URL: https://issues.apache.org/jira/browse/SPARK-17302
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Ryan Blue
>            Priority: Major
>              Labels: bulk-closed
>
> When configuration was reworked for 2.0 around the new SparkSession structure, Spark stopped using Hive's internal HiveConf for session state and now uses HiveSessionState and an associated SQLConf. Session options like hive.exec.compress.output and hive.exec.dynamic.partition.mode are now pulled from this SQLConf. That SQLConf doesn't include session properties from hive-site.xml (including hive.exec.compress.output), and it no longer contains Spark-specific overrides from spark-defaults.conf that used the spark.hadoop.hive... pattern.
> Setting these variables on the command line with --conf no longer works either, because settings must start with "spark.".
> Is there a recommended way to set Hive session properties?
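
For reference, a minimal sketch (Scala, assuming a hypothetical Spark 2.x application with Hive support) of the routes the report describes for setting these Hive session properties. The property names come from the issue text; whether each route actually reaches Hive session state is exactly what the report questions, so this is an illustration, not a confirmed workaround.

    // Minimal sketch, assuming a Spark 2.x build with Hive support on the classpath.
    // It shows the kinds of settings mentioned in the report; none of these routes
    // is asserted here to work, per the issue description above.
    import org.apache.spark.sql.SparkSession

    object HiveSessionConfSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("hive-session-conf-sketch")
          // Route 1: set the Hive property directly on the session builder.
          .config("hive.exec.dynamic.partition.mode", "nonstrict")
          // Route 2: the spark.hadoop.* prefix, mirroring the
          // spark-defaults.conf pattern mentioned in the report.
          .config("spark.hadoop.hive.exec.compress.output", "true")
          .enableHiveSupport()
          .getOrCreate()

        // Route 3: a SQL SET command, which writes into the session's SQLConf.
        spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

        // Inspect what the session actually reports for the property.
        spark.sql("SET hive.exec.dynamic.partition.mode").show(truncate = false)

        spark.stop()
      }
    }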



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
