Posted to issues@spark.apache.org by "Ryan Blue (JIRA)" <ji...@apache.org> on 2016/08/29 22:20:22 UTC

[jira] [Created] (SPARK-17302) Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf

Ryan Blue created SPARK-17302:
---------------------------------

             Summary: Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf
                 Key: SPARK-17302
                 URL: https://issues.apache.org/jira/browse/SPARK-17302
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.0.0
            Reporter: Ryan Blue


When configuration was reworked for 2.0 around the new SparkSession structure, Spark stopped using Hive's internal HiveConf for session state and now uses HiveSessionState and an associated SQLConf. Session options like hive.exec.compress.output and hive.exec.dynamic.partition.mode are now pulled from this SQLConf, which doesn't include session properties from hive-site.xml (including hive.exec.compress.output) and no longer contains Spark-specific overrides from spark-defaults.conf that used the spark.hadoop.hive... pattern.

Also, setting these variables on the command line no longer works, because --conf settings must start with "spark.".
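The failure modes described above can be sketched as follows. The exact warning text and the SET workaround at the end are assumptions based on spark-submit's general behavior in 2.0, not part of this report:

```shell
# 1) spark-defaults.conf: the spark.hadoop. prefix used to forward Hive
#    properties into the session's HiveConf, e.g.:
#      spark.hadoop.hive.exec.compress.output  true

# 2) Command line: --conf rejects keys that don't start with "spark."
#    (spark-submit logs a warning and drops the property):
spark-submit --conf hive.exec.dynamic.partition.mode=nonstrict app.jar
# Warning: Ignoring non-spark config property: hive.exec.dynamic.partition.mode

# 3) The spark.hadoop.-prefixed form is accepted by spark-submit, but per
#    this report it no longer reaches the SQLConf used for session state:
spark-submit --conf spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict app.jar

# One runtime workaround (assumed, not suggested in this report) is to set
# the property through SQL after the session starts:
#   spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
```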

Is there a recommended way to set Hive session properties?



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
