Posted to issues@spark.apache.org by "Sean Zhong (JIRA)" <ji...@apache.org> on 2016/09/05 21:32:21 UTC

[jira] [Commented] (SPARK-17302) Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf

    [ https://issues.apache.org/jira/browse/SPARK-17302?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15465818#comment-15465818 ] 

Sean Zhong commented on SPARK-17302:
------------------------------------

[~rdblue] Can you post a reproducer code sample that describes your problem? In particular, what behavior do you expect, and what are you observing instead?

I am not sure whether {{spark.conf.set("key", "value")}} is what you expect.
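For reference, a minimal sketch of what I have in mind, e.g. in spark-shell where a Hive-enabled {{SparkSession}} named {{spark}} is predefined; the property name here is only an example:

{code:scala}
// Set a Hive session property at runtime through the session's RuntimeConfig;
// in 2.0 this writes into the session-scoped SQLConf.
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")

// Read the value back to confirm the session picked it up.
println(spark.conf.get("hive.exec.dynamic.partition.mode"))
{code}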

> Cannot set non-Spark SQL session variables in hive-site.xml, spark-defaults.conf, or using --conf
> -------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-17302
>                 URL: https://issues.apache.org/jira/browse/SPARK-17302
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Ryan Blue
>
> When configuration was reworked for 2.0 around the new SparkSession structure, Spark stopped using Hive's internal HiveConf for session state and now uses HiveSessionState with an associated SQLConf. Session options like hive.exec.compress.output and hive.exec.dynamic.partition.mode are now pulled from this SQLConf. That SQLConf doesn't include session properties from hive-site.xml (including hive.exec.compress.output), and it no longer contains Spark-specific overrides from spark-defaults.conf that used the spark.hadoop.hive... pattern.
> Also, setting these variables on the command line no longer works, because settings passed with --conf must start with "spark.".
> Is there a recommended way to set Hive session properties?
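(For illustration of the {{spark.hadoop.*}} pattern referenced above: a hedged sketch of how such an override is written when building a session programmatically, assuming Spark 2.0 with Hive support on the classpath. The property value is an example only; per the report, in 2.0 this reaches the Hadoop configuration but is no longer reflected in the session's SQLConf.)

{code:scala}
import org.apache.spark.sql.SparkSession

// Forward a Hive property through the spark.hadoop. prefix.
// The command-line equivalent would be:
//   spark-submit --conf spark.hadoop.hive.exec.compress.output=true ...
val spark = SparkSession.builder()
  .appName("hive-conf-example")
  .enableHiveSupport()
  .config("spark.hadoop.hive.exec.compress.output", "true")
  .getOrCreate()
{code}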


