Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/08/05 22:53:20 UTC

[jira] [Resolved] (SPARK-16901) Hive settings in hive-site.xml may be overridden by Hive's default values

     [ https://issues.apache.org/jira/browse/SPARK-16901?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yin Huai resolved SPARK-16901.
------------------------------
       Resolution: Fixed
    Fix Version/s: 2.1.0
                   2.0.1

Issue resolved by pull request 14497
[https://github.com/apache/spark/pull/14497]

> Hive settings in hive-site.xml may be overridden by Hive's default values
> -------------------------------------------------------------------------
>
>                 Key: SPARK-16901
>                 URL: https://issues.apache.org/jira/browse/SPARK-16901
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>             Fix For: 2.0.1, 2.1.0
>
>
> When we create the HiveConf for the metastore client, we use a Hadoop Conf as the base, which may contain Hive settings from hive-site.xml (https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/internal/SharedState.scala#L49). However, HiveConf's initialize function basically ignores the base Hadoop Conf and always uses its own default values (i.e. settings with non-null default values) as the base (https://github.com/apache/hive/blob/release-1.2.1/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java#L2687). So, even if a user puts {{javax.jdo.option.ConnectionURL}} in hive-site.xml, that setting is not used and Hive falls back to its default, which is {{jdbc:derby:;databaseName=metastore_db;create=true}}.
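
For context, here is a minimal Scala sketch of the workaround pattern described above (it is not the exact change made in pull request 14497, and the helper name newHiveConf is illustrative only): after constructing the HiveConf from the base Hadoop Configuration, re-apply every entry from that Configuration so the values loaded from hive-site.xml take precedence over HiveConf's built-in defaults.

{code:scala}
import scala.collection.JavaConverters._

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.hive.conf.HiveConf

// Illustrative helper; name and placement are hypothetical, not the actual
// change applied in the Spark codebase.
def newHiveConf(hadoopConf: Configuration): HiveConf = {
  // HiveConf's initialize() starts from Hive's own defaults, so keys with
  // non-null default values (e.g. javax.jdo.option.ConnectionURL) can end up
  // shadowing the values that hive-site.xml contributed to hadoopConf.
  val hiveConf = new HiveConf(hadoopConf, classOf[HiveConf])
  // Re-apply every entry from the base Configuration so that user-provided
  // settings win over HiveConf's built-in defaults.
  hadoopConf.asScala.foreach { entry =>
    hiveConf.set(entry.getKey, entry.getValue)
  }
  hiveConf
}
{code}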



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org