Posted to issues@spark.apache.org by "Tien-Dung LE (JIRA)" <ji...@apache.org> on 2015/07/27 17:12:06 UTC

[jira] [Commented] (SPARK-9280) New HiveContext object unexpectedly loads configuration settings from history

    [ https://issues.apache.org/jira/browse/SPARK-9280?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14642840#comment-14642840 ] 

Tien-Dung LE commented on SPARK-9280:
-------------------------------------

Thanks [~pborck]. Detaching the Hive session state via org.apache.hadoop.hive.ql.session.SessionState.detachSession() helps to avoid the issue.
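A minimal sketch of the workaround in spark-shell, assuming the same session as the reproduction below; the detachSession() call is the only addition:

{code:title=Workaround sketch}
sqlContext.setConf("spark.sql.shuffle.partitions", "10")
sc.stop()

// Drop the thread-local Hive SessionState left behind by the old context,
// so the new HiveContext starts from a clean state.
org.apache.hadoop.hive.ql.session.SessionState.detachSession()

val sc2 = new org.apache.spark.SparkContext(new org.apache.spark.SparkConf())
val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
sqlContext2.setConf("foo", "foo")
sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
// now returns the default "30" instead of the stale "10"
{code}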

> New HiveContext object unexpectedly loads configuration settings from history 
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-9280
>                 URL: https://issues.apache.org/jira/browse/SPARK-9280
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.3.1, 1.4.1
>            Reporter: Tien-Dung LE
>
> In a Spark session, stopping the Spark context and creating a new Spark context and Hive context does not clean the Spark SQL configuration. More precisely, the new Hive context still keeps the configuration settings from the previous one. It would be great if someone could let us know how to avoid this situation.
> {code:title=New Hive context should not load configuration settings from history}
> sqlContext.setConf("spark.sql.shuffle.partitions", "10")
> sc.stop()
> val sparkConf2 = new org.apache.spark.SparkConf()
> val sc2 = new org.apache.spark.SparkContext(sparkConf2)
> val sqlContext2 = new org.apache.spark.sql.hive.HiveContext(sc2)
> sqlContext2.getConf("spark.sql.shuffle.partitions", "20")
> // returns the default "20", as expected
> sqlContext2.setConf("foo", "foo")
> sqlContext2.getConf("spark.sql.shuffle.partitions", "30")
> // expected the default "30" but got "10" from the previous session
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org