Posted to user@spark.apache.org by SRK <sw...@gmail.com> on 2017/02/27 15:30:35 UTC
How to set hive configs in Spark 2.1?
Hi,
How do I set Hive configurations in Spark 2.1? I have the following working in
1.6. How do I set these Hive-related configs using the new SparkSession?
sqlContext.sql(s"use ${HIVE_DB_NAME}")
sqlContext.setConf("hive.exec.dynamic.partition", "true")
sqlContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")
sqlContext.setConf("hive.exec.max.dynamic.partitions.pernode", "100000")
sqlContext.setConf("hive.exec.max.dynamic.partitions", "100000")
sqlContext.setConf("hive.scratch.dir.permission", "777")
sqlContext.setConf("spark.sql.orc.filterPushdown", "true")
sqlContext.setConf("spark.sql.shuffle.partitions", "2000")
sqlContext.setConf("hive.default.fileformat", "Orc")
sqlContext.setConf("hive.exec.orc.memory.pool", "1.0")
sqlContext.setConf("hive.optimize.sort.dynamic.partition", "true")
sqlContext.setConf("hive.exec.reducers.max", "2000")
sqlContext.sql("set hive.default.fileformat=Orc")
sqlContext.sql("set hive.enforce.bucketing=true")
sqlContext.sql("set hive.enforce.sorting=true")
sqlContext.sql("set hive.auto.convert.join=true")
sqlContext.sql("set hive.optimize.bucketmapjoin=true")
sqlContext.sql("set hive.optimize.insert.dest.volume=true")
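[Editor's note: in Spark 2.x the SQLContext/HiveContext pair is replaced by SparkSession. A minimal sketch of the equivalent setup, assuming Hive support is on the classpath and that HIVE_DB_NAME is defined as in the code above:

```scala
import org.apache.spark.sql.SparkSession

// Build a session with Hive support; this replaces HiveContext/SQLContext from 1.6.
val spark = SparkSession.builder()
  .appName("hive-config-example")
  .enableHiveSupport()
  .getOrCreate()

// Runtime options go through spark.conf.set (String, Boolean, and Long overloads exist).
spark.conf.set("spark.sql.shuffle.partitions", 2000)
spark.conf.set("spark.sql.orc.filterPushdown", true)

// Hive settings can still be issued as SQL statements, as before.
spark.sql(s"USE ${HIVE_DB_NAME}")
spark.sql("SET hive.exec.dynamic.partition=true")
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
```

This is an illustrative sketch, not verified against a particular Hive metastore.]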
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-hive-configs-in-Spark-2-1-tp28429.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org
Re: How to set hive configs in Spark 2.1?
Posted by swetha kasireddy <sw...@gmail.com>.
Would Hive configurations like the following also work with this approach?
sqlContext.setConf("hive.default.fileformat", "Orc")
sqlContext.setConf("hive.exec.orc.memory.pool", "1.0")
sqlContext.setConf("hive.optimize.sort.dynamic.partition", "true")
sqlContext.setConf("hive.exec.reducers.max", "2000")
On Mon, Feb 27, 2017 at 9:26 AM, neil90 <ne...@icloud.com> wrote:
> All you need to do is -
>
> spark.conf.set("spark.sql.shuffle.partitions", 2000)
> spark.conf.set("spark.sql.orc.filterPushdown", true)
> ...etc
Re: How to set hive configs in Spark 2.1?
Posted by neil90 <ne...@icloud.com>.
All you need to do is -
spark.conf.set("spark.sql.shuffle.partitions", 2000)
spark.conf.set("spark.sql.orc.filterPushdown", true)
...etc
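[Editor's note: to answer the follow-up question, Hive keys can also be supplied when the session is built, via the builder's .config(...) calls, or at runtime through spark.conf.set and SET statements. A sketch under those assumptions (the specific keys shown are taken from the thread; behavior depends on the Hive version backing the metastore):

```scala
import org.apache.spark.sql.SparkSession

// Hive keys supplied at build time via .config(...) ...
val spark = SparkSession.builder()
  .enableHiveSupport()
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .getOrCreate()

// ... or set at runtime on the live session.
spark.conf.set("hive.exec.reducers.max", "2000")
spark.sql("SET hive.optimize.sort.dynamic.partition=true")
```

This is an untested illustration of the API shape, not a verified configuration.]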
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-set-hive-configs-in-Spark-2-1-tp28429p28431.html