Posted to user@spark.apache.org by Chetan Khatri <ch...@gmail.com> on 2017/07/28 10:19:12 UTC

Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Hey Dev/User,

I am working with Spark 2.0.1 with dynamic partitioning into Hive and
am facing the issue below:

org.apache.hadoop.hive.ql.metadata.HiveException:
Number of dynamic partitions created is 1344, which is more than 1000.
To solve this try to set hive.exec.max.dynamic.partitions to at least 1344.

I tried the options below, but they failed:

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")

Please help with an alternate workaround!

Thanks
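[Editor's note: a hedged sketch of one workaround that avoids the runtime SET path entirely, assuming a spark-submit deployment. Spark copies spark.hadoop.* properties into the Hadoop/Hive configuration it hands to the Hive client, so the limit is in place before any insert runs. The entry-point class and jar name below are hypothetical placeholders.]

```shell
# Sketch: raise the Hive dynamic-partition limits at submit time, so the
# properties are present before the Hive client is initialized.
# spark.hadoop.* entries are forwarded into the Hadoop/Hive configuration.
spark-submit \
  --conf spark.hadoop.hive.exec.max.dynamic.partitions=2000 \
  --conf spark.hadoop.hive.exec.max.dynamic.partitions.pernode=2000 \
  --class com.example.PartitionedInsertJob \
  partitioned-insert-job.jar
```

The same two properties can alternatively be set in the hive-site.xml on the driver's classpath.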

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Posted by Chetan Khatri <ch...@gmail.com>.
I think it will be the same, but let me try that.

FYR - https://issues.apache.org/jira/browse/SPARK-19881

On Fri, Jul 28, 2017 at 4:44 PM, ayan guha <gu...@gmail.com> wrote:

> Try running spark.sql("set yourconf=val")
>
> On Fri, 28 Jul 2017 at 8:51 pm, Chetan Khatri <ch...@gmail.com>
> wrote:
>
>> Jörn, both are the same.
>>
>> On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke <jo...@gmail.com>
>> wrote:
>>
>>> Try sparksession.conf().set
>>>
>>> On 28. Jul 2017, at 12:19, Chetan Khatri <ch...@gmail.com>
>>> wrote:
>>>
>>> Hey Dev/User,
>>>
>>> I am working with Spark 2.0.1 with dynamic partitioning into Hive and
>>> am facing the issue below:
>>>
>>> org.apache.hadoop.hive.ql.metadata.HiveException:
>>> Number of dynamic partitions created is 1344, which is more than 1000.
>>> To solve this try to set hive.exec.max.dynamic.partitions to at least
>>> 1344.
>>>
>>> I tried the options below, but they failed:
>>>
>>> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
>>>
>>> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
>>>
>>> Please help with an alternate workaround!
>>>
>>> Thanks
>>>
>>>
>> --
> Best Regards,
> Ayan Guha
>
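[Editor's note: the discussion on SPARK-19881 concerns Hive properties not taking effect when set after the session exists. A hedged sketch, assuming Spark 2.x: pass the property at build time, before enableHiveSupport() instantiates the Hive client, instead of calling setConf on the live session. The app name is a hypothetical placeholder.]

```scala
// Sketch (Spark 2.x): supply the Hive property when the session is built,
// not after getOrCreate(), so the Hive client sees it from the start.
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("dynamic-partition-insert")                 // hypothetical
  .config("hive.exec.max.dynamic.partitions", "2000")
  .config("hive.exec.max.dynamic.partitions.pernode", "2000")
  .enableHiveSupport()
  .getOrCreate()

// Dynamic partition inserts typically also require nonstrict mode:
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
```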

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Posted by ayan guha <gu...@gmail.com>.
Try running spark.sql("set yourconf=val")

On Fri, 28 Jul 2017 at 8:51 pm, Chetan Khatri <ch...@gmail.com>
wrote:

> Jörn, both are the same.
>
> On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke <jo...@gmail.com> wrote:
>
>> Try sparksession.conf().set
>>
>> On 28. Jul 2017, at 12:19, Chetan Khatri <ch...@gmail.com>
>> wrote:
>>
>> Hey Dev/User,
>>
>> I am working with Spark 2.0.1 with dynamic partitioning into Hive and
>> am facing the issue below:
>>
>> org.apache.hadoop.hive.ql.metadata.HiveException:
>> Number of dynamic partitions created is 1344, which is more than 1000.
>> To solve this try to set hive.exec.max.dynamic.partitions to at least
>> 1344.
>>
>> I tried the options below, but they failed:
>>
>> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
>>
>> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
>>
>> Please help with an alternate workaround!
>>
>> Thanks
>>
>>
> --
Best Regards,
Ayan Guha

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Posted by Chetan Khatri <ch...@gmail.com>.
Jörn, both are the same.

On Fri, Jul 28, 2017 at 4:18 PM, Jörn Franke <jo...@gmail.com> wrote:

> Try sparksession.conf().set
>
> On 28. Jul 2017, at 12:19, Chetan Khatri <ch...@gmail.com>
> wrote:
>
> Hey Dev/User,
>
> I am working with Spark 2.0.1 with dynamic partitioning into Hive and
> am facing the issue below:
>
> org.apache.hadoop.hive.ql.metadata.HiveException:
> Number of dynamic partitions created is 1344, which is more than 1000.
> To solve this try to set hive.exec.max.dynamic.partitions to at least
> 1344.
>
> I tried the options below, but they failed:
>
> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
>
> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
>
> Please help with an alternate workaround!
>
> Thanks
>
>

Re: Support Dynamic Partition Inserts params with SET command in Spark 2.0.1

Posted by Jörn Franke <jo...@gmail.com>.
Try sparksession.conf().set

> On 28. Jul 2017, at 12:19, Chetan Khatri <ch...@gmail.com> wrote:
> 
> Hey Dev/User,
> 
> I am working with Spark 2.0.1 with dynamic partitioning into Hive and am facing the issue below:
> 
> org.apache.hadoop.hive.ql.metadata.HiveException:
> Number of dynamic partitions created is 1344, which is more than 1000.
> To solve this try to set hive.exec.max.dynamic.partitions to at least 1344.
> 
> I tried the options below, but they failed:
> 
> val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
> 
> spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000")
> 
> Please help with an alternate workaround!
> 
> Thanks
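[Editor's note: "sparksession.conf().set" is the Java spelling; a sketch of the equivalent calls in the Scala API, assuming a Spark 2.x session named spark. All three write to the same session SQLConf, which is why Chetan reports no difference between them; whether the value then reaches the Hive-side check is the open question in this thread.]

```scala
// Three interchangeable ways to set a runtime SQL conf; each one
// updates the session's SQLConf.
spark.conf.set("hive.exec.max.dynamic.partitions", "2000")           // Scala API
spark.sqlContext.setConf("hive.exec.max.dynamic.partitions", "2000") // legacy alias
spark.sql("SET hive.exec.max.dynamic.partitions=2000")               // SQL form
```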
