Posted to users@zeppelin.apache.org by Jongyoul Lee <jo...@gmail.com> on 2015/07/05 15:44:34 UTC

Re: Spark Context time out on Yarn cluster

Hi,

My YARN cluster also has dynamic allocation configured, and I've tested
that setting too, but I'm not sure it works correctly. Have you already
tested dynamic allocation? If so, please share your zeppelin-env.sh and
interpreter settings.

Regards,
Jongyoul Lee

On Fri, Jun 19, 2015 at 8:45 AM, Sambit Tripathy (RBEI/EDS1) <
Sambit.Tripathy@in.bosch.com> wrote:

>  Hi,
>
> Recently the dynamic allocation feature of YARN was enabled on our
> cluster due to an increase in workload. At the same time I upgraded
> Zeppelin to work with Spark 1.3.1.
>
> Now the Spark context that is created in the notebook is short-lived:
> every time I run a command it throws an error saying the Spark context
> has been stopped.
>
> Do I have to provide some configuration in zeppelin-env.sh or the
> interpreter settings to work with YARN dynamic allocation?
>
>
>
> Regards,
> Sambit.
>
>



-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

RE: Spark Context time out on Yarn cluster

Posted by "Sambit Tripathy (RBEI/EDS1)" <Sa...@in.bosch.com>.
Hi,

Sorry for the late response. I got rid of the error after setting the fields below. I suspect it was a cluster-specific issue, nothing on the Zeppelin side.

Thanks for the reply.

spark.dynamicAllocation.initialExecutors                  45
spark.dynamicAllocation.maxExecutors                      60
spark.dynamicAllocation.minExecutors                      5
spark.dynamicAllocation.schedulerBacklogTimeout           600
spark.dynamicAllocation.sustainedSchedulerBacklogTimeout  600
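One way the properties above could be passed to Zeppelin's Spark interpreter is through SPARK_SUBMIT_OPTIONS in zeppelin-env.sh (a sketch, not confirmed by the thread; the same keys can also go in the interpreter settings UI or spark-defaults.conf). Note that Spark's dynamic allocation additionally requires the external shuffle service on the YARN NodeManagers; that line and SPARK_HOME are assumptions beyond what Sambit listed:

```shell
# zeppelin-env.sh -- sketch; values are the ones reported in the thread
export SPARK_HOME=/path/to/spark      # adjust to your install (assumption)
export MASTER=yarn-client

export SPARK_SUBMIT_OPTIONS="\
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.shuffle.service.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=45 \
  --conf spark.dynamicAllocation.maxExecutors=60 \
  --conf spark.dynamicAllocation.minExecutors=5 \
  --conf spark.dynamicAllocation.schedulerBacklogTimeout=600 \
  --conf spark.dynamicAllocation.sustainedSchedulerBacklogTimeout=600"
```

The long backlog timeouts (600 seconds rather than the default of a few seconds) make the scheduler far less eager to add executors, which fits the report that the problem looked cluster-specific rather than a Zeppelin bug.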


