Posted to users@zeppelin.apache.org by Rok Roskar <ro...@gmail.com> on 2015/10/23 12:58:08 UTC

setting memory usage for spark

I'm trying to control the amount of memory used by the Spark driver and
executors. In a normal setting I would specify these values via the
spark.driver.memory and spark.executor.memory configuration options;
however, setting these in the interpreter properties in Zeppelin doesn't
seem to do anything. Because Zeppelin also sets its own extra Java flags,
I couldn't simply set the -Xmx flag in spark.driver.extraJavaOptions. The
only alternative I could think of was to set the _JAVA_OPTIONS environment
variable, which did the trick, but that is not how I would want to change
the memory settings. What am I missing here? How should this be done?
Thanks,

Rok
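
For reference, the _JAVA_OPTIONS workaround described above amounts to
roughly the following in the shell that starts Zeppelin; the 4g value is
only an illustration:

    # _JAVA_OPTIONS is read by the JVM at startup, so every JVM that
    # Zeppelin spawns, including the Spark driver, picks up the heap size.
    export _JAVA_OPTIONS="-Xmx4g"
    bin/zeppelin-daemon.sh restart   # restart Zeppelin so the setting takes effect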

Re: setting memory usage for spark

Posted by Felix Cheung <fe...@hotmail.com>.
And if you set SPARK_HOME, Zeppelin will launch Spark with spark-submit, and you can customize it by setting SPARK_SUBMIT_OPTIONS in the environment. Check out spark-submit for the available options.
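
A minimal sketch of that setup in conf/zeppelin-env.sh, assuming a
hypothetical Spark install path; --driver-memory and --executor-memory are
standard spark-submit flags:

    # With SPARK_HOME set, Zeppelin delegates to spark-submit,
    # which honors the usual memory flags.
    export SPARK_HOME=/opt/spark   # hypothetical path to your Spark installation
    export SPARK_SUBMIT_OPTIONS="--driver-memory 4g --executor-memory 8g"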


Re: setting memory usage for spark

Posted by Jongyoul Lee <jo...@gmail.com>.
Hi,

Zeppelin runs Spark in two different ways. First, you can use an external
Spark cluster if you already have one; in this case, set SPARK_HOME in
conf/zeppelin-env.sh. Second, without SPARK_HOME, Zeppelin launches Spark
internally. If you use this mode, you can set these values in the
Interpreter tab in your browser, and you can also set ZEPPELIN_INT_JAVA_OPTS,
e.g. '-Dspark.driver.memory=4G'.
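
A minimal sketch for that second (internal) mode, presumably also in
conf/zeppelin-env.sh; the executor setting is an assumption that it is
passed the same way as the driver setting:

    # Without SPARK_HOME, Zeppelin launches Spark itself and forwards
    # these -D properties to it.
    export ZEPPELIN_INT_JAVA_OPTS="-Dspark.driver.memory=4G -Dspark.executor.memory=4G"  # executor value is an assumption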

Hope this helps.

JL.

On Sat, Oct 24, 2015 at 1:47 AM, David Salinas <da...@gmail.com>
wrote:

> Same problem here.
>
> David


-- 
이종열, Jongyoul Lee, 李宗烈
http://madeng.net

Re: setting memory usage for spark

Posted by David Salinas <da...@gmail.com>.
Same problem here.

David
