Posted to dev@toree.apache.org by Harmeet Singh <t-...@microsoft.com> on 2016/04/11 13:04:21 UTC

Help needed to specify configuration of spark-kernel at runtime

Hi,

I am running Toree using the command "dist/toree/bin/run.sh". This starts the spark-kernel in its default configuration. However, I want to specify the parameters for the spark-kernel myself, and run it with the following parameters:

--num-executors 25 --executor-cores 4 --executor-memory 10g --driver-memory 10g --conf spark.yarn.executor.memoryOverhead=2048 --conf spark.driver.maxResultSize=10g --conf spark.serializer=org.apache.spark.serializer.KryoSerializer

If possible, please help me.

Regards,
Harmeet

Re: Help needed to specify configuration of spark-kernel at runtime

Posted by Gino Bustelo <gi...@bustelos.com>.
lbustelo 08:51
@HS88 By default, Toree will create a SparkContext using a SparkConf that is
populated through command-line parameters passed via SPARK_CONF. Other forms
of startup-time configuration are covered at
http://spark.apache.org/docs/latest/submitting-applications.html.
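For example, launching with the flags you listed might look like this (a
sketch; it assumes run.sh forwards SPARK_CONF to spark-submit as described
above):

  SPARK_CONF="--num-executors 25 --executor-cores 4 --executor-memory 10g \
    --driver-memory 10g \
    --conf spark.yarn.executor.memoryOverhead=2048 \
    --conf spark.driver.maxResultSize=10g \
    --conf spark.serializer=org.apache.spark.serializer.KryoSerializer" \
  dist/toree/bin/run.sh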
If you wish to configure the SparkContext at runtime, you can tell Toree not
to create a context for you by passing the --nosparkcontext flag. At that
point, creating, configuring, and managing the SparkContext is up to you and
is done with code in a notebook cell.

chipsenkbeil 09:18
@HS88 Note that using --nosparkcontext means you will need to create the
SparkContext yourself, using kernel.createSparkContext(sparkConf) or
kernel.createSparkContext(master, appName).
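For example, a rough sketch of such a cell (the SparkConf keys mirror the
flags requested above; the app name is just a placeholder):

  // Build a SparkConf with the desired settings, then ask Toree to
  // create the context from it.
  import org.apache.spark.SparkConf

  val conf = new SparkConf()
    .setAppName("toree-example")                // placeholder name
    .set("spark.executor.instances", "25")      // --num-executors equivalent on YARN
    .set("spark.executor.cores", "4")
    .set("spark.executor.memory", "10g")
    .set("spark.yarn.executor.memoryOverhead", "2048")
    .set("spark.driver.maxResultSize", "10g")
    .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

  // Note: spark.driver.memory cannot usefully be set here, since the driver
  // JVM (the kernel) is already running; pass --driver-memory at launch.
  val sc = kernel.createSparkContext(conf)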
