Posted to user@spark.apache.org by Divya Gehlot <di...@gmail.com> on 2016/02/15 03:36:15 UTC

IllegalStateException : When use --executor-cores option in YARN

Hi,

I am starting spark-shell with the following options:
spark-shell --properties-file  /TestDivya/Spark/Oracle.properties --jars
/usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar --driver-class-path
/usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar --packages
com.databricks:spark-csv_2.10:1.1.0  --master yarn-client --num-executors
10 --executor-cores 4 -i /TestDivya/Spark/Test.scala

I have a few queries:
1. Error:
java.lang.IllegalStateException: SparkContext has been shutdown

If I remove --executor-cores 4, it runs smoothly.

2. With --num-executors 10, my Spark job takes more time.
 May I know why?

3. What's the difference between spark-shell and spark-submit?

I am new to Spark, so apologies for such naive questions.
I am just trying to figure out how to tune Spark jobs to improve
performance on a Hadoop cluster on EC2.
If anybody has hands-on experience, please help me.


Thanks,
Divya

Re: IllegalStateException : When use --executor-cores option in YARN

Posted by Saisai Shao <sa...@gmail.com>.
Hi Divya,

Would you please provide the full stack trace of the exception? From my
understanding --executor-cores should work; we could diagnose the problem
better with the full stack trace.
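One possible cause worth checking (this is only a guess, since the stack trace is not shown) is requesting more vcores per executor than YARN allows for a single container. That limit is controlled by yarn.scheduler.maximum-allocation-vcores in yarn-site.xml; requests above it can prevent executors from launching, which can end with the SparkContext shutting down. A sketch of the relevant property, where the value 4 is just an example:

```xml
<!-- yarn-site.xml: the maximum number of vcores YARN will grant
     to a single container; --executor-cores must not exceed this -->
<property>
  <name>yarn.scheduler.maximum-allocation-vcores</name>
  <value>4</value>
</property>
```

If your cluster caps this below 4, either raise the limit or lower --executor-cores.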

Performance depends on many different aspects; I'd recommend checking the
Spark web UI to better understand the application's runtime behavior.

Spark shell is a command-line Spark application for interactively executing
Spark jobs, whereas spark-submit is used to submit your own Spark
applications (
http://spark.apache.org/docs/latest/submitting-applications.html).
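For comparison, the same configuration could be submitted non-interactively with spark-submit. This is only a sketch reusing the flags and paths from your spark-shell command; the jar name TestApp.jar and main class com.example.TestApp are placeholders, since your original command runs a script via -i rather than a packaged application:

```shell
# Sketch: non-interactive equivalent of the spark-shell invocation above.
# TestApp.jar and com.example.TestApp are hypothetical placeholders.
spark-submit \
  --class com.example.TestApp \
  --master yarn-client \
  --num-executors 10 \
  --executor-cores 4 \
  --properties-file /TestDivya/Spark/Oracle.properties \
  --jars /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --driver-class-path /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar \
  --packages com.databricks:spark-csv_2.10:1.1.0 \
  TestApp.jar
```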

Thanks
Saisai

On Mon, Feb 15, 2016 at 10:36 AM, Divya Gehlot <di...@gmail.com>
wrote:

> Hi,
>
> I am starting spark-shell with the following options:
> spark-shell --properties-file  /TestDivya/Spark/Oracle.properties --jars
> /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar --driver-class-path
> /usr/hdp/2.3.4.0-3485/spark/lib/ojdbc6.jar --packages
> com.databricks:spark-csv_2.10:1.1.0  --master yarn-client --num-executors
> 10 --executor-cores 4 -i /TestDivya/Spark/Test.scala
>
> I have a few queries:
> 1. Error:
> java.lang.IllegalStateException: SparkContext has been shutdown
>
> If I remove --executor-cores 4, it runs smoothly.
>
> 2. With --num-executors 10, my Spark job takes more time.
>  May I know why?
>
> 3. What's the difference between spark-shell and spark-submit?
>
> I am new to Spark, so apologies for such naive questions.
> I am just trying to figure out how to tune Spark jobs to improve
> performance on a Hadoop cluster on EC2.
> If anybody has hands-on experience, please help me.
>
>
> Thanks,
> Divya
>