Posted to user@spark.apache.org by Samya <sa...@amadeus.com> on 2015/08/26 10:47:42 UTC

Relation between threads and executor core

Hi All,

A few basic queries:
1. Is there a way we can control the number of threads per executor core?
2. Does the “executor-cores” parameter also have a say in deciding how many
threads are run?

Regards,
Sam



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Relation-between-threads-and-executor-core-tp24456.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Relation between threads and executor core

Posted by Jem Tucker <je...@gmail.com>.
Sam,

This may be of interest; as far as I can see, it suggests that a Spark
'task' is always executed as a single thread in the JVM.

http://0x0fff.com/spark-architecture/
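
A quick way to see this for yourself (just a sketch, assuming a live
SparkContext named sc, e.g. in spark-shell) is to print the thread that
runs each task:

  // Sketch: observe which JVM thread executes each task.
  // Executor task threads are typically named "Executor task launch worker-N".
  sc.parallelize(1 to 8, 4)
    .mapPartitionsWithIndex { (part, it) =>
      it.map(i => (part, i, Thread.currentThread().getName))
    }
    .collect()
    .foreach { case (part, i, thread) =>
      println(s"partition $part, element $i ran on $thread")
    }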

Thanks,

Jem



On Wed, Aug 26, 2015 at 10:06 AM Samya MAITI <Sa...@amadeus.com>
wrote:

> Thanks Jem, I do understand your suggestion. Actually, --executor-cores
> alone doesn’t control the number of concurrent tasks; that is also governed
> by *spark.task.cpus* (the number of cores dedicated to each task’s execution).
>
>
>
> *Reframing my question: how many threads can be spawned per executor
> core? Is that under user control?*
>
>
>
> Regards,
>
> Sam
>
>
>
> *From:* Jem Tucker [mailto:jem.tucker@gmail.com]
> *Sent:* Wednesday, August 26, 2015 2:26 PM
> *To:* Samya MAITI <Sa...@amadeus.com>; user@spark.apache.org
> *Subject:* Re: Relation between threads and executor core
>
>
>
> Hi Samya,
>
>
>
> When submitting an application with spark-submit, the cores per executor
> can be set with --executor-cores, meaning you can run that many tasks per
> executor concurrently. The page below has some more details on submitting
> applications:
>
>
>
> https://spark.apache.org/docs/latest/submitting-applications.html
>
>
>
> thanks,
>
>
>
> Jem
>
>
>
> On Wed, Aug 26, 2015 at 9:47 AM Samya <sa...@amadeus.com> wrote:
>
> Hi All,
>
> A few basic queries:
> 1. Is there a way we can control the number of threads per executor core?
> 2. Does the “executor-cores” parameter also have a say in deciding how many
> threads are run?
>
> Regards,
> Sam
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Relation-between-threads-and-executor-core-tp24456.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

RE: Relation between threads and executor core

Posted by Samya MAITI <Sa...@amadeus.com>.
Thanks Jem, I do understand your suggestion. Actually, --executor-cores alone doesn’t control the number of concurrent tasks; that is also governed by spark.task.cpus (the number of cores dedicated to each task’s execution).
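
To make that concrete (just a sketch with example values, not defaults), the number of tasks an executor can run at once would be spark.executor.cores divided by spark.task.cpus, with each task running on a single thread:

  // Sketch: concurrent task slots per executor (example values only).
  val executorCores = 4              // --executor-cores / spark.executor.cores
  val taskCpus      = 2              // spark.task.cpus
  val concurrentTasksPerExecutor = executorCores / taskCpus  // = 2
  // Each task is one thread, so this is also the number of task threads
  // running at the same time in that executor JVM.
  println(s"concurrent task threads per executor = $concurrentTasksPerExecutor")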

Reframing my question: how many threads can be spawned per executor core? Is that under user control?

Regards,
Sam

From: Jem Tucker [mailto:jem.tucker@gmail.com]
Sent: Wednesday, August 26, 2015 2:26 PM
To: Samya MAITI <Sa...@amadeus.com>; user@spark.apache.org
Subject: Re: Relation between threads and executor core

Hi Samya,

When submitting an application with spark-submit, the cores per executor can be set with --executor-cores, meaning you can run that many tasks per executor concurrently. The page below has some more details on submitting applications:

https://spark.apache.org/docs/latest/submitting-applications.html

thanks,

Jem

On Wed, Aug 26, 2015 at 9:47 AM Samya <sa...@amadeus.com> wrote:
Hi All,

A few basic queries:
1. Is there a way we can control the number of threads per executor core?
2. Does the “executor-cores” parameter also have a say in deciding how many
threads are run?

Regards,
Sam



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Relation-between-threads-and-executor-core-tp24456.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

Re: Relation between threads and executor core

Posted by Jem Tucker <je...@gmail.com>.
Hi Samya,

When submitting an application with spark-submit, the cores per executor can
be set with --executor-cores, meaning you can run that many tasks per
executor concurrently. The page below has some more details on submitting
applications:

https://spark.apache.org/docs/latest/submitting-applications.html
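
In case it is useful, the same thing can also be set programmatically through
SparkConf (a sketch only; the app name below is a placeholder, and the master
would normally come from spark-submit --master):

  import org.apache.spark.{SparkConf, SparkContext}

  // Sketch: programmatic equivalent of --executor-cores on the command line.
  val conf = new SparkConf()
    .setAppName("example-app")           // placeholder application name
    .set("spark.executor.cores", "4")    // task threads available per executor
    .set("spark.task.cpus", "1")         // cores claimed by each individual task
  val sc = new SparkContext(conf)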

thanks,

Jem

On Wed, Aug 26, 2015 at 9:47 AM Samya <sa...@amadeus.com> wrote:

> Hi All,
>
> A few basic queries:
> 1. Is there a way we can control the number of threads per executor core?
> 2. Does the “executor-cores” parameter also have a say in deciding how many
> threads are run?
>
> Regards,
> Sam
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Relation-between-threads-and-executor-core-tp24456.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>