Posted to user@spark.apache.org by Thodoris Zois <zo...@ics.forth.gr> on 2018/04/04 19:13:16 UTC

1 Executor per partition

Hello list!

I am trying to familiarize myself with Apache Spark. I would like to ask something about partitioning and executors.

Can I have, e.g., 500 partitions but launch only one executor that will run operations on just 1 of the 500 partitions? After that, I would like my job to terminate.

Is there an easy way to do this, or do I have to modify code to achieve it?

Thank you,
 Thodoris

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: 1 Executor per partition

Posted by utkarsh_deep <ut...@gmail.com>.
You are correct.






Re: 1 Executor per partition

Posted by Gourav Sengupta <go...@gmail.com>.
Each partition should be translated into one task, and each task runs on one
executor; one executor, however, can process more than one task. I may be
wrong, and will be grateful if someone can correct me.
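As a rough sketch of how this could be set up: capping the job at one executor
with one core means at most one task (i.e., one partition) is processed at a
time. The flags below assume a YARN cluster and a hypothetical job script named
my_job.py; adjust for your cluster manager:

```shell
# Run with a single executor and a single core, so only one
# task/partition is in flight at any moment. Dynamic allocation
# must be disabled or Spark may request more executors.
spark-submit \
  --master yarn \
  --num-executors 1 \
  --executor-cores 1 \
  --conf spark.dynamicAllocation.enabled=false \
  my_job.py
```

Note that this alone still processes all 500 partitions, just one after
another. To operate on only a single partition, you would also need a small
code change, e.g. using rdd.mapPartitionsWithIndex and returning an empty
iterator for every partition except the target one.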

Regards,
Gourav

On Wed, Apr 4, 2018 at 8:13 PM, Thodoris Zois <zo...@ics.forth.gr> wrote:

>
> Hello list!
>
> I am trying to familiarize with Apache Spark. I  would like to ask
> something about partitioning and executors.
>
> Can I have e.g: 500 partitions but launch only one executor that will run
> operations in only 1 partition of the 500? And then I would like my job to
> die.
>
> Is there any easy way? Or i have to modify code to achieve that?
>
> Thank you,
>  Thodoris
>
>
>