Posted to user@spark.apache.org by Du Li <li...@yahoo-inc.com.INVALID> on 2015/03/12 01:42:32 UTC

Re: How to use more executors

Is it being merged in the next release? It's indeed a critical patch!
Du 

On Wednesday, January 21, 2015 3:59 PM, Nan Zhu <zh...@gmail.com> wrote:

…not sure when it will be reviewed…
but for now you can work around it by allowing multiple worker instances on a single machine
http://spark.apache.org/docs/latest/spark-standalone.html
search SPARK_WORKER_INSTANCES
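
For example, a minimal conf/spark-env.sh sketch for each worker machine (the instance count, cores, and memory below are illustrative values, not taken from this thread):

    # run two separate Worker daemons on this machine
    SPARK_WORKER_INSTANCES=2
    # cores and memory given to EACH worker instance
    SPARK_WORKER_CORES=4
    SPARK_WORKER_MEMORY=8g

Restart the workers (e.g. sbin/stop-all.sh, then sbin/start-all.sh) for the new settings to take effect.
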
Best, 
-- 
Nan Zhu
http://codingcat.me

On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
Will SPARK-1706 be included in the next release?
On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu <yu...@gmail.com> wrote:

Please see SPARK-1706
On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu <la...@gmail.com> wrote:

I tried to submit a job with --conf "spark.cores.max=6" or --total-executor-cores 6 on a standalone cluster, but I don't see more than one executor on each worker. I am wondering how to use multiple executors when submitting jobs.
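
For reference, this is the shape of the command in question; the master URL, class, and jar names here are hypothetical:

    spark-submit --master spark://master:7077 \
      --total-executor-cores 6 \
      --class com.example.MyApp myapp.jar

Before SPARK-1706, standalone mode launches at most one executor per worker for a given application, so these settings cap the total cores rather than adding executors per worker.
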
Thanks
larry

Re: How to use more executors

Posted by Nan Zhu <zh...@gmail.com>.
At least 1.4, I think.

for now, using YARN or allowing multiple worker instances works just fine
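
For the YARN route, a sketch of what the submit command might look like (the resource numbers, class, and jar are hypothetical, not from this thread):

    spark-submit --master yarn-client \
      --num-executors 6 \
      --executor-cores 2 \
      --executor-memory 4g \
      --class com.example.MyApp myapp.jar

Unlike standalone mode at this point, YARN lets you request the executor count directly via --num-executors.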

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, March 11, 2015 at 8:42 PM, Du Li wrote:

> Is it being merged in the next release? It's indeed a critical patch!
>  
> Du  