Posted to user@spark.apache.org by io...@barclays.com on 2013/11/19 12:33:14 UTC

How to start Spark workers with a fixed number of cores?

I am trying to start 12 workers with 2 cores each on every node, using the following:

In "spark-env.sh" (copied to every slave) I have set:
SPARK_WORKER_INSTANCES=12
SPARK_WORKER_CORES=2

I start the Scala console with:
>SPARK_WORKER_CORES=2 SPARK_MEM=3g MASTER=spark://xxxxx:7077 /apps/spark/spark-0.8.0-incubating-bin-cdh4/spark-shell

In the console logs I get:
Executor added: app-20131119062048-0000/241 on worker-20131119062018-xxxxx-49726 (xxxxx:49726) with 24 cores
Which, as I understand it, means each executor is launched with all 24 cores of the worker instead of the 2 I configured.
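
If the problem is the application claiming every core the workers offer (rather than the workers misreporting), then, as far as I can tell from the 0.8 docs, the per-application cap is the spark.cores.max property, which I believe can be passed via SPARK_JAVA_OPTS before the shell starts. This is just a sketch of what I mean:

>SPARK_JAVA_OPTS="-Dspark.cores.max=24" SPARK_MEM=3g MASTER=spark://xxxxx:7077 /apps/spark/spark-0.8.0-incubating-bin-cdh4/spark-shell

Is that the right knob here, or should SPARK_WORKER_CORES alone be enough?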

I also get this error before the Scala console dies:
13/11/19 06:20:49 INFO client.Client$ClientActor: Executor updated: app-20131119062048-0000/156 is now FAILED (class java.io.IOException: Cannot run program "/apps/spark/spark-0.8.0-incubating-bin-cdh4/bin/compute-classpath.sh" (in directory "."): error=11, Resource temporarily unavailable)
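
From what I can find, error=11 is EAGAIN from fork(), so one thing I plan to check (just a guess at this point) is the per-user process limit on each slave, since 12 worker instances spawn a lot of processes:

>ulimit -u        # show the current soft limit on user processes
>ulimit -u 32768  # raise it for the session if the limit looks too low

Could the worker instances be exhausting that limit?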

Please advise.

Regards,

Ioannis Deligiannis

