Posted to user@spark.apache.org by hmushtaq <pe...@gmail.com> on 2014/11/08 23:17:45 UTC

Does Spark work on multicore systems?

I am a Spark newbie and I use Python (pyspark). I am trying to run a program
on a 64-core system, but no matter what I do, it always uses only one core. It
doesn't matter whether I run it using "spark-submit --master local[64] run.sh" or
call x.repartition(64) on an RDD in my code; the Spark program always
uses one core. Has anyone had success running Spark programs on multicore
processors? Can someone provide a very simple example that
properly runs on all cores of a multicore system?
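
For concreteness, this is roughly what I am running (a trimmed-down sketch of my
actual script; names and sizes are just placeholders):

from pyspark import SparkContext

# Launched with: spark-submit --master local[64] run.sh
sc = SparkContext(appName="MulticoreTest")

# Repartition to 64 partitions, hoping each partition runs on its own core.
data = sc.parallelize(range(10000000))
repartitioned = data.repartition(64)

# A CPU-bound transformation followed by an action.
total = repartitioned.map(lambda x: x * x).reduce(lambda a, b: a + b)
print(total)

Even this still uses only one core on my machine.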





Re: Does Spark work on multicore systems?

Posted by lalit1303 <la...@sigmoidanalytics.com>.
While creating the SparkConf, set the property "spark.cores.max" to the
maximum number of cores to be used by the Spark job.
By default it is set to 1.
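
Something along these lines (a minimal PySpark sketch; the app name and core
count are placeholders, adjust to your machine):

from pyspark import SparkConf, SparkContext

# Ask for up to 64 cores and run against a local master with 64 threads.
conf = (SparkConf()
        .setAppName("MulticoreTest")
        .set("spark.cores.max", "64")
        .setMaster("local[64]"))
sc = SparkContext(conf=conf)

# Quick sanity check: prints the default parallelism Spark will use.
print(sc.defaultParallelism)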



-----
Lalit Yadav
lalit@sigmoidanalytics.com