Posted to user@spark.apache.org by Yunmeng Ban <ba...@gmail.com> on 2014/06/02 04:36:46 UTC

Can anyone help me set memory for a standalone cluster?

Hi,

I'm running the JavaKafkaWordCount example on a standalone cluster. I
want to give each slave node 1600MB of memory, so I put the following in
spark/conf/spark-env.sh:

SPARK_WORKER_MEMORY=1600m

But the logs on the slave nodes look like this:
Spark Executor Command: "/usr/java/latest/bin/java" "-cp"
":/~path/spark/conf:/~path/spark/assembly/target/scala-2.10/spark-assembly_2.10-0.9.1-hadoop2.2.0.jar"
"-Xms512M" "-Xmx512M"
"org.apache.spark.executor.CoarseGrainedExecutorBackend"

The memory seems to be the default value (512M), not 1600M.
I don't know how to make SPARK_WORKER_MEMORY take effect.
Can anyone help me?
Many thanks in advance.

Yunmeng

Re: Can anyone help me set memory for a standalone cluster?

Posted by Aaron Davidson <il...@gmail.com>.
In addition to setting the standalone worker memory, you'll also need to tell
your SparkContext to claim the extra resources by setting "spark.executor.memory"
to 1600m as well. In 0.9.1 (which you appear to be using), this should be a
system property set via SPARK_JAVA_OPTS in conf/spark-env.sh, e.g.:
export SPARK_JAVA_OPTS="-Dspark.executor.memory=1600m"
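
For completeness, here is a rough sketch of claiming the memory from the
application itself instead of SPARK_JAVA_OPTS. This is not the actual
JavaKafkaWordCount source; the class name and master URL are placeholders,
and it assumes the SparkConf-based JavaStreamingContext constructor in the
0.9.x Java API:

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class KafkaWordCountWithMemory {
  public static void main(String[] args) throws Exception {
    // Request 1600m per executor from the standalone master. The worker must
    // advertise at least this much via SPARK_WORKER_MEMORY, or the executor
    // will never be granted.
    SparkConf conf = new SparkConf()
        .setMaster("spark://master:7077")   // placeholder master URL
        .setAppName("JavaKafkaWordCount")
        .set("spark.executor.memory", "1600m");

    JavaStreamingContext jssc = new JavaStreamingContext(conf, new Duration(2000));

    // ... build the Kafka input stream and the word count as in the example ...

    jssc.start();
    jssc.awaitTermination();
  }
}

Either way, spark.executor.memory must not exceed SPARK_WORKER_MEMORY; the
worker can only hand out what it advertises.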

