Posted to user@spark.apache.org by Eric Bless <er...@yahoo.com.INVALID> on 2015/08/11 23:40:38 UTC

Boosting spark.yarn.executor.memoryOverhead

Previously I was getting a failure which included the message     Container killed by YARN for exceeding memory limits. 2.1 GB of 2 GB physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.

So I attempted the following -     spark-submit --jars examples.jar latest_msmtdt_by_gridid_and_source.py --conf spark.yarn.executor.memoryOverhead=1024 host table

This resulted in -    Application application_1438983806434_24070 failed 2 times due to AM Container for appattempt_1438983806434_24070_000002 exited with exitCode: -1000

Am I specifying the spark.yarn.executor.memoryOverhead incorrectly?

Re: Boosting spark.yarn.executor.memoryOverhead

Posted by Sandy Ryza <sa...@cloudera.com>.
Hi Eric,

This is likely because you are putting the parameter after the primary
resource (latest_msmtdt_by_gridid_and_source.py), which makes it a
parameter to your application instead of a parameter to Spark.
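
In other words, spark-submit treats everything after the primary resource as application arguments, so all Spark options, including --conf, must come first. Assuming the same jar, script, and application arguments (host, table) from the original post, the corrected invocation would look something like:

```shell
# Sketch of the corrected ordering: Spark options (--jars, --conf)
# before the primary resource; application args (host table) after it.
spark-submit \
  --jars examples.jar \
  --conf spark.yarn.executor.memoryOverhead=1024 \
  latest_msmtdt_by_gridid_and_source.py host table
```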

-Sandy

On Wed, Aug 12, 2015 at 4:40 AM, Eric Bless <er...@yahoo.com.invalid>
wrote:

> Previously I was getting a failure which included the message
>     Container killed by YARN for exceeding memory limits. 2.1 GB of 2 GB
> physical memory used. Consider boosting spark.yarn.executor.memoryOverhead.
>
> So I attempted the following -
>     spark-submit --jars examples.jar latest_msmtdt_by_gridid_and_source.py
> --conf spark.yarn.executor.memoryOverhead=1024 host table
>
> This resulted in -
>     Application application_1438983806434_24070 failed 2 times due to AM
> Container for appattempt_1438983806434_24070_000002 exited with exitCode:
> -1000
>
> Am I specifying the spark.yarn.executor.memoryOverhead incorrectly?