Posted to dev@spark.apache.org by Aditya <ad...@augmentiq.co.in> on 2016/09/28 06:47:29 UTC
Spark Executor Lost issue
I have a Spark job which runs fine on small data, but when the data size
increases it fails with executor lost errors. My executor and driver memory
are already set as high as they can go. I have also tried increasing
--conf spark.yarn.executor.memoryOverhead=600, but that still did not fix
the problem. Is there any other solution?
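For context on why YARN kills executors in this situation: the total container request is the executor heap plus the memory overhead. The 10% / 384 MB default below matches the Spark 1.6/2.x docs for spark.yarn.executor.memoryOverhead; the 8 GB heap size is illustrative, not taken from the thread.

```python
# Sketch of how Spark-on-YARN sizes an executor container.
# Default overhead per the Spark docs: max(10% of executor memory, 384 MB).

def container_request_mb(executor_memory_mb, overhead_mb=None):
    """Total memory YARN reserves for one executor container."""
    if overhead_mb is None:
        overhead_mb = max(int(executor_memory_mb * 0.10), 384)
    return executor_memory_mb + overhead_mb

# With --executor-memory 8g and the default overhead:
default_total = container_request_mb(8192)                  # 8192 + 819
# With the overhead from the thread (--conf ...memoryOverhead=600):
tuned_total = container_request_mb(8192, overhead_mb=600)   # 8192 + 600

print(default_total, tuned_total)
```

Note that for a large heap, 600 MB is actually *below* the default overhead, so setting it explicitly may not help; raising it well above the default (1–2 GB) is the more common fix when YARN reports containers killed for exceeding memory limits.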
Re: Spark Executor Lost issue
Posted by Sushrut Ikhar <su...@gmail.com>.
Can you add more details: are you using RDDs/Datasets/SQL; are you
doing group by / joins; is your input splittable?
By the way, you can pass the config the same way you are passing
memoryOverhead, e.g.:
--conf spark.default.parallelism=1000
or through the SparkContext in code.
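A minimal sketch of both ways of setting the property mentioned above; the job file name and the value 1000 are illustrative, not recommendations from the thread.

```python
# Two equivalent ways to set spark.default.parallelism.

# 1. On the command line, alongside the memoryOverhead flag:
submit_cmd = [
    "spark-submit",
    "--conf", "spark.yarn.executor.memoryOverhead=600",
    "--conf", "spark.default.parallelism=1000",
    "my_job.py",  # hypothetical application file
]

# 2. In code, before the SparkContext is created (pyspark sketch):
#     from pyspark import SparkConf, SparkContext
#     conf = SparkConf().set("spark.default.parallelism", "1000")
#     sc = SparkContext(conf=conf)

print(" ".join(submit_cmd))
```

Settings passed on the command line and settings made on the SparkConf are merged; values set in code take precedence over spark-submit flags.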
Regards,
Sushrut Ikhar
https://about.me/sushrutikhar
On Wed, Sep 28, 2016 at 7:30 PM, Aditya <ad...@augmentiq.co.in>
wrote:
> Hi All,
>
> Any updates on this?
>
> On Wednesday 28 September 2016 12:22 PM, Sushrut Ikhar wrote:
>
> Try increasing the parallelism by repartitioning; you may also
> increase spark.default.parallelism.
> You can also try decreasing the number of executor cores.
> Basically, this happens when the executor uses considerably more memory
> than it asked for, and YARN kills the executor.
>
> Regards,
>
> Sushrut Ikhar
> https://about.me/sushrutikhar
>
>
> On Wed, Sep 28, 2016 at 12:17 PM, Aditya
> <aditya.calangutkar@augmentiq.co.in> wrote:
>
>> I have a Spark job which runs fine on small data, but when the data size
>> increases it fails with executor lost errors. My executor and driver
>> memory are already set as high as they can go. I have also tried
>> increasing --conf spark.yarn.executor.memoryOverhead=600, but that still
>> did not fix the problem. Is there any other solution?
>>
>>
>
>
>
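Why the repartitioning advice above helps: with more partitions, each task holds a smaller slice of the data in memory, which lowers the chance of an executor blowing past its container limit. A back-of-the-envelope sketch (the input size and partition counts are invented):

```python
# Rough model: per-task memory pressure shrinks as partition count grows.
# All numbers are illustrative, not from the thread.

def per_partition_mb(total_input_mb, num_partitions):
    """Approximate data each task processes, assuming an even distribution."""
    return total_input_mb / num_partitions

total = 200 * 1024  # a hypothetical 200 GB input, in MB

few = per_partition_mb(total, 200)     # ~1 GB per task: likely to spill or OOM
many = per_partition_mb(total, 2000)   # ~100 MB per task: much safer

print(few, many)
```

The same logic is why fewer cores per executor can help: each concurrent task on an executor shares that executor's heap, so fewer simultaneous tasks means more memory per task.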
Re: Spark Executor Lost issue
Posted by Aditya <ad...@augmentiq.co.in>.
Hi All,
Any updates on this?
On Wednesday 28 September 2016 12:22 PM, Sushrut Ikhar wrote:
> Try increasing the parallelism by repartitioning; you may also
> increase spark.default.parallelism.
> You can also try decreasing the number of executor cores.
> Basically, this happens when the executor uses considerably more memory
> than it asked for, and YARN kills the executor.
>
> Regards,
>
> Sushrut Ikhar
> https://about.me/sushrutikhar
>
>
>
> On Wed, Sep 28, 2016 at 12:17 PM, Aditya
> <aditya.calangutkar@augmentiq.co.in> wrote:
>
> I have a Spark job which runs fine on small data, but when the data
> size increases it fails with executor lost errors. My executor and
> driver memory are already set as high as they can go. I have also
> tried increasing --conf spark.yarn.executor.memoryOverhead=600, but
> that still did not fix the problem. Is there any other solution?
>
>
Re: Spark Executor Lost issue
Posted by Aditya <ad...@augmentiq.co.in>.
Thanks Sushrut for the reply.
Currently I have not defined the spark.default.parallelism property.
Can you let me know what I should set it to?
Regards,
Aditya Calangutkar
On Wednesday 28 September 2016 12:22 PM, Sushrut Ikhar wrote:
> Try increasing the parallelism by repartitioning; you may also
> increase spark.default.parallelism.
> You can also try decreasing the number of executor cores.
> Basically, this happens when the executor uses considerably more memory
> than it asked for, and YARN kills the executor.
>
> Regards,
>
> Sushrut Ikhar
> https://about.me/sushrutikhar
>
>
>
> On Wed, Sep 28, 2016 at 12:17 PM, Aditya
> <aditya.calangutkar@augmentiq.co.in> wrote:
>
> I have a Spark job which runs fine on small data, but when the data
> size increases it fails with executor lost errors. My executor and
> driver memory are already set as high as they can go. I have also
> tried increasing --conf spark.yarn.executor.memoryOverhead=600, but
> that still did not fix the problem. Is there any other solution?
>
>
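The thread ends without an answer to the open question. For reference, the Spark tuning guide recommends roughly 2–3 tasks per CPU core in the cluster, so a reasonable starting point can be computed from the cluster shape. A sketch, with a hypothetical 50-executor, 4-core cluster:

```python
# Rule of thumb from the Spark tuning guide: 2-3 tasks per CPU core.
# The cluster shape below is hypothetical.

def suggested_parallelism(num_executors, cores_per_executor, tasks_per_core=2):
    """Starting value for spark.default.parallelism."""
    return num_executors * cores_per_executor * tasks_per_core

# e.g. 50 executors with 4 cores each:
low = suggested_parallelism(50, 4, tasks_per_core=2)   # 400
high = suggested_parallelism(50, 4, tasks_per_core=3)  # 600

print(low, high)
```

Note that spark.default.parallelism only affects default partition counts for RDD operations like reduceByKey and join; for DataFrame shuffles the analogous knob is spark.sql.shuffle.partitions.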