Posted to issues@spark.apache.org by "Nuno Azevedo (JIRA)" <ji...@apache.org> on 2018/09/27 10:19:00 UTC

[jira] [Updated] (SPARK-25552) Upgrade from Spark 1.6.3 to 2.3.0 seems to make jobs use about 50% more memory

     [ https://issues.apache.org/jira/browse/SPARK-25552?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nuno Azevedo updated SPARK-25552:
---------------------------------
    Attachment: Spark2.3-50GB.png

> Upgrade from Spark 1.6.3 to 2.3.0 seems to make jobs use about 50% more memory
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-25552
>                 URL: https://issues.apache.org/jira/browse/SPARK-25552
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>         Environment: AWS Kubernetes
> Spark Embedded
>            Reporter: Nuno Azevedo
>            Priority: Major
>         Attachments: Spark1.6-50GB.png, Spark2.3-50GB.png, Spark2.3-70GB.png
>
>
> After upgrading from Spark 1.6.3 to 2.3.0, our jobs started to need about 50% more memory to run.
>  
> For instance, we had a job that ran fine on Spark 1.6.3 with 50 GB of memory.
> [See attachment Spark1.6-50GB.png: memory usage of the job on Spark 1.6.3 with 50 GB]
>  
> After upgrading to Spark 2.3.0, the same job with the same 50 GB of memory failed with an out-of-memory error.
> [See attachment Spark2.3-50GB.png: the same job on Spark 2.3.0 with 50 GB, failing with OOM]
>  
> Then we increased the memory in steps until the job ran successfully, which required 70 GB.
> [See attachment Spark2.3-70GB.png: the job on Spark 2.3.0 running successfully with 70 GB]
>  
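> For reference, a minimal sketch of how the memory was raised (in Scala, since the job embeds Spark; the app name and the use of SparkSession here are illustrative, not our exact setup):
>
> {code:scala}
> import org.apache.spark.SparkConf
> import org.apache.spark.sql.SparkSession
>
> // Raise the memory available to each executor. Note that
> // spark.driver.memory must be set before the driver JVM starts
> // (e.g. on the launch command line), so it is not shown here.
> val conf = new SparkConf()
>   .setAppName("example-job")             // illustrative name
>   .set("spark.executor.memory", "70g")   // was 50g on Spark 1.6.3
>
> val spark = SparkSession.builder().config(conf).getOrCreate()
> {code}
>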
> The Spark upgrade was the only change in our environment. After looking into what seems to be causing this, we noticed that the Kryo serializer is the main culprit for the rise in memory consumption.
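>
> For illustration, a sketch of the Kryo-related settings involved (the example record type, the registration, and the buffer sizes are illustrative, not taken from our actual job):
>
> {code:scala}
> import org.apache.spark.SparkConf
>
> case class MyRecord(id: Long, value: String)  // hypothetical record type
>
> val conf = new SparkConf()
>   // Kryo must be enabled explicitly; Spark defaults to Java serialization.
>   .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
>   // Registering classes lets Kryo write a small numeric ID instead of
>   // the full class name with every serialized object.
>   .registerKryoClasses(Array(classOf[MyRecord]))
>   // Per-core serialization buffer; grows on demand up to the max below.
>   .set("spark.kryoserializer.buffer", "64k")
>   .set("spark.kryoserializer.buffer.max", "64m")
> {code}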


