Posted to user@spark.apache.org by Ruijing Li <li...@gmail.com> on 2020/02/15 15:37:33 UTC

Spark 2.4.4 has bigger memory impact than 2.3?

Hi all,

We recently upgraded our jobs from Spark 2.3 to 2.4.4 and noticed that
some jobs are now failing due to lack of resources - particularly
insufficient executor memory causing some executors to fail. However, no
code change was made other than the upgrade. Does Spark 2.4.4 require more
executor memory than previous versions of Spark? I'd be interested to know
if anyone else has seen this issue. We are on Scala 2.11.12 and Java 8.
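
For context, we submit roughly like the following (class name, jar, and
memory values below are illustrative placeholders, not our actual settings);
bumping the heap and overhead this way is the workaround we are considering:

```shell
# Illustrative spark-submit invocation; all values are placeholders.
spark-submit \
  --class com.example.MyJob \
  --executor-memory 8g \
  --conf spark.executor.memoryOverhead=2g \
  my-job.jar
```

(spark.executor.memoryOverhead is the 2.3+ name for what was
spark.yarn.executor.memoryOverhead in earlier releases.)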
-- 
Cheers,
Ruijing Li