Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/04/29 17:13:56 UTC

[GitHub] [spark] vanzin commented on issue #24461: [SPARK-27434][CORE] Fix mem leak
URL: https://github.com/apache/spark/pull/24461#issuecomment-487665032
 
 
   Before this goes too far.
   
   You must not close FileSystem instances. FileSystem.get() hands out cached instances that are shared across the whole JVM, so closing one breaks every other user of it. If you want one reason: the app itself can keep running after the SparkContext is stopped, and it may still need those instances.
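   
   A minimal sketch of why that is unsafe (an illustration of mine, not code from this PR), using nothing but the stock Hadoop API:
   
   ```scala
   import org.apache.hadoop.conf.Configuration
   import org.apache.hadoop.fs.{FileSystem, Path}
   
   object ClosingSharedFs {
     def main(args: Array[String]): Unit = {
       val conf = new Configuration()
       // FileSystem.get caches instances keyed by (scheme, authority, ugi):
       // both calls below return the very same object.
       val fs1 = FileSystem.get(conf)
       val fs2 = FileSystem.get(conf)
       assert(fs1 eq fs2)
   
       // Closing "our" handle closes the shared instance. Any other code in
       // the JVM still holding it will subsequently fail with
       // "java.io.IOException: Filesystem closed".
       fs1.close()
       // fs2.getFileStatus(new Path("/"))  // would now throw
     }
   }
   ```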
   
   This is not a fix for the memory leak. It's at best a workaround that will only work in very specific cases.
   
   To fix the memory leak you need to understand where it's coming from. So far you have only identified that it is happening, not why or how. It may be that it's not Spark's fault at all, and that something is wrong with the Hadoop code itself.
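   
   For what it's worth, here is one well-known way the Hadoop side can leak on its own, as an illustration of that last point rather than a diagnosis of this PR's leak: the FileSystem cache key includes the UserGroupInformation, and UGI equality is identity-based, so code that keeps creating fresh UGIs keeps adding cache entries that nothing evicts:
   
   ```scala
   import java.security.PrivilegedExceptionAction
   import org.apache.hadoop.conf.Configuration
   import org.apache.hadoop.fs.FileSystem
   import org.apache.hadoop.security.UserGroupInformation
   
   object FsCacheGrowth {
     def main(args: Array[String]): Unit = {
       val conf = new Configuration()
       // Each createRemoteUser call returns a UGI with a fresh Subject, and
       // the FileSystem cache key compares UGIs by Subject identity. So each
       // iteration inserts a brand-new entry into the process-wide cache,
       // even though the user name is identical every time.
       for (_ <- 1 to 1000) {
         val ugi = UserGroupInformation.createRemoteUser("alice")
         ugi.doAs(new PrivilegedExceptionAction[FileSystem] {
           override def run(): FileSystem = FileSystem.get(conf)
         })
       }
       // The cache now holds ~1000 FileSystem instances that will not be
       // evicted unless someone calls FileSystem.closeAll() or the JVM exits.
     }
   }
   ```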
