Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:20:07 UTC

[jira] [Updated] (SPARK-8716) Write tests for executor shared cache feature

     [ https://issues.apache.org/jira/browse/SPARK-8716?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-8716:
--------------------------------
    Labels: bulk-closed  (was: )

> Write tests for executor shared cache feature
> ---------------------------------------------
>
>                 Key: SPARK-8716
>                 URL: https://issues.apache.org/jira/browse/SPARK-8716
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, Tests
>    Affects Versions: 1.2.0
>            Reporter: Andrew Or
>            Priority: Major
>              Labels: bulk-closed
>
> More specifically, this is the feature that is currently flagged by `spark.files.useFetchCache`.
> This is a complicated feature that has no tests, so I cannot say with confidence that it actually works on all cluster managers. In particular, I believe it does not work on Mesos, because the else branch here creates its own temp directory per executor: https://github.com/apache/spark/blob/881662e9c93893430756320f51cef0fc6643f681/core/src/main/scala/org/apache/spark/util/Utils.scala#L739.
> It is also not immediately clear that it works in standalone mode, due to the lack of comments. It actually does work there, because the Worker happens to set the `SPARK_EXECUTOR_DIRS` environment variable; that linkage could be documented more explicitly in the code.
> This is difficult to write tests for, but it's still important to do so. Otherwise, semi-related changes in the future may easily break it without anyone noticing.
> Related issues: SPARK-8130, SPARK-6313, SPARK-2713
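The dependency described above can be sketched in a few lines. This is a simplified, illustrative model (names and logic are assumptions, not Spark's actual implementation): the fetch cache is only shared when every executor on a host resolves the same local root directory, which standalone mode guarantees via SPARK_EXECUTOR_DIRS but a per-executor temp dir does not.

```scala
import java.io.File
import java.nio.file.Files

// Sketch (illustrative only) of why the fetch cache is host-shared on
// standalone but not on cluster managers that hit the temp-dir fallback.
object FetchCacheSketch {
  // Mirrors the idea behind the local-root-dir resolution: standalone's
  // Worker exports SPARK_EXECUTOR_DIRS, so all executors on the host agree
  // on one root; the fallback creates a fresh temp dir per executor, which
  // silently defeats cache sharing.
  def localRootDir(env: Map[String, String]): File =
    env.get("SPARK_EXECUTOR_DIRS") match {
      case Some(dirs) => new File(dirs.split(File.pathSeparator).head)
      case None       => Files.createTempDirectory("spark-local").toFile // per-executor!
    }

  // Cache entries are keyed by the file URL plus the add timestamp,
  // roughly in the spirit of the "<hash><timestamp>_cache" naming.
  def cacheFileName(url: String, timestamp: Long): String =
    s"${url.hashCode}${timestamp}_cache"
}
```

With SPARK_EXECUTOR_DIRS set, two executors resolve the same root and can share cache entries; without it, each call yields a distinct private directory and the cache degenerates to per-executor storage.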

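As for the kind of test the issue asks for, one hedged sketch (all names hypothetical, not from the Spark test suite): simulate two executors on the same host fetching one file through a shared cache directory, and assert that the remote source is read only once.

```scala
import java.io.File
import java.nio.file.{Files, StandardCopyOption}
import java.util.concurrent.atomic.AtomicInteger

// Illustrative test harness: counts remote reads so a test can assert
// that a shared cache directory deduplicates downloads across executors.
object SharedCacheTestSketch {
  val remoteReads = new AtomicInteger(0)

  // Stand-in for fetching the file from the driver or a remote URL.
  def downloadFromRemote(target: File): Unit = {
    remoteReads.incrementAndGet()
    Files.write(target.toPath, "payload".getBytes)
  }

  // First caller populates the cache file; later callers only copy it
  // into their own working directory.
  def fetchWithCache(cacheDir: File, workDir: File, name: String): File = synchronized {
    val cached = new File(cacheDir, name + "_cache")
    if (!cached.exists()) downloadFromRemote(cached)
    val local = new File(workDir, name)
    Files.copy(cached.toPath, local.toPath, StandardCopyOption.REPLACE_EXISTING)
    local
  }
}
```

A real test would of course exercise the actual fetch path per cluster manager rather than this stand-in, but the assertion shape (one remote read for N co-located executors) is the property that would catch the Mesos-style regression.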


--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org