Posted to issues@spark.apache.org by "Wilfred Spiegelenburg (JIRA)" <ji...@apache.org> on 2014/09/11 07:24:33 UTC
[jira] [Commented] (SPARK-1719) spark.executor.extraLibraryPath isn't applied on yarn
[ https://issues.apache.org/jira/browse/SPARK-1719?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14129644#comment-14129644 ]
Wilfred Spiegelenburg commented on SPARK-1719:
----------------------------------------------
I looked through the pull request linked here, and through the pull request that closed it, and I cannot find any reference to spark.executor.extraLibraryPath. I also went through trunk, and the only place I can see the setting used is in the Mesos code.
Can you explain how the change in https://github.com/apache/spark/pull/1031 fixes the reported issue when it does not reference that setting at all?
> spark.executor.extraLibraryPath isn't applied on yarn
> -----------------------------------------------------
>
> Key: SPARK-1719
> URL: https://issues.apache.org/jira/browse/SPARK-1719
> Project: Spark
> Issue Type: Sub-task
> Components: YARN
> Affects Versions: 1.0.0
> Reporter: Thomas Graves
> Assignee: Guoqiang Li
> Fix For: 1.1.0
>
>
> Looking through the code for Spark on YARN, I don't see that spark.executor.extraLibraryPath is being properly applied when it launches executors: it uses spark.driver.libraryPath in ClientBase instead.
> Note I didn't actually test it, so it's possible I missed something.
> I also think it is better to use LD_LIBRARY_PATH rather than -Djava.library.path: once java.library.path is set, the JVM doesn't search LD_LIBRARY_PATH. In Hadoop we switched to use LD_LIBRARY_PATH instead of java.library.path. See https://issues.apache.org/jira/browse/MAPREDUCE-4072. I'll split this into a separate jira.
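For anyone hitting this in the meantime, a common workaround is to set both library-path properties explicitly rather than relying on the executor one being propagated. A minimal sketch for spark-defaults.conf; the directory /opt/hadoop/lib/native is a hypothetical example, not a path from this issue:

```
# Hypothetical native-library directory; set both properties explicitly,
# since this issue reports the executor-side one isn't applied on YARN.
spark.executor.extraLibraryPath  /opt/hadoop/lib/native
spark.driver.extraLibraryPath    /opt/hadoop/lib/native
```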
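On the LD_LIBRARY_PATH point, a small self-contained Java sketch of the resolution behavior being discussed. The library name "hadoop" is only an illustration: System.loadLibrary resolves names against java.library.path, which on Linux the JVM seeds from LD_LIBRARY_PATH at startup; passing -Djava.library.path replaces that default, which is why changes to LD_LIBRARY_PATH are then ignored.

```java
// Sketch of how the JVM resolves native libraries. Run with and without
// -Djava.library.path=... to see the default (seeded from LD_LIBRARY_PATH
// on Linux) being replaced by the explicit value.
public class LibraryPathDemo {
    public static void main(String[] args) {
        // The search path System.loadLibrary would use right now.
        System.out.println("java.library.path = "
                + System.getProperty("java.library.path"));

        // The platform-specific file name searched for along that path,
        // e.g. libhadoop.so on Linux. "hadoop" is just an example name.
        System.out.println("mapped name for 'hadoop': "
                + System.mapLibraryName("hadoop"));
    }
}
```

This is why setting the path via the process environment (LD_LIBRARY_PATH) is more composable than -Djava.library.path: the latter is fixed for the lifetime of the JVM once supplied.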
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)