Posted to reviews@spark.apache.org by witgo <gi...@git.apache.org> on 2014/10/08 15:36:14 UTC

[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

GitHub user witgo opened a pull request:

    https://github.com/apache/spark/pull/2711

    [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH instead of -Djava.library.path

    This is an alternative implementation of #1031

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/witgo/spark SPARK-1719

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/2711.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #2711
    
----
commit 033201d6a7f9c2d48f67613715013386313783b6
Author: GuoQiang Li <wi...@qq.com>
Date:   2014-10-08T13:32:23Z

    use LD_LIBRARY_PATH instead of -Djava.library.path

----


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60946636
  
      [Test build #22452 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/22452/consoleFull) for PR 2711 at commit [`4488e41`](https://github.com/apache/spark/commit/4488e417e9d5947ecaa7792ef347c781cabae774).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-59395057
  
    @andrewor14 @tgravescs any chance you guys could take a look at this? Thanks!



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58909673
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21690/consoleFull) for PR 2711 at commit [`fcf7cf1`](https://github.com/apache/spark/commit/fcf7cf1c8be05e7369b1de6d42ebf164b5cc3b61).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58910236
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21691/
    Test PASSed.



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60960674
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/22452/
    Test PASSed.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491737
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala ---
    @@ -165,10 +157,11 @@ private[spark] class DriverRunner(
         localJarFilename
       }
     
    -  private def launchDriver(command: Seq[String], envVars: Map[String, String], baseDir: File,
    -                           supervise: Boolean) {
    +  private def launchDriver(newCommand: Command, baseDir: File, supervise: Boolean) {
    +    val command = CommandUtils.buildCommandSeq(newCommand, driverDesc.mem,
    +      sparkHome.getAbsolutePath)
    --- End diff --
    
    I don't think you should call `buildCommandSeq` after `buildLocalCommand`. As expressed elsewhere, these two should just be one function.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254447
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala ---
    @@ -120,16 +121,16 @@ private[spark] class CoarseMesosSchedulerBackend(
           environment.addVariables(
             Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
         }
    -    val extraJavaOpts = conf.getOption("spark.executor.extraJavaOptions")
    +    val extraJavaOpts = conf.getOption("spark.executor.extraJavaOptions").getOrElse("")
    --- End diff --
    
    just do `conf.get("...", "")`



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58479612
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21530/consoleFull) for PR 2711 at commit [`a69bca1`](https://github.com/apache/spark/commit/a69bca1ee44329f1d05694baa44d9e902ed640ef).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58378797
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21467/consoleFull) for PR 2711 at commit [`4cc826c`](https://github.com/apache/spark/commit/4cc826c3c5aed92af4a27f022499e9d0d522bf99).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19256890
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    +    if (submitLibraryPath.isEmpty && confLibraryPath.nonEmpty) {
    +      val libraryPaths = confLibraryPath ++ sys.env.get(Utils.libraryPathName)
    +      env.put(Utils.libraryPathName, libraryPaths.mkString(sys.props("path.separator")))
    --- End diff --
    
    Yes, they are the same; see the Java source, `UnixFileSystem.java`



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492252
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    +    if (isWindows) {
    +      "PATH"
    +    } else if (isMac) {
    +      "DYLD_LIBRARY_PATH"
    +    } else {
    +      "LD_LIBRARY_PATH"
    +    }
    +  }
    +
    +  def prefixLibraryPath(libraryPaths: Seq[String]): String = {
    +    val libraryPathScriptVar = if (isWindows) {
    +      s"%${libraryPathName}%"
    +    } else {
    +      "$" + libraryPathName
    +    }
    +    val libraryPath = (libraryPaths :+ libraryPathScriptVar).mkString(File.pathSeparator)
    +    val ampersand = if (Utils.isWindows) {
    +      " &"
    +    } else {
    +      ""
    +    }
    +    s"$libraryPathName=$libraryPath$ampersand"
    --- End diff --
    
    Do you need to escape the paths? What if there are spaces in the directory names? What I mean is:
    ```
    LD_LIBRARY_PATH="path with space:path2:$LD_LIBRARY_PATH"
    ```
    as opposed to
    ```
    LD_LIBRARY_PATH=path with space:path2:$LD_LIBRARY_PATH
    ```
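
    For illustration, a quoted variant in Scala (a minimal sketch, Unix case only; the helper name and the hard-coded variable name are hypothetical, not part of this patch):
    ```scala
    import java.io.File

    // Wrapping the whole value in double quotes keeps directory names with
    // spaces intact, while $LD_LIBRARY_PATH still expands inside the quotes.
    def quotedLibraryPathPrefix(libraryPaths: Seq[String]): String = {
      val libraryPathName = "LD_LIBRARY_PATH"
      val scriptVar = "$" + libraryPathName
      val joined = (libraryPaths :+ scriptVar).mkString(File.pathSeparator)
      libraryPathName + "=\"" + joined + "\""
    }

    // quotedLibraryPathPrefix(Seq("path with space", "path2"))
    // => LD_LIBRARY_PATH="path with space:path2:$LD_LIBRARY_PATH"
    ```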



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58910225
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21691/consoleFull) for PR 2711 at commit [`c18ff6a`](https://github.com/apache/spark/commit/c18ff6a3be99f52776736890233beaa0e86a770a).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60807773
  
    Hey @witgo thanks for the refactor. I find the new interface much less confusing: even if the developer uses `command.environment` directly, he/she won't leave out the library path. However, I think the existing code still has a potential source of confusion. There is a `buildCommandSeq` and a `buildLocalCommand`, and it's not intuitive at all which one we should use. It seems that your code frequently just calls one after the other, and I think we can just combine these into one function to simplify things. Other than that I left a few naming and comment suggestions.
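
    For concreteness, a rough sketch of what a combined helper could look like (hypothetical name; it assumes the `buildLocalCommand` and `buildCommandSeq` signatures shown in the diffs on this PR):
    ```scala
    // Hypothetical combined entry point: build the OS-local command first, then
    // the full command-line sequence, so callers never have to remember to call
    // buildLocalCommand before buildCommandSeq.
    def buildProcessCommand(
        command: Command,
        substituteArguments: String => String,
        memoryMb: Int,
        sparkHome: String,
        classPath: Seq[String] = Seq.empty): Seq[String] = {
      val localCommand = buildLocalCommand(command, substituteArguments, classPath)
      buildCommandSeq(localCommand, memoryMb, sparkHome)
    }
    ```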
    
    I think we shouldn't block on Windows for this patch. AFAIK very few people run Spark on Windows with YARN, and they probably run into other existing problems setting it up before trying to set library paths anyway. That said, this is something we should fix in the future.
    
    Also, could you remove `[WIP]` from the title? I think this is pretty close to being merged.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492613
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    +    if (isWindows) {
    +      "PATH"
    +    } else if (isMac) {
    +      "DYLD_LIBRARY_PATH"
    +    } else {
    +      "LD_LIBRARY_PATH"
    +    }
    +  }
    +
    +  def prefixLibraryPath(libraryPaths: Seq[String]): String = {
    +    val libraryPathScriptVar = if (isWindows) {
    +      s"%${libraryPathName}%"
    +    } else {
    +      "$" + libraryPathName
    +    }
    +    val libraryPath = (libraryPaths :+ libraryPathScriptVar).mkString(File.pathSeparator)
    +    val ampersand = if (Utils.isWindows) {
    +      " &"
    --- End diff --
    
    Wait sorry, the link you provided doesn't say anything about the ampersand. Did you provide the correct one?



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19257017
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,19 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command,
    +    env: Map[String, String] = sys.env): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPathName = Utils.libraryPathName
    +      val cmdLibraryPath = command.environment.get(libraryPathName)
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
    +      command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
    +    } else {
    +      command.environment
    --- End diff --
    
    If it equals `$LD_LIBRARY_PATH`, we don't need to set it



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18598450
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -34,8 +34,17 @@ object CommandUtils extends Logging {
     
         // SPARK-698: do not call the run.cmd script, as process.destroy()
         // fails to kill a process tree on Windows
    -    Seq(runner) ++ buildJavaOpts(command, memory, sparkHome) ++ Seq(command.mainClass) ++
    -      command.arguments
    +    buildPrefixEnv(command) ++ Seq(runner) ++ buildJavaOpts(command, memory, sparkHome) ++
    +      Seq(command.mainClass) ++ command.arguments
    +  }
    +
    +  private def buildPrefixEnv(command: Command): Seq[String] = {
    +    if (command.libraryPathEntries.size > 0) {
    +      val libraryPaths = command.libraryPathEntries
    +      Seq(s"${Utils.libraryPath}=${libraryPaths.mkString(File.pathSeparator)}")
    --- End diff --
    
    Just an observation: this won't work if the launcher process and the target machine are running different OSes. Not sure whether that's supported, and I'm sure this is done in other places, but I've seen reports of people being bitten by this (especially those using Windows).
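
    A minimal illustration of the pitfall (hypothetical helper, not part of the patch): `File.pathSeparator` reflects the OS of the machine building the command, so a command built on Linux for a Windows target would join paths with `:` where `;` is needed. Threading the target OS through explicitly avoids that:
    ```scala
    // Hypothetical: pick the separator for the machine that will *run* the
    // command, rather than the machine that builds it.
    def joinPathsForTarget(paths: Seq[String], targetIsWindows: Boolean): String =
      paths.mkString(if (targetIsWindows) ";" else ":")
    ```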



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58482738
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21532/consoleFull) for PR 2711 at commit [`880e06d`](https://github.com/apache/spark/commit/880e06de3b4022f8113251e43c3e28ab2d38a35e).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491898
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    +    if (isWindows) {
    +      "PATH"
    +    } else if (isMac) {
    +      "DYLD_LIBRARY_PATH"
    +    } else {
    +      "LD_LIBRARY_PATH"
    +    }
    +  }
    +
    +  def prefixLibraryPath(libraryPaths: Seq[String]): String = {
    --- End diff --
    
    I would rename this `libraryPathEnvPrefix`



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58899036
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21690/consoleFull) for PR 2711 at commit [`fcf7cf1`](https://github.com/apache/spark/commit/fcf7cf1c8be05e7369b1de6d42ebf164b5cc3b61).
     * This patch merges cleanly.



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19535120
  
    --- Diff: bin/spark-class ---
    @@ -81,7 +81,11 @@ case "$1" in
         OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_SUBMIT_OPTS"
         OUR_JAVA_MEM=${SPARK_DRIVER_MEMORY:-$DEFAULT_MEM}
         if [ -n "$SPARK_SUBMIT_LIBRARY_PATH" ]; then
    -      OUR_JAVA_OPTS="$OUR_JAVA_OPTS -Djava.library.path=$SPARK_SUBMIT_LIBRARY_PATH"
    +      if [[ $OSTYPE == darwin* ]]; then
    --- End diff --
    
    Cygwin seems to handle the LD_LIBRARY_PATH environment variable correctly; see:
    https://cygwin.com/cygwin-ug-net/setup-env.html



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254265
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala ---
    @@ -86,7 +86,8 @@ private[spark] class DriverRunner(
                 driverDesc.command.javaOpts)
               val command = CommandUtils.buildCommandSeq(newCommand, driverDesc.mem,
                 sparkHome.getAbsolutePath)
    -          launchDriver(command, driverDesc.command.environment, driverDir, driverDesc.supervise)
    +          val envVars = CommandUtils.buildEnvironment(newCommand)
    +          launchDriver(command, envVars, driverDir, driverDesc.supervise)
    --- End diff --
    
    If you make the suggested change removing `buildEnvironment` then this doesn't need to change.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491063
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,31 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildLocalCommand(
    --- End diff --
    
    Can you add a java doc, something like
    ```
    /**
     * Build a command based on the given one, taking into account the local environment of where
     * this command is expected to run, substitute any placeholders, and append any extra class paths.
     * The `env` argument is exposed for testing.
     */
    ```



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18663724
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,20 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command): Map[String, String] = {
    --- End diff --
    
    I feel like this would be better in either of the following forms:
    
    * If `Command.environment` already did this automatically, but I don't know if you can do that in a case class constructor
    * If this method were in the `Command` class itself and the `environment` field were private.
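
    A minimal sketch of the second option, assuming this PR's `Command` fields (the `rawEnvironment` name and the hard-coded variable name are illustrative; the real patch resolves the variable name per-OS):
    ```scala
    import java.io.File

    case class Command(
        mainClass: String,
        arguments: Seq[String],
        private val rawEnvironment: Map[String, String],
        classPathEntries: Seq[String],
        libraryPathEntries: Seq[String],
        javaOpts: Seq[String]) {

      /** Environment with libraryPathEntries already folded into the path variable. */
      def environment: Map[String, String] = {
        if (libraryPathEntries.isEmpty) {
          rawEnvironment
        } else {
          val name = "LD_LIBRARY_PATH" // illustrative; resolved per-OS in the patch
          val merged = (libraryPathEntries ++ rawEnvironment.get(name) ++ sys.env.get(name))
            .mkString(File.pathSeparator)
          rawEnvironment + (name -> merged)
        }
      }
    }
    ```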



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254497
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -385,7 +390,7 @@ private[spark] trait ClientBase extends Logging {
             "--num-executors ", args.numExecutors.toString)
     
         // Command for the ApplicationMaster
    -    val commands = Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    +    val commands = prefixEnv ++ Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    --- End diff --
    
    Same here. Why not just set this through `amContainer.setEnvironment`?



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18686842
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,20 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command): Map[String, String] = {
    --- End diff --
    
    Currently there is no command-line-building code in `Command`; most of the relevant code is in `CommandUtils`. I think it is reasonable to keep this code here.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19490255
  
    --- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
    @@ -244,6 +244,18 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
         val executorClasspathKey = "spark.executor.extraClassPath"
         val driverOptsKey = "spark.driver.extraJavaOptions"
         val driverClassPathKey = "spark.driver.extraClassPath"
    +    val driverLibraryPathKey = "spark.driver.extraLibraryPath"
    +
    +    sys.props.get("spark.driver.libraryPath").foreach { value =>
    --- End diff --
    
    minor: can you add a comment above this
    ```
    // Used by Yarn in 1.1 and before
    ```



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58540975
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21539/
    Test PASSed.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19253002
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    --- End diff --
    
    This comment no longer makes sense.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58529961
  
    Test FAILed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21537/
    Test FAILed.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492649
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -385,7 +390,7 @@ private[spark] trait ClientBase extends Logging {
             "--num-executors ", args.numExecutors.toString)
     
         // Command for the ApplicationMaster
    -    val commands = Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    +    val commands = prefixEnv ++ Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    --- End diff --
    
    I see



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60960664
  
      [Test build #22452 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/22452/consoleFull) for PR 2711 at commit [`4488e41`](https://github.com/apache/spark/commit/4488e417e9d5947ecaa7792ef347c781cabae774).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19417726
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -17,6 +17,8 @@
     
     package org.apache.spark.deploy.worker
     
    +import scala.collection.Map
    --- End diff --
    
    super nit: should go after Java imports.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19417679
  
    --- Diff: bin/spark-class ---
    @@ -81,7 +81,11 @@ case "$1" in
         OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_SUBMIT_OPTS"
         OUR_JAVA_MEM=${SPARK_DRIVER_MEMORY:-$DEFAULT_MEM}
         if [ -n "$SPARK_SUBMIT_LIBRARY_PATH" ]; then
    -      OUR_JAVA_OPTS="$OUR_JAVA_OPTS -Djava.library.path=$SPARK_SUBMIT_LIBRARY_PATH"
    +      if [[ $OSTYPE == darwin* ]]; then
    --- End diff --
    
    Not sure whether this is supposed to work on Windows, but there's a check for cygwin in this file, maybe it should be tested here too?



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254096
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    +    if (submitLibraryPath.isEmpty && confLibraryPath.nonEmpty) {
    +      val libraryPaths = confLibraryPath ++ sys.env.get(Utils.libraryPathName)
    +      env.put(Utils.libraryPathName, libraryPaths.mkString(sys.props("path.separator")))
    --- End diff --
    
    Is `sys.props("path.separator")` always the same as `File.pathSeparator`?
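
    As a quick sanity check (on a stock JVM they do agree, since both derive from the platform's file-system definition: `:` on Unix-like systems, `;` on Windows):
    ```scala
    import java.io.File

    // Sanity check that the system property and the constant agree.
    assert(sys.props("path.separator") == File.pathSeparator)
    ```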



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19485521
  
    --- Diff: bin/spark-class ---
    @@ -81,7 +81,11 @@ case "$1" in
         OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_SUBMIT_OPTS"
         OUR_JAVA_MEM=${SPARK_DRIVER_MEMORY:-$DEFAULT_MEM}
         if [ -n "$SPARK_SUBMIT_LIBRARY_PATH" ]; then
    -      OUR_JAVA_OPTS="$OUR_JAVA_OPTS -Djava.library.path=$SPARK_SUBMIT_LIBRARY_PATH"
    +      if [[ $OSTYPE == darwin* ]]; then
    --- End diff --
    
    Perhaps; I just mentioned it because I saw cygwin-related code in this file.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19253220
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    +    if (submitLibraryPath.isEmpty && confLibraryPath.nonEmpty) {
    +      val libraryPaths = confLibraryPath ++ sys.env.get(Utils.libraryPathName)
    --- End diff --
    
    We shouldn't try to merge these two. We should let the conf one override the environment variable one. More generally, I think the ordering should be:
    ```
    Spark submit argument > conf value > environment variable
    ```
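
    A minimal sketch of that precedence (hypothetical resolver; the three inputs correspond to the spark-submit flag, the conf key, and the process environment):
    ```scala
    // Hypothetical resolver implementing: submit argument > conf value > env var.
    def resolveLibraryPath(
        submitLibraryPath: Option[String],  // e.g. --driver-library-path
        confLibraryPath: Option[String],    // e.g. spark.driver.extraLibraryPath
        envLibraryPath: Option[String]      // e.g. sys.env.get("LD_LIBRARY_PATH")
      ): Option[String] =
      submitLibraryPath.orElse(confLibraryPath).orElse(envLibraryPath)
    ```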



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61037910
  
    I tested standalone mode at commit [4488e417e9d5947ecaa7792ef347c781cabae774]
    and am now running more detailed tests.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254244
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,19 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command,
    +    env: Map[String, String] = sys.env): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPathName = Utils.libraryPathName
    +      val cmdLibraryPath = command.environment.get(libraryPathName)
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
    +      command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
    +    } else {
    +      command.environment
    +    }
    +  }
    +
    --- End diff --
    
    The semantics of `buildEnvironment` is very strange to me. It's not intuitive to me at all why the user should use `CommandUtils.buildEnvironment(command)` instead of `command.environment`. I think we should add the library path when we create the `Command` in the first place (e.g. in `SparkDeploySchedulerBackend`).
    
    (Same for `buildJavaOpts`, which should really be private since it's used in only one place, in the same file.)


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org


[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58358351
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21462/consoleFull) for PR 2711 at commit [`033201d`](https://github.com/apache/spark/commit/033201d6a7f9c2d48f67613715013386313783b6).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58486265
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21530/
    Test PASSed.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19490195
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,31 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildLocalCommand(
    +    command: Command,
    +    substituteArguments: String => String,
    +    classPath: Seq[String] = Seq[String](),
    +    env: Map[String, String] = sys.env): Command = {
    --- End diff --
    
    indent these 4 lines by 2 spaces



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19516968
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -393,6 +396,8 @@ private[spark] trait ClientBase extends Logging {
         // TODO: it would be nicer to just make sure there are no null commands here
         val printableCommands = commands.map(s => if (s == null) "null" else s).toList
         amContainer.setCommands(printableCommands)
    +    amContainer.setLocalResources(localResources)
    +    amContainer.setEnvironment(launchEnv)
    --- End diff --
    
    This is leftover from an earlier change and should be reverted.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58540965
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21539/consoleFull) for PR 2711 at commit [`604a37b`](https://github.com/apache/spark/commit/604a37b6172620de505578025790e2f72eb845f2).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58489217
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21532/
    Test PASSed.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491341
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/DriverRunner.scala ---
    @@ -76,17 +76,9 @@ private[spark] class DriverRunner(
     
               // Make sure user application jar is on the classpath
               // TODO: If we add ability to submit multiple jars they should also be added here
    -          val classPath = driverDesc.command.classPathEntries ++ Seq(s"$localJarFilename")
    -          val newCommand = Command(
    -            driverDesc.command.mainClass,
    -            driverDesc.command.arguments.map(substituteVariables),
    -            driverDesc.command.environment,
    -            classPath,
    -            driverDesc.command.libraryPathEntries,
    -            driverDesc.command.javaOpts)
    -          val command = CommandUtils.buildCommandSeq(newCommand, driverDesc.mem,
    -            sparkHome.getAbsolutePath)
    -          launchDriver(command, driverDesc.command.environment, driverDir, driverDesc.supervise)
    +          val newCommand = CommandUtils.buildLocalCommand(driverDesc.command,
    +            substituteVariables, Seq(s"$localJarFilename"))
    --- End diff --
    
    You can just do `Seq(localJarFilename)`




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58555003
  
    Looks good, mostly a few nits about the new `buildEnvironment` method.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18663278
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,20 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPath = Utils.libraryPath
    +      val pathSeparator = File.pathSeparator
    --- End diff --
    
    nit: looks like overkill to put this in a variable.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492748
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -307,10 +308,9 @@ private[spark] trait ClientBase extends Logging {
         val localResources = prepareLocalResources(appStagingDir)
         val launchEnv = setupLaunchEnv(appStagingDir)
         val amContainer = Records.newRecord(classOf[ContainerLaunchContext])
    -    amContainer.setLocalResources(localResources)
    -    amContainer.setEnvironment(launchEnv)
     
         val javaOpts = ListBuffer[String]()
    +    val prefixEnv = ListBuffer[String]()
    --- End diff --
    
    It seems that `prefixEnv` will always be a list with 1 element. Let's just make it a string, then down there in L390 we can do
    ```
    val commands = Seq(prefixEnv) ++ Seq(...)
    ```




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60125412
  
    @witgo is this still a WIP?
    
    @andrewor14 @pwendell @tgravescs some eyes on this would be greatly appreciated. Thanks!




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58909681
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21690/
    Test PASSed.




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61022785
  
    Hey @witgo the latest changes LGTM. I left two more minor comments. Have you had a chance to quickly test the latest code to make sure you haven't broken anything in the later commits?




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254384
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala ---
    @@ -157,9 +158,9 @@ private[spark] class CoarseMesosSchedulerBackend(
           // glob the directory "correctly".
           val basename = uri.split('/').last.split('.').head
           command.setValue(
    -        ("cd %s*; " +
    +        ("cd %s*; %s " +
               "./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d %s")
    -          .format(basename, driverUrl, offer.getSlaveId.getValue,
    +          .format(basename, prefixEnv, driverUrl, offer.getSlaveId.getValue,
    --- End diff --
    
    Why do you need to set the environment this way? Why can't you just add it to the command's environment the same way other places have been doing it (i.e. `environment.addVariables`)?
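
    For illustration, a sketch of that alternative using the same API as the SPARK_CLASSPATH block just above (whether plain overwriting, rather than prefixing onto the existing value, is acceptable is exactly the question):
    ```
    conf.getOption("spark.executor.extraLibraryPath").foreach { p =>
      environment.addVariables(
        Environment.Variable.newBuilder().setName(Utils.libraryPathName).setValue(p).build())
    }
    ```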




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19255674
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -385,7 +390,7 @@ private[spark] trait ClientBase extends Logging {
             "--num-executors ", args.numExecutors.toString)
     
         // Command for the ApplicationMaster
    -    val commands = Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    +    val commands = prefixEnv ++ Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server") ++
    --- End diff --
    
    If we use this method, LD_LIBRARY_PATH is assigned the literal string `/some/path:$LD_LIBRARY_PATH`; the reference is never expanded into something like `/some/path:/usr/local/lib`.
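
    To illustrate with a sketch (values taken from the diffs above): the prefix form puts the assignment on the launch command line itself, so the shell expands the reference when the container starts:
    ```
    // prefixEnv carries something like Seq("LD_LIBRARY_PATH=\"/some/path:$LD_LIBRARY_PATH\"")
    val commands = prefixEnv ++ Seq(Environment.JAVA_HOME.$() + "/bin/java", "-server")
    // rendered (Unix): LD_LIBRARY_PATH="/some/path:$LD_LIBRARY_PATH" $JAVA_HOME/bin/java -server
    ```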




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491885
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    --- End diff --
    
    This is really `libraryPathEnvName`




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61131038
  
    Great, thanks guys!




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60778169
  
    Jenkins, retest this please.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18663356
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,20 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPath = Utils.libraryPath
    +      val pathSeparator = File.pathSeparator
    +      val sysLibraryPath = sys.env.get(libraryPath)
    +      val cmdLibraryPath = command.environment.get(libraryPath)
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ sysLibraryPath
    +      command.environment + ((Utils.libraryPath, libraryPaths.mkString(pathSeparator)))
    --- End diff --
    
    Since you pulled `Utils.libraryPath` into a variable above, either use that, or just use `Utils.libraryPath` everywhere (which I'd prefer since it's not really that verbose).
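
    i.e., a sketch of the same method with `Utils.libraryPath` used directly throughout:
    ```
    def buildEnvironment(command: Command): Map[String, String] = {
      val libraryPathEntries = command.libraryPathEntries
      if (libraryPathEntries.nonEmpty) {
        val libraryPaths = libraryPathEntries ++
          command.environment.get(Utils.libraryPath) ++
          sys.env.get(Utils.libraryPath)
        command.environment + ((Utils.libraryPath, libraryPaths.mkString(File.pathSeparator)))
      } else {
        command.environment
      }
    }
    ```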




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19516036
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    +    if (isWindows) {
    +      "PATH"
    +    } else if (isMac) {
    +      "DYLD_LIBRARY_PATH"
    +    } else {
    +      "LD_LIBRARY_PATH"
    +    }
    +  }
    +
    +  def prefixLibraryPath(libraryPaths: Seq[String]): String = {
    +    val libraryPathScriptVar = if (isWindows) {
    +      s"%${libraryPathName}%"
    +    } else {
    +      "$" + libraryPathName
    +    }
    +    val libraryPath = (libraryPaths :+ libraryPathScriptVar).mkString(File.pathSeparator)
    +    val ampersand = if (Utils.isWindows) {
    +      " &"
    --- End diff --
    
    Sorry, the right approach should be the one described here:
    http://stackoverflow.com/questions/8922224/multiple-commands-on-a-single-line-in-a-windows-batch-file
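
    For context, a hedged sketch of how the truncated helper above could finish with that fix (the final form isn't shown in this diff):
    ```
    val ampersand = if (Utils.isWindows) {
      // on Windows, a trailing "&" chains this assignment with the command that follows
      " &"
    } else {
      ""
    }
    s"$libraryPathName=$libraryPath$ampersand"
    ```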




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58368003
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21462/consoleFull) for   PR 2711 at commit [`033201d`](https://github.com/apache/spark/commit/033201d6a7f9c2d48f67613715013386313783b6).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60626712
  
    On the Unix side I think the changes are ok, but I don't know how YARN launches containers on Windows, nor do I have a Windows box to find that out on... so it would be nice to at least test that it's doing the right thing.




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61049248
  
    Alright thanks, I'm merging this into master!




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58388729
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21467/consoleFull) for   PR 2711 at commit [`4cc826c`](https://github.com/apache/spark/commit/4cc826c3c5aed92af4a27f022499e9d0d522bf99).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58489209
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21532/consoleFull) for   PR 2711 at commit [`880e06d`](https://github.com/apache/spark/commit/880e06de3b4022f8113251e43c3e28ab2d38a35e).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by asfgit <gi...@git.apache.org>.
Github user asfgit closed the pull request at:

    https://github.com/apache/spark/pull/2711




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19253416
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    --- End diff --
    
    This comment should be inside the `if` case




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58899791
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21691/consoleFull) for   PR 2711 at commit [`c18ff6a`](https://github.com/apache/spark/commit/c18ff6a3be99f52776736890233beaa0e86a770a).
     * This patch merges cleanly.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60179112
  
    Hey @witgo did you have a chance to test this on the various environments this affects? In particular, have you verified whether extra library paths on the executors are working on a real yarn cluster? Also, it would be good to test this on all of Windows, OSX and linux to make sure we haven't broken anything.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491199
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,31 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildLocalCommand(
    +    command: Command,
    +    substituteArguments: String => String,
    +    classPath: Seq[String] = Seq[String](),
    +    env: Map[String, String] = sys.env): Command = {
    +    val libraryPathName = Utils.libraryPathName
    +    val libraryPathEntries = command.libraryPathEntries
    +    val cmdLibraryPath = command.environment.get(libraryPathName)
    +
    +    val newEnvironment = if (libraryPathEntries.nonEmpty && libraryPathName.nonEmpty) {
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
    +      command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
    +    } else {
    +      command.environment
    +    }
    +
    +    Command(
    +      command.mainClass,
    +      command.arguments.map(substituteArguments),
    +      newEnvironment,
    +      command.classPathEntries ++ classPath,
    +      Seq[String](),
    --- End diff --
    
    I would add a comment on the side
    ```
      ...
      Seq[String](), // library path already captured in environment variable
      command.javaOpts)
    ```




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60192493
  
    @andrewor14 
    I've tested on Linux (YARN cluster) and Mac OS X (standalone).
    I haven't tested on Windows or Mesos yet (I'll probably test those this weekend).




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492780
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnableUtil.scala ---
    @@ -47,6 +48,7 @@ trait ExecutorRunnableUtil extends Logging {
           localResources: HashMap[String, LocalResource]): List[String] = {
         // Extra options for the JVM
         val javaOpts = ListBuffer[String]()
    +    val prefixEnv = ListBuffer[String]()
    --- End diff --
    
    same here




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61050455
  
    Very good, thanks, @andrewor14 @vanzin 




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18664237
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ExecutorRunnableUtil.scala ---
    @@ -58,6 +61,10 @@ trait ExecutorRunnableUtil extends Logging {
         sys.env.get("SPARK_JAVA_OPTS").foreach { opts =>
           javaOpts += opts
         }
    +    sys.props.get("spark.executor.extraLibraryPath").foreach { p =>
    +      val libraryPath = Seq(p, Utils.libraryPathScriptVar).mkString(File.pathSeparator)
    --- End diff --
    
    I see this logic to build `LD_LIBRARY_PATH=foo` in at least 3 different places, and they all look very similar if not identical. Probably time to put this in a helper method somewhere.
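
    A sketch of what a call site could look like with such a helper (using the `prefixLibraryPath` name that appears elsewhere in this thread):
    ```
    sys.props.get("spark.executor.extraLibraryPath").foreach { p =>
      prefixEnv += Utils.prefixLibraryPath(Seq(p))
    }
    ```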




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58368014
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21462/
    Test PASSed.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492867
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -393,6 +396,8 @@ private[spark] trait ClientBase extends Logging {
         // TODO: it would be nicer to just make sure there are no null commands here
         val printableCommands = commands.map(s => if (s == null) "null" else s).toList
         amContainer.setCommands(printableCommands)
    +    amContainer.setLocalResources(localResources)
    +    amContainer.setEnvironment(launchEnv)
    --- End diff --
    
    Any reason to move these here? I'm fine with keeping these here but I'm just curious if this is functionally different in any way.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18664025
  
    --- Diff: core/src/test/scala/org/apache/spark/deploy/CommandUtilsSuite.scala ---
    @@ -0,0 +1,34 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.deploy
    +
    +import org.scalatest.FunSuite
    +import org.apache.spark.deploy.worker.CommandUtils
    +import org.apache.spark.util.Utils
    +import org.scalatest.Matchers
    --- End diff --
    
    nit: merge with FunSuite import.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58486257
  
      [QA tests have finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21530/consoleFull) for   PR 2711 at commit [`a69bca1`](https://github.com/apache/spark/commit/a69bca1ee44329f1d05694baa44d9e902ed640ef).
     * This patch **passes all tests**.
     * This patch merges cleanly.
     * This patch adds no public classes.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58530530
  
    Jenkins, retest this please. 




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19491555
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,31 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildLocalCommand(
    --- End diff --
    
    Actually, should this just replace `buildCommandSeq`? It makes little sense to have both `buildLocalCommand` and `buildCommandSeq`, and it's really confusing which one you should use. I would just move everything from `buildCommandSeq` into here, then make `buildJavaOpts` private.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18598609
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1502,6 +1507,25 @@ private[spark] object Utils extends Logging {
         PropertyConfigurator.configure(pro)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH environment
    +   */
    +  def libraryPath: String = if (isWindows) {
    +    "PATH"
    +  } else if (isMac) {
    +    "DYLD_LIBRARY_PATH"
    +  } else {
    +    "LD_LIBRARY_PATH"
    +  }
    +
    +  def libraryPathScriptVar: String = {
    +    if (Utils.isWindows) {
    +      s"%${Utils.libraryPath}%"
    --- End diff --
    
    nit: `Utils.` is redundant here.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58572809
  
    Can one of the admins verify this patch?




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18598713
  
    --- Diff: core/src/test/scala/org/apache/spark/deploy/CommandUtilsSuite.scala ---
    @@ -0,0 +1,33 @@
    +/*
    + * Licensed to the Apache Software Foundation (ASF) under one or more
    + * contributor license agreements.  See the NOTICE file distributed with
    + * this work for additional information regarding copyright ownership.
    + * The ASF licenses this file to You under the Apache License, Version 2.0
    + * (the "License"); you may not use this file except in compliance with
    + * the License.  You may obtain a copy of the License at
    + *
    + *    http://www.apache.org/licenses/LICENSE-2.0
    + *
    + * Unless required by applicable law or agreed to in writing, software
    + * distributed under the License is distributed on an "AS IS" BASIS,
    + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
    + * See the License for the specific language governing permissions and
    + * limitations under the License.
    + */
    +
    +package org.apache.spark.deploy
    +
    +import org.scalatest.FunSuite
    +import org.apache.spark.deploy.worker.CommandUtils
    +import org.apache.spark.util.Utils
    +import org.scalatest.Matchers
    +
    +class CommandUtilsSuite extends FunSuite with Matchers {
    +
    +  test("set java.library.path correctly") {
    +    val sparkHome = sys.props.getOrElse("spark.test.home", fail("spark.test.home is not set!"))
    +    val cmd = CommandUtils.buildCommandSeq(new Command("mainClass", Seq(), Map(), Seq(),
    +      Seq("libraryPathToB"), Seq()), 512, sparkHome)
    +    assert(cmd.head.contains("libraryPathToB"))
    --- End diff --
    
    nit: `assert` generally gives not-very-useful error messages. How about:
    
        cmd.head should include ("libraryPathToB")




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492395
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala ---
    @@ -120,16 +121,16 @@ private[spark] class CoarseMesosSchedulerBackend(
           environment.addVariables(
             Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
         }
    -    val extraJavaOpts = conf.getOption("spark.executor.extraJavaOptions")
    +    val extraJavaOpts = conf.get("spark.executor.extraJavaOptions", "")
     
    -    val libraryPathOption = "spark.executor.extraLibraryPath"
    -    val extraLibraryPath = conf.getOption(libraryPathOption).map(p => s"-Djava.library.path=$p")
    -    val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
    +    val prefixEnv = conf.getOption("spark.executor.extraLibraryPath").map { p =>
    +      Utils.prefixLibraryPath(Seq(p))
    +    }.getOrElse("")
    --- End diff --
    
    I would add a comment here that explains why we have to set the environment this way:
    ```
    // Set the environment variable through the prefix because we need to preserve the existing value
    val prefixEnv = ...
    ```




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19253448
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,19 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command,
    +    env: Map[String, String] = sys.env): Map[String, String] = {
    --- End diff --
    
    ```
    def buildEnvironment(
        command: Command,
        env: Map[String, String] = sys.env): Map[String, String] = {
      ...
    }
    ```




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19492325
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala ---
    @@ -120,16 +121,16 @@ private[spark] class CoarseMesosSchedulerBackend(
           environment.addVariables(
             Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
         }
    -    val extraJavaOpts = conf.getOption("spark.executor.extraJavaOptions")
    +    val extraJavaOpts = conf.get("spark.executor.extraJavaOptions", "")
     
    -    val libraryPathOption = "spark.executor.extraLibraryPath"
    -    val extraLibraryPath = conf.getOption(libraryPathOption).map(p => s"-Djava.library.path=$p")
    -    val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
    +    val prefixEnv = conf.getOption("spark.executor.extraLibraryPath").map { p =>
    --- End diff --
    
    I would add a comment here that explains why we have to set the environment this way:
    ```
    // Set the environment variable through the prefix because we need to preserve the existing value
    val prefixEnv = ...
    ```




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r18663528
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,20 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPath = Utils.libraryPath
    +      val pathSeparator = File.pathSeparator
    +      val sysLibraryPath = sys.env.get(libraryPath)
    --- End diff --
    
    Hmmm... I wonder if this should be done. The way it is, the code creating the `Command` instance can choose the command's environment (e.g., it can set it to `sys.env` to inherit the environment or can create something completely different). Here you're forcing the caller's environment onto the child `Command`...




[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60945796
  
    retest this please




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19490665
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmitDriverBootstrapper.scala ---
    @@ -130,6 +121,14 @@ private[spark] object SparkSubmitDriverBootstrapper {
         // Start the driver JVM
         val filteredCommand = command.filter(_.nonEmpty)
         val builder = new ProcessBuilder(filteredCommand)
    +    val env = builder.environment()
    +
    +    if (submitLibraryPath.isEmpty && confLibraryPath.nonEmpty) {
    +      // SPARK_SUBMIT_LIBRARY_PATH is already captured in JAVA_OPTS
    --- End diff --
    
    Wait, sorry actually this comment doesn't make any sense anymore. It doesn't even belong here.




[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19471157
  
    --- Diff: bin/spark-class ---
    @@ -81,7 +81,11 @@ case "$1" in
         OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_SUBMIT_OPTS"
         OUR_JAVA_MEM=${SPARK_DRIVER_MEMORY:-$DEFAULT_MEM}
         if [ -n "$SPARK_SUBMIT_LIBRARY_PATH" ]; then
    -      OUR_JAVA_OPTS="$OUR_JAVA_OPTS -Djava.library.path=$SPARK_SUBMIT_LIBRARY_PATH"
    +      if [[ $OSTYPE == darwin* ]]; then
    --- End diff --
    
    On Windows, we should use `bin/spark-class2.cmd` instead, shouldn't we?



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by andrewor14 <gi...@git.apache.org>.
Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19254039
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,19 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command,
    +    env: Map[String, String] = sys.env): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPathName = Utils.libraryPathName
    +      val cmdLibraryPath = command.environment.get(libraryPathName)
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
    +      command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
    +    } else {
    +      command.environment
    --- End diff --
    
    Don't we also want to add `env.get(libraryPathName)` here?
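    
    A hypothetical sketch of that suggestion, merging the parent's library path even when the command carries no extra entries (assumes the surrounding `CommandUtils` context for `Command` and `Utils`; the PR may resolve this differently):
    ```
    import java.io.File
    
    def envWithInheritedLibraryPath(
        command: Command,
        env: Map[String, String] = sys.env): Map[String, String] = {
      val name = Utils.libraryPathName
      // Command's own value first, then the parent's, joined by the OS separator.
      val joined = (command.environment.get(name) ++ env.get(name)).mkString(File.pathSeparator)
      if (joined.isEmpty) command.environment else command.environment + (name -> joined)
    }
    ```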



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by SparkQA <gi...@git.apache.org>.
Github user SparkQA commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58531176
  
      [QA tests have started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/21539/consoleFull) for   PR 2711 at commit [`604a37b`](https://github.com/apache/spark/commit/604a37b6172620de505578025790e2f72eb845f2).
     * This patch merges cleanly.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60755440
  
    @andrewor14  @vanzin 
    Since I don't have a Windows machine, the Windows tests may take a little time.



[GitHub] spark pull request: [SPARK-1720][SPARK-1719] use LD_LIBRARY_PATH i...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-61043113
  
    @andrewor14 
    I've tested on Linux (YARN, Mesos) and Mac OS X (standalone).



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-60791390
  
    @witgo I'm not super worried about Windows and, personally, I'd be ok with leaving that for a separate bug, but that's me. :-) Let's see what Andrew (or others) think.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19479453
  
    --- Diff: core/src/main/scala/org/apache/spark/util/Utils.scala ---
    @@ -1684,6 +1689,34 @@ private[spark] object Utils extends Logging {
         method.invoke(obj, values.toSeq: _*)
       }
     
    +  /**
    +   * Return the current system LD_LIBRARY_PATH name
    +   */
    +  def libraryPathName: String = {
    +    if (isWindows) {
    +      "PATH"
    +    } else if (isMac) {
    +      "DYLD_LIBRARY_PATH"
    +    } else {
    +      "LD_LIBRARY_PATH"
    +    }
    +  }
    +
    +  def prefixLibraryPath(libraryPaths: Seq[String]): String = {
    +    val libraryPathScriptVar = if (isWindows) {
    +      s"%${libraryPathName}%"
    +    } else {
    +      "$" + libraryPathName
    +    }
    +    val libraryPath = (libraryPaths :+ libraryPathScriptVar).mkString(File.pathSeparator)
    +    val ampersand = if (Utils.isWindows) {
    +      " &"
    --- End diff --
    
    Reference here: 
    http://stackoverflow.com/questions/21439457/windows-batch-file-multiple-conditions-and-lines?rq=1
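    
    To illustrate the intent (the function's tail is truncated in the diff above, so this only shows the joined-path portion; the values are made up):
    ```
    // Linux: new entries come first; the existing variable is re-expanded at run time.
    (Seq("/opt/native") :+ "$LD_LIBRARY_PATH").mkString(":")  // "/opt/native:$LD_LIBRARY_PATH"
    // Windows: PATH doubles as the library path and uses ';' as the separator.
    (Seq("""C:\native""") :+ "%PATH%").mkString(";")          // "C:\native;%PATH%"
    // The trailing " &" lets the `set PATH=...` assignment be chained with the
    // command that follows on the same batch line (per the reference above).
    ```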



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58388735
  
    Test PASSed.
    Refer to this link for build results (access rights to CI server needed): 
    https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/21467/
    Test PASSed.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by vanzin <gi...@git.apache.org>.
Github user vanzin commented on the pull request:

    https://github.com/apache/spark/pull/2711#issuecomment-58396693
  
    @witgo this looks ok, but it does have the problem of assuming all processes involved in the job are running the same OS. Granted, I think this is a problem in other places in Spark too, so maybe a more concerted effort to fix that should be started separately.
    
    Also, you probably want to take a look at `SparkSubmitDriverBootstrapper.scala` and also fix it.



[GitHub] spark pull request: [WIP][SPARK-1720][SPARK-1719] use LD_LIBRARY_P...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2711#discussion_r19258226
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala ---
    @@ -38,6 +40,19 @@ object CommandUtils extends Logging {
           command.arguments
       }
     
    +  def buildEnvironment(command: Command,
    +    env: Map[String, String] = sys.env): Map[String, String] = {
    +    val libraryPathEntries = command.libraryPathEntries
    +    if (libraryPathEntries.nonEmpty) {
    +      val libraryPathName = Utils.libraryPathName
    +      val cmdLibraryPath = command.environment.get(libraryPathName)
    +      val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
    +      command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
    +    } else {
    +      command.environment
    +    }
    +  }
    +
    --- End diff --
    
    The `Command` instance is created on a different machine, so its environment
    variables cannot be used directly on the local one.
    
    We can instead build a local `Command` instance:
    ```
      def buildLocalCommand(
          command: Command,
          substituteArguments: String => String,
          classPath: Seq[String] = Seq[String](),
          env: Map[String, String] = sys.env): Command = {
        val libraryPathEntries = command.libraryPathEntries
        val newEnvironment = if (libraryPathEntries.nonEmpty) {
          val libraryPathName = Utils.libraryPathName
          // Prepend the command's extra entries to its own library path setting,
          // then to the local environment's value, under the OS-specific variable
          // name (LD_LIBRARY_PATH, DYLD_LIBRARY_PATH, or PATH).
          val cmdLibraryPath = command.environment.get(libraryPathName)
          val libraryPaths = libraryPathEntries ++ cmdLibraryPath ++ env.get(libraryPathName)
          command.environment + ((libraryPathName, libraryPaths.mkString(File.pathSeparator)))
        } else {
          command.environment
        }
        Command(
          command.mainClass,
          command.arguments.map(substituteArguments),
          newEnvironment,
          command.classPathEntries ++ classPath,
          Seq[String](),  // library path entries are now folded into the environment
          command.javaOpts)
      }
    ```
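    
    A hypothetical call site on the worker side (names assumed for illustration, not taken from the PR):
    ```
    val localCommand = buildLocalCommand(appDesc.command, substituteVariables)
    val commandSeq = CommandUtils.buildCommandSeq(localCommand, memory, sparkHome.getAbsolutePath)
    ```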

