Posted to reviews@spark.apache.org by witgo <gi...@git.apache.org> on 2014/05/05 12:28:11 UTC

[GitHub] spark pull request: Add missing description to spark-env.sh.templa...

GitHub user witgo opened a pull request:

    https://github.com/apache/spark/pull/646

    Add missing description to spark-env.sh.template

    

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/witgo/spark spark_env

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/646.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #646
    
----
commit 9a95a564593ad4071486abb51750cbf6c9b921ff
Author: witgo <wi...@qq.com>
Date:   2014-05-05T10:25:04Z

    Add missing description to spark-env.sh.template

----
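
For readers unfamiliar with the file, the variable this pull request documents
would be set in a copy of the template; the 2g value below is only illustrative:

    # conf/spark-env.sh (created by copying conf/spark-env.sh.template)
    # Illustrative value; any JVM memory string such as 1000M or 2G works
    export SPARK_DRIVER_MEMORY=2g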


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507600
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    Yes - does this work for you?
    
    https://github.com/apache/spark/pull/730/files



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507619
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    Yes, it works for me.
    `./bin/spark-shell --driver-memory 2g` =>
    ```
    /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms2g -Xmx2g org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
    ```
    But in `--driver-memory 2g --class org.apache.spark.repl.Main`, the `--driver-memory 2g` is unnecessary.
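
The mapping shown above, where a `--driver-memory` value becomes the -Xms/-Xmx
flags of the launched JVM, can be sketched roughly as follows; this is an
illustration of the behaviour, not the actual spark-shell or spark-submit code:

    # Rough sketch: turn a requested driver memory value into JVM heap flags,
    # mirroring the -Xms2g -Xmx2g seen in the command line above.
    DRIVER_MEM="${1:-512m}"   # e.g. "2g"; falls back to 512m when not given
    JAVA_OPTS="-Xms${DRIVER_MEM} -Xmx${DRIVER_MEM}"
    echo "java ${JAVA_OPTS} org.apache.spark.deploy.SparkSubmit ..."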



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on the pull request:

    https://github.com/apache/spark/pull/646#issuecomment-42763279
  
    A better solution [PR 730](https://github.com/apache/spark/pull/730)



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507330
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    `./bin/spark-shell --driver-memory 2g` => 
    ```
    /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home/bin/java -cp ::/Users/witgo/work/code/java/spark/dist/conf:/Users/witgo/work/code/java/spark/dist/lib/spark-assembly-1.0.0-SNAPSHOT-hadoop0.23.9.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.SparkSubmit spark-internal --driver-memory 2g --class org.apache.spark.repl.Main
    ```




[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by witgo <gi...@git.apache.org>.
Github user witgo closed the pull request at:

    https://github.com/apache/spark/pull/646



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507317
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    I don't think we want to document this anymore. It's been subsumed by passing `--driver-memory` to the spark shell or spark-submit tools.
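
For context, the flag-based approach referred to above looks like this; the
application class and jar path in the second command are placeholders:

    # Set driver memory per invocation instead of via spark-env.sh
    ./bin/spark-shell --driver-memory 2g
    ./bin/spark-submit --driver-memory 2g --class org.example.MyApp /path/to/my-app.jar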



[GitHub] spark pull request: Add missing description to spark-env.sh.templa...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the pull request:

    https://github.com/apache/spark/pull/646#issuecomment-42175763
  
    Can one of the admins verify this patch?



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507319
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    Also this isn't in the correct section. But again, I don't think we want it.



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by witgo <gi...@git.apache.org>.
Github user witgo commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507598
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    If so, the following is not correct:
    ```
    if [ ! -z $DRIVER_MEMORY ] && [ ! -z $DEPLOY_MODE ] && [ $DEPLOY_MODE = "client" ]; then
      export SPARK_MEM=$DRIVER_MEMORY
    fi
    ```



[GitHub] spark pull request: SPARK-1756: Add missing description to spark-e...

Posted by pwendell <gi...@git.apache.org>.
Github user pwendell commented on a diff in the pull request:

    https://github.com/apache/spark/pull/646#discussion_r12507563
  
    --- Diff: conf/spark-env.sh.template ---
    @@ -38,6 +38,7 @@
     # - SPARK_WORKER_INSTANCES, to set the number of worker processes per node
     # - SPARK_WORKER_DIR, to set the working directory of worker processes
     # - SPARK_WORKER_OPTS, to set config properties only for the worker (e.g. "-Dx=y")
    +# - SPARK_DRIVER_MEMORY, Memory for driver (e.g. 1000M, 2G) (Default: 512 Mb)
    --- End diff --
    
    There is a bug:
    https://github.com/apache/spark/blob/master/bin/spark-submit#L27
    
    This should say SPARK_DRIVER_MEMORY
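
A minimal sketch of one reading of that suggestion, assuming the rest of
bin/spark-submit stays as quoted earlier and only the variable name changes:

    # Sketch only: honor the variable documented in spark-env.sh.template
    if [ ! -z "$SPARK_DRIVER_MEMORY" ] && [ ! -z "$DEPLOY_MODE" ] && [ "$DEPLOY_MODE" = "client" ]; then
      export SPARK_MEM=$SPARK_DRIVER_MEMORY
    fi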

