Posted to reviews@spark.apache.org by Pashugan <gi...@git.apache.org> on 2017/02/13 13:04:09 UTC

[GitHub] spark pull request #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM me...

GitHub user Pashugan opened a pull request:

    https://github.com/apache/spark/pull/16913

    [SPARK-15531] [DEPLOY] Complement launcher JVM memory settings to avoid the conflict with system-wide settings

    ## What changes were proposed in this pull request?
    
    It is well known that, on a server with a lot of memory, the JVM tries to allocate a huge chunk of it for the heap by default when it starts.
    
    I'm using the JAVA_TOOL_OPTIONS environment variable to override the default JVM settings and set an adequate system-wide heap size. When the command-line options are merged with JAVA_TOOL_OPTIONS, I get the following error, because my initial heap size (1g, for instance) exceeds the maximum heap size (128m) provided in the code:
    ```
    Error occurred during initialization of VM
    Incompatible minimum and maximum heap sizes specified
    ```
    I debugged the launcher scripts and, since there is no visible way to fix this outside the code, my proposal is to provide an initial heap size in the code too.
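
    For illustration, the conflict can be reproduced outside Spark with a minimal sketch (128m mirrors the value hardcoded for the launcher, 1g the system-wide value from the example above):
    ```
    # A system-wide initial heap size larger than a command-line maximum heap
    # size triggers the same VM initialization error:
    export JAVA_TOOL_OPTIONS="-Xms1g"
    java -Xmx128m -version
    # Picked up JAVA_TOOL_OPTIONS: -Xms1g
    # Error occurred during initialization of VM
    # Incompatible minimum and maximum heap sizes specified
    ```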
    
    ## How was this patch tested?
    
    Jenkins tests.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/Pashugan/spark fix_java_options_conflict

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16913.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16913
    
----
commit be6deabcf30b2ceda475e31d45626e211af7d195
Author: Pavel Knoblokh <pa...@users.noreply.github.com>
Date:   2017-02-13T08:13:04Z

    Complement launcher JVM memory settings to avoid the conflict with system-wide settings

----




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by AmplabJenkins <gi...@git.apache.org>.
Github user AmplabJenkins commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    Can one of the admins verify this patch?




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by Pashugan <gi...@git.apache.org>.
Github user Pashugan commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    There must be some misunderstanding. May I ask you to have a look at my micro-patch? It has nothing to do with the driver and its options. In fact, it fixes the invocation of the "launcher library", which is used to fill a bash array, which is in turn used to run the actual driver. With that in mind, my explanation above should hopefully become as clear as day. :)
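
    For reference, the mechanism is roughly the following (a simplified sketch of the relevant part of bin/spark-class; the actual script differs in details):

        # The launcher library prints the final command NUL-delimited; the
        # script reads it into a bash array and then execs the real driver JVM.
        build_command() {
          "$RUNNER" -Xmx128m -cp "$LAUNCH_CLASSPATH" org.apache.spark.launcher.Main "$@"
        }

        CMD=()
        while IFS= read -d '' -r ARG; do
          CMD+=("$ARG")
        done < <(build_command "$@")

        exec "${CMD[@]}"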




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    You should not be using JAVA_TOOL_OPTIONS. Set Spark's JVM properties directly with spark.driver.extraJavaOptions, for example.
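
    For example, per-application settings can be passed on the spark-submit command line (illustrative values; the application class and jar are placeholders):

        # Driver heap size has its own option; other JVM flags can go
        # through spark.driver.extraJavaOptions.
        spark-submit \
          --driver-memory 4g \
          --conf "spark.driver.extraJavaOptions=-XX:+UseG1GC" \
          --class com.example.MyApp myapp.jar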




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by Pashugan <gi...@git.apache.org>.
Github user Pashugan commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    JAVA_TOOL_OPTIONS doesn't take precedence over command-line options (as opposed to _JAVA_OPTIONS). Thus, the JVM tries to start with the initial heap size set in JAVA_TOOL_OPTIONS and the maximum heap size overridden by the spark-class script, which is currently 128m. If I set a system-wide initial heap size bigger than 128m, I get a conflict with the max heap size option hardcoded in the Spark code, so it looks to me like a Spark problem.
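
    For instance, the difference in precedence looks roughly like this (a sketch with illustrative values):

        # JAVA_TOOL_OPTIONS is prepended, so explicit command-line flags win:
        export JAVA_TOOL_OPTIONS="-Xms1g"
        java -Xms256m -Xmx512m -version    # starts fine, -Xms256m wins

        # _JAVA_OPTIONS is applied last and overrides the command line:
        unset JAVA_TOOL_OPTIONS
        export _JAVA_OPTIONS="-Xms1g"
        java -Xms256m -Xmx512m -version    # fails: 1g initial vs 512m maximum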




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    Well, you need to set a larger max heap too. I can't see how this is a Spark problem.




[GitHub] spark pull request #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM me...

Posted by Pashugan <gi...@git.apache.org>.
Github user Pashugan closed the pull request at:

    https://github.com/apache/spark/pull/16913




[GitHub] spark issue #16913: [SPARK-15531] [DEPLOY] Complement launcher JVM memory se...

Posted by srowen <gi...@git.apache.org>.
Github user srowen commented on the issue:

    https://github.com/apache/spark/pull/16913
  
    You should not set a global initial heap size like this, and this is pretty much exactly why. There is no need for the launcher to immediately request that much memory otherwise. Please close this.


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org