Posted to issues@spark.apache.org by "Gabor Somogyi (JIRA)" <ji...@apache.org> on 2019/05/29 12:51:00 UTC

[jira] [Commented] (SPARK-23472) Add config properties for administrator JVM options

    [ https://issues.apache.org/jira/browse/SPARK-23472?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16850821#comment-16850821 ] 

Gabor Somogyi commented on SPARK-23472:
---------------------------------------

I've picked this up and started working on it.

> Add config properties for administrator JVM options
> ---------------------------------------------------
>
>                 Key: SPARK-23472
>                 URL: https://issues.apache.org/jira/browse/SPARK-23472
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Ryan Blue
>            Priority: Major
>
> In our environment, users may need to add JVM options to their Spark applications (e.g. to override log configuration). They typically use {{--driver-java-options}} or {{spark.executor.extraJavaOptions}}, both of which set the extraJavaOptions properties. We also have a set of administrator JVM options to apply, which select the garbage collector (G1GC) and kill the driver JVM on OOM.
> Both use cases need to set the extraJavaOptions properties, so they clobber one another. In the past we've maintained wrapper scripts, but that keeps our default options in scripts rather than in our spark-defaults.properties.
> I think we should add defaultJavaOptions properties that are combined with extraJavaOptions. Administrators could set defaultJavaOptions, and these would always be added to the JVM command line along with any user options, instead of being overwritten by user options.
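
To make the quoted proposal concrete, here is a minimal sketch of how the split could look, assuming the new properties end up being named spark.driver.defaultJavaOptions and spark.executor.defaultJavaOptions (illustrative names based on the proposal, not a confirmed API):

    # spark-defaults.conf, maintained by administrators (hypothetical new properties)
    spark.driver.defaultJavaOptions    -XX:+UseG1GC -XX:OnOutOfMemoryError='kill -9 %p'
    spark.executor.defaultJavaOptions  -XX:+UseG1GC -XX:OnOutOfMemoryError='kill -9 %p'

    # user-supplied options at submit time, unchanged from today
    spark-submit \
      --driver-java-options "-Dlog4j.configuration=file:my-log4j.properties" \
      --conf spark.executor.extraJavaOptions="-Dlog4j.configuration=file:my-log4j.properties" \
      ...

Under this sketch the administrator defaults would be placed on the JVM command line before the user's extraJavaOptions, so a user flag can still override a conflicting default while the administrator options are always present rather than being clobbered.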


