Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2019/01/11 19:05:00 UTC
[jira] [Commented] (SPARK-26606) extraJavaOptions does not work
[ https://issues.apache.org/jira/browse/SPARK-26606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16740668#comment-16740668 ]
Marcelo Vanzin commented on SPARK-26606:
----------------------------------------
Define "not working"?
For example, I see you have variable references inside single quotes. Single quotes prevent the shell from expanding those variables, so that won't work the way you're probably expecting it to.
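To illustrate the quoting issue Marcelo is pointing at, here is a minimal sketch (ENV=prod is a stand-in value, not from the report): inside single quotes the shell passes `$ENV` through literally, so the JVM would receive the string `-Dapp.env=$ENV` rather than the expanded value.

```shell
# Stand-in for one of the reporter's environment variables.
ENV=prod

# Single quotes suppress expansion; the literal text "$ENV" is passed on.
echo '-Dapp.env=$ENV'      # prints: -Dapp.env=$ENV

# Double quotes expand the variable while still keeping the value as one word.
echo "-Dapp.env=$ENV"      # prints: -Dapp.env=prod
```

So the `--conf` lines in the submit command would need double quotes for the variables to be substituted, e.g. `--conf "spark.executor.extraJavaOptions=-Dapp.env=$ENV ..."`.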
> extraJavaOptions does not work
> -------------------------------
>
> Key: SPARK-26606
> URL: https://issues.apache.org/jira/browse/SPARK-26606
> Project: Spark
> Issue Type: Bug
> Components: Spark Submit
> Affects Versions: 2.3.1
> Reporter: Ravindra
> Priority: Major
> Labels: java, spark
> Attachments: Screen Shot 2019-01-09 at 4.31.01 PM.png
>
>
> driver.extraJavaOptions and executor.extraJavaOptions do not work in the spark-submit command. Even though I see the parameters being passed fine in the Spark launch command, they are not being picked up, for some unknown reason.
>
> This is my spark submit command:
> output=`spark-submit \
> --class com.demo.myApp.App \
> --conf 'spark.executor.extraJavaOptions=-Dapp.env=$ENV -Dapp.country=$COUNTRY -Dapp.banner=$BANNER -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
> --conf 'spark.driver.extraJavaOptions=-Dapp.env=$ENV -Dapp.country=$COUNTRY -Dapp.banner=$BANNER -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
> --executor-memory "$EXECUTOR_MEMORY" \
> --executor-cores "$EXECUTOR_CORES" \
> --total-executor-cores "$TOTAL_CORES" \
> --driver-memory "$DRIVER_MEMORY" \
> --deploy-mode cluster \
> /home/spark/asm//current/myapp-*.jar 2>&1 &`
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org