Posted to issues@spark.apache.org by "Mateusz Kaczor (JIRA)" <ji...@apache.org> on 2019/03/20 08:47:00 UTC

[jira] [Commented] (SPARK-26606) parameters passed in extraJavaOptions are not being picked up

    [ https://issues.apache.org/jira/browse/SPARK-26606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16796924#comment-16796924 ] 

Mateusz Kaczor commented on SPARK-26606:
----------------------------------------

I think I've come across a similar issue (at least for the driver; executors work fine in my case).

I've even posted a question on Stack Overflow: [https://stackoverflow.com/questions/55244273/spark-2-4-0-submit-in-cluster-mode-why-is-rest-submission-server-required]

 

To sum up the problem:

Spark version 2.4.0, *standalone* cluster.

I'm submitting the app using spark-submit; in all cases exactly the same script is used, changing only the master port and the deploy mode.

I want to pass some extra Java options to the driver, hence I'm using the spark.driver.extraJavaOptions property (--conf "spark.driver.extraJavaOptions=-Dfoo=BAR").

I assume the variable was properly passed if it's listed in the System Properties table in the Environment tab of the app UI (the one running on port 4040).
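As a cross-check that doesn't rely on the UI, one can also read the property from inside the driver JVM itself. A minimal sketch (the class name PropCheck is mine; the key "foo" matches the -Dfoo=BAR example above):

```java
public class PropCheck {
    public static void main(String[] args) {
        // In a real Spark app this would run inside the driver's main();
        // here it is a standalone JVM for illustration. If -Dfoo=BAR was
        // actually applied to this JVM, getProperty returns "BAR",
        // otherwise the fallback "<not set>" is printed.
        String value = System.getProperty("foo", "<not set>");
        System.out.println("foo=" + value);
    }
}
```

Logging this at driver startup makes it easy to compare client mode, cluster mode via 7077, and cluster mode via 6066 directly.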

Here is what I've observed:

 
||Deploy mode||Deploy to 7077 (regular way)||Deploy to 6066 (via REST)||
|Client|Variables are passed correctly|N/A|
|Cluster|*{color:#ff0000}Variables are not passed{color}*|Variables are passed correctly|

 

All in all, it looks to me that if we want to pass system properties in cluster mode on a standalone cluster, *we have to* deploy via the REST endpoint.

I consider it a bug; please correct me if I'm wrong.
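Until this is resolved, a defensive pattern in the driver code is to fail fast with a clear message when an expected property is missing, instead of letting an empty value propagate. A sketch (the class name RequiredProp is mine; app.env is just an example key from the original report):

```java
public class RequiredProp {
    // Returns the value of a required -D system property, or throws with a
    // message pointing at the likely cause instead of failing later with an
    // opaque "empty param" error.
    static String require(String key) {
        String value = System.getProperty(key);
        if (value == null || value.isEmpty()) {
            throw new IllegalStateException(
                "Missing JVM system property -D" + key
                + "; was spark.driver.extraJavaOptions actually applied?");
        }
        return value;
    }

    public static void main(String[] args) {
        // Simulate -Dapp.env=dev having been passed on the command line.
        System.setProperty("app.env", "dev");
        System.out.println(require("app.env"));
    }
}
```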

 

> parameters passed in extraJavaOptions are not being picked up 
> --------------------------------------------------------------
>
>                 Key: SPARK-26606
>                 URL: https://issues.apache.org/jira/browse/SPARK-26606
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.3.1
>            Reporter: Ravindra
>            Priority: Major
>              Labels: java, spark
>
> spark.driver.extraJavaOptions and spark.executor.extraJavaOptions are not being picked up. Even though I can see the parameters being passed in the Spark launch command, they are not picked up for some unknown reason. My source code throws an error stating that the Java params are empty.
>  
> This is my spark submit command: 
>     output=`spark-submit \
>  --class com.demo.myApp.App \
>  --conf 'spark.executor.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>  --conf 'spark.driver.extraJavaOptions=-Dapp.env=dev -Dapp.country=US -Dapp.banner=ABC -Doracle.net.tns_admin=/work/artifacts/oracle/current -Djava.security.egd=file:/dev/./urandom' \
>  --executor-memory "$EXECUTOR_MEMORY" \
>  --executor-cores "$EXECUTOR_CORES" \
>  --total-executor-cores "$TOTAL_CORES" \
>  --driver-memory "$DRIVER_MEMORY" \
>  --deploy-mode cluster \
>  /home/spark/asm//current/myapp-*.jar 2>&1 &`
>  
>  
> Is there any other way I can access the Java params without using extraJavaOptions?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org