Posted to user@spark.apache.org by SamyaMaiti <sa...@gmail.com> on 2016/07/21 17:10:46 UTC

spark.driver.extraJavaOptions

Hi Team,

I am using *CDH 5.7.1* with Spark *1.6.0*.

I have a Spark Streaming application that reads from Kafka & does some
processing.

The issue is that while starting the application in CLUSTER mode, I want to
pass a custom log4j.properties file to both the driver & the executors.

*I have the below command:*

spark-submit \
--class xyx.search.spark.Boot \
--conf "spark.cores.max=6" \
--conf "spark.eventLog.enabled=true" \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Executor.properties" \
--deploy-mode "cluster" \
/some/path/search-spark-service-1.0.0.jar \
/some/path/conf/


*But it gives the below exception:*

SPARK_JAVA_OPTS was detected (set to
'-XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh ').
This is deprecated in Spark 1.0+.

Please instead use:
 - ./spark-submit with conf/spark-defaults.conf to set defaults for an
application
 - ./spark-submit with --driver-java-options to set -X options for a driver
 - spark.executor.extraJavaOptions to set -X options for executors
 - SPARK_DAEMON_JAVA_OPTS to set java options for standalone daemons (master
or worker)
        
2016-07-21 12:59:41 ERROR SparkContext:95 - Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.executor.extraJavaOptions and SPARK_JAVA_OPTS. Use only the former.
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:470)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6$$anonfun$apply$5.apply(SparkConf.scala:468)
	at scala.collection.immutable.List.foreach(List.scala:318)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:468)
	at org.apache.spark.SparkConf$$anonfun$validateSettings$6.apply(SparkConf.scala:454)


*Please note that the same command works on CDH 5.4 with Spark 1.3.0.*

Regards,
Sam





Re: spark.driver.extraJavaOptions

Posted by dhruve ashar <dh...@gmail.com>.
I am not familiar with the CDH distributions. However, from the exception,
you are setting SPARK_JAVA_OPTS as well as specifying the java options
individually for the driver and executor.

Check for the spark-env.sh file in your Spark config directory; you could
comment out or remove the SPARK_JAVA_OPTS entry there and move its value
into the required driver and executor java options, as sketched below.
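
For illustration, a minimal sketch of that change. The path
/etc/spark/conf/spark-env.sh and the exact entry are assumptions; check the
spark-env.sh your cluster actually uses:

# /etc/spark/conf/spark-env.sh  (path assumed; adjust for your cluster)
# Comment out the deprecated variable named in the warning:
# export SPARK_JAVA_OPTS="-XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh"

If you still want the OnOutOfMemoryError handler, fold it into the
extraJavaOptions values instead; both properties accept a space-separated
list of JVM flags, e.g. for the driver:

--conf "spark.driver.extraJavaOptions=-XX:OnOutOfMemoryError=/usr/lib64/cmf/service/common/killparent.sh -Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \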



-- 
-Dhruve Ashar

Re: spark.driver.extraJavaOptions

Posted by SamyaMaiti <sa...@gmail.com>.
Thanks for the reply, RK.

Using the first option, my application doesn't recognize
spark.driver.extraJavaOptions.

With the second option, the issue remains the same:

2016-07-21 12:59:41 ERROR SparkContext:95 - Error initializing SparkContext.
org.apache.spark.SparkException: Found both spark.executor.extraJavaOptions
and SPARK_JAVA_OPTS. Use only the former.

Looks like it is one of two issues:
1. Somewhere in my cluster SPARK_JAVA_OPTS is getting set, but I have done a
detailed review of my cluster and nowhere am I exporting this value (see the
grep sketch after this list).
2. There is some issue with this specific version of CDH (CDH 5.7.1 + Spark
1.6.0).
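
For tracking down a stray SPARK_JAVA_OPTS, a recursive grep over the likely
config locations can help. A minimal sketch; /etc/spark/conf and
/opt/cloudera are assumed CDH paths and may differ on your cluster:

grep -R "SPARK_JAVA_OPTS" /etc/spark/conf /opt/cloudera 2>/dev/null

The /usr/lib64/cmf/service/common/killparent.sh path in the warning points
at Cloudera Manager (cmf), which suggests the variable is being set by the
CDH-generated spark-env.sh rather than by anything exported by hand.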

-Sam





Re: spark.driver.extraJavaOptions

Posted by RK Aduri <rk...@collectivei.com>.
This has worked for me:

--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties" \

You may want to try it.

If that doesn't work, then you may use --properties-file, along the lines
of the sketch below.
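
For illustration, a hypothetical properties file plus the matching submit
command; the file name /some/path/my-spark.conf is an assumption:

# /some/path/my-spark.conf  (hypothetical file)
spark.cores.max                  6
spark.eventLog.enabled           true
spark.driver.extraJavaOptions    -Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Driver.properties
spark.executor.extraJavaOptions  -Dlog4j.configuration=file:/some/path/search-spark-service-log4j-Executor.properties

spark-submit \
--class xyx.search.spark.Boot \
--properties-file /some/path/my-spark.conf \
--deploy-mode "cluster" \
/some/path/search-spark-service-1.0.0.jar \
/some/path/conf/

Note that --properties-file is read instead of conf/spark-defaults.conf,
not merged with it, so copy over any defaults you still rely on.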


