Posted to user@spark.apache.org by Dave Jaffe <dj...@vmware.com.INVALID> on 2019/06/11 01:15:14 UTC

Spark on Kubernetes - log4j.properties not read

I am running Spark on Kubernetes with Spark 2.4.3. I created a log4j.properties file in my local spark/conf directory and modified it so that the console (or, in the case of Kubernetes, the log) shows only warnings and higher (log4j.rootCategory=WARN, console). I then added the line
COPY conf /opt/spark/conf
to /root/spark/kubernetes/dockerfiles/spark/Dockerfile and built a new container.
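For reference, a minimal log4j.properties along those lines might look like the following (the appender settings are the standard defaults from Spark's conf/log4j.properties.template, not quoted from this thread; only the WARN root category is the change described above):

```properties
# Log everything at WARN and above to the console appender
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```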

However, when I run that under Kubernetes, the program runs successfully but /opt/spark/conf/log4j.properties is not used (I still see the INFO lines when I run kubectl logs <driver pod>).

I have tried other approaches, such as explicitly adding --properties-file to my spark-submit command, and even
--conf spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/conf/log4j.properties

My log4j.properties file is never seen.

How do I customize log4j.properties with Kubernetes?

Thanks, Dave Jaffe


RE: Spark on Kubernetes - log4j.properties not read

Posted by Dave Jaffe <dj...@vmware.com>.
That did the trick, Abhishek! Thanks for the explanation, that answered a lot
of questions I had.

Dave




RE: Spark on Kubernetes - log4j.properties not read

Posted by "Rao, Abhishek (Nokia - IN/Bangalore)" <ab...@nokia.com>.
Hi Dave,

As part of driver pod bringup, a configmap is created from all the Spark configuration parameters (under the name spark.properties) and mounted at /opt/spark/conf, so any other files baked into /opt/spark/conf are hidden by that mount. That is what is happening to your log4j.properties in this case. You could instead build the container with log4j.properties at some other location and point spark.driver.extraJavaOptions at that path.
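For example (a sketch only; the /opt/spark/log4j path, image name, and master URL below are illustrative assumptions, not values from this thread):

```shell
# Dockerfile fragment: bake the file in OUTSIDE /opt/spark/conf,
# so the spark.properties configmap mount cannot shadow it
#   COPY conf/log4j.properties /opt/spark/log4j/log4j.properties

# Then point the driver (and, if desired, the executors) at it:
spark-submit \
  --master k8s://https://<api-server>:6443 \
  --deploy-mode cluster \
  --conf spark.kubernetes.container.image=my-spark:2.4.3 \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/log4j/log4j.properties" \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:///opt/spark/log4j/log4j.properties" \
  ...
```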

Thanks and Regards,
Abhishek

From: Dave Jaffe <dj...@vmware.com.INVALID>
Sent: Tuesday, June 11, 2019 6:45 AM
To: user@spark.apache.org
Subject: Spark on Kubernetes - log4j.properties not read
