Posted to issues@spark.apache.org by "Adam Binford (Jira)" <ji...@apache.org> on 2022/08/27 17:45:00 UTC

[jira] [Created] (SPARK-40246) Logging isn't configurable via log4j2 with hadoop-provided profile

Adam Binford created SPARK-40246:
------------------------------------

             Summary: Logging isn't configurable via log4j2 with hadoop-provided profile
                 Key: SPARK-40246
                 URL: https://issues.apache.org/jira/browse/SPARK-40246
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 3.3.0
            Reporter: Adam Binford


When building Spark with -Phadoop-provided (or using the 3.3.0 build without Hadoop), there is no slf4j binding provided for log4j2, so the default log4j2 properties are ignored and logging isn't configurable via SparkContext.setLogLevel.
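For reference, the missing piece can be spotted by listing the logging jars shipped in the Spark jars directory (a quick check, assuming the stock layout of the tarball; the "with Hadoop" builds ship a log4j-slf4j-impl jar that is absent here):
{noformat}
# List the logging jars bundled with Spark. In the hadoop-provided /
# without-hadoop build, no slf4j-to-log4j2 binding jar shows up here.
ls "$SPARK_HOME"/jars | grep -iE 'log4j|slf4j'
{noformat}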

Reproduction on a fresh Ubuntu container:

{noformat}
# Fetch Hadoop 3.3.4 and the Spark 3.3.0 build without Hadoop
apt-get update
apt-get install -y wget
wget https://dlcdn.apache.org/hadoop/common/hadoop-3.3.4/hadoop-3.3.4.tar.gz
wget https://dlcdn.apache.org/spark/spark-3.3.0/spark-3.3.0-bin-without-hadoop.tgz
tar -xvf hadoop-3.3.4.tar.gz -C /opt
tar -xvf spark-3.3.0-bin-without-hadoop.tgz -C /opt
export HADOOP_HOME=/opt/hadoop-3.3.4/
export SPARK_HOME=/opt/spark-3.3.0-bin-without-hadoop/
# Java and Python are needed to run pyspark
apt-get install -y openjdk-11-jre-headless python3
export JAVA_HOME=/usr/lib/jvm/java-11-openjdk-amd64/
# Point Spark at Hadoop's jars, per the hadoop-provided setup
export SPARK_DIST_CLASSPATH=$($HADOOP_HOME/bin/hadoop classpath)
$SPARK_HOME/bin/pyspark
{noformat}
The log level starts at INFO and can't be changed with sc.setLogLevel.
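A possible workaround (a sketch only, not verified here; it assumes the jars excluded by the hadoop-provided profile are the log4j2 runtime and its slf4j binding, and that 2.17.2 matches what Spark 3.3.0 was built against) is to pull the missing jars from Maven Central into $SPARK_HOME/jars:
{noformat}
# Hypothetical fix: add the log4j2 runtime and the slf4j-to-log4j2
# binding so Spark's log4j2 properties take effect again.
cd $SPARK_HOME/jars
for a in log4j-api log4j-core log4j-slf4j-impl; do
  wget https://repo1.maven.org/maven2/org/apache/logging/log4j/$a/2.17.2/$a-2.17.2.jar
done
{noformat}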



