Posted to issues@spark.apache.org by "MINET J-Sébastien (JIRA)" <ji...@apache.org> on 2019/06/20 07:12:00 UTC

[jira] [Created] (SPARK-28119) Cannot read environment variable inside custom property file

MINET J-Sébastien created SPARK-28119:
-----------------------------------------

             Summary: Cannot read environment variable inside custom property file
                 Key: SPARK-28119
                 URL: https://issues.apache.org/jira/browse/SPARK-28119
             Project: Spark
          Issue Type: Bug
          Components: Input/Output
    Affects Versions: 2.4.3
         Environment: Linux ubuntu 16.10, spark standalone 2.4.3
            Reporter: MINET J-Sébastien


Spark is compiled against commons-configuration version 1.6 because of the hadoop-client library dependency:
{code}
[INFO] | +- org.apache.hadoop:hadoop-client:jar:2.6.5:provided
[INFO] | | +- org.apache.hadoop:hadoop-common:jar:2.6.5:provided
[INFO] | | | +- commons-cli:commons-cli:jar:1.2:provided
[INFO] | | | +- xmlenc:xmlenc:jar:0.52:provided
[INFO] | | | +- commons-httpclient:commons-httpclient:jar:3.1:provided
[INFO] | | | +- commons-io:commons-io:jar:2.4:provided
[INFO] | | | +- commons-collections:commons-collections:jar:3.2.2:provided
[INFO] | | | +- commons-configuration:commons-configuration:jar:1.6:provided
{code}
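(The listing above can be reproduced with Maven's dependency plugin; the filter below assumes a project that depends on the Spark artifacts:)
{code:bash}
# Show where commons-configuration comes from in the dependency tree
mvn dependency:tree -Dincludes=commons-configuration:commons-configuration
{code}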
Here is my code:
{code:java}
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;
import org.apache.spark.sql.SparkSession;

public class SparkPropertyTest {
  public static void main(String... args) throws ConfigurationException {
    SparkSession sp = SparkSession.builder().getOrCreate();
    // Load the property file that was passed through spark.files.
    PropertiesConfiguration config = new PropertiesConfiguration();
    String file = sp.sparkContext().getConf().get("spark.files");
    sp.log().warn("Using property file {}", file);
    config.load(file);
    // Expect the interpolated value of ${env:PATH}, not the literal string.
    sp.log().warn(config.getString("env.path"));
  }
}
{code}
Here is the content added to *log4j.properties*:
{code}
env.path=${env:PATH}
{code}
If I launch the Spark job with the following VM options
{code:bash}
-Dspark.master=local[2] -Dspark.files=src/main/resources/log4j.properties
{code}
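(For reference, a rough spark-submit equivalent of those options would be the sketch below; app.jar and the packaging of the main class are assumptions:)
{code:bash}
# Rough spark-submit equivalent of the VM options above (app.jar is a placeholder)
spark-submit \
  --master local[2] \
  --files src/main/resources/log4j.properties \
  --class SparkPropertyTest \
  app.jar
{code}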
I get a result where the environment variable is printed as-is:
{code}
2019-06-20 07:09:03 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-06-20 07:09:05 WARN SparkSession:11 - Using property file src/main/resources/log4j.properties
2019-06-20 07:09:05 WARN SparkSession:13 - ${env:PATH}
{code}
Now I update my pom.xml to pull in commons-configuration 1.10:
{code:xml}
<dependency>
   <groupId>commons-configuration</groupId>
   <artifactId>commons-configuration</artifactId>
   <version>1.10</version>
</dependency>
{code}
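(An alternative sketch, if one prefers not to declare a direct dependency, is to pin the version through dependencyManagement; whether that is appropriate depends on the build and deployment mode:)
{code:xml}
<!-- Alternative: pin the transitive commons-configuration version -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
      <version>1.10</version>
    </dependency>
  </dependencies>
</dependencyManagement>
{code}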
With commons-configuration 1.10 on the classpath, the environment variable is now resolved:
{code}
2019-06-20 07:09:40 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2019-06-20 07:09:42 WARN SparkSession:11 - Using property file src/main/resources/log4j.properties
2019-06-20 07:09:42 WARN SparkSession:13 - /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
{code}
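The same difference can be reproduced without Spark at all, which suggests the ${env:...} lookup is simply not supported by commons-configuration 1.6. A minimal sketch (the class name InterpolationCheck is only for illustration), run once with 1.6 and once with 1.10 on the classpath:
{code:java}
import org.apache.commons.configuration.ConfigurationException;
import org.apache.commons.configuration.PropertiesConfiguration;

public class InterpolationCheck {
  public static void main(String... args) throws ConfigurationException {
    PropertiesConfiguration config = new PropertiesConfiguration();
    config.load("src/main/resources/log4j.properties");
    // Prints the literal "${env:PATH}" with commons-configuration 1.6,
    // but the resolved PATH value with 1.10.
    System.out.println(config.getString("env.path"));
  }
}
{code}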