Posted to user@spark.apache.org by Agostino Calamita <ag...@gmail.com> on 2018/05/02 10:59:38 UTC

spark.executor.extraJavaOptions inside application code

Hi all,
I wrote an application that needs an environment variable. I can set this
variable with

--conf 'spark.executor.extraJavaOptions=-Dbasicauth=myuser:mypwd'

in spark-submit and it works well in standalone cluster mode.
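
An option passed with -D this way ends up as a JVM system property in each executor, so task code can read it with System.getProperty. A minimal illustrative sketch (the class name and dataset are placeholders; only the property name comes from the flag above):

import java.util.Arrays;

import org.apache.spark.api.java.function.MapFunction;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Encoders;
import org.apache.spark.sql.SparkSession;

public class BasicAuthProbe {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("Java Spark Solr ETL")
                .getOrCreate();

        Dataset<String> docs = spark.createDataset(
                Arrays.asList("doc1", "doc2"), Encoders.STRING());

        // map() runs on the executors, whose JVMs were started with -Dbasicauth=...
        Dataset<String> out = docs.map(
                (MapFunction<String, String>) doc ->
                        doc + " / auth present: " + (System.getProperty("basicauth") != null),
                Encoders.STRING());

        out.show();
        spark.stop();
    }
}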

But I want to set it inside the application code, because the variable
contains a password.

How can I do this?

I tried with:

    SparkSession spark = SparkSession
                  .builder()
                  .appName("Java Spark Solr ETL")
                  .getOrCreate();


    spark.sparkContext().conf().setExecutorEnv("spark.executor.extraJavaOptions",
                  "-Dbasicauth=myuser:mypassword");

but it doesn't work.

Thanks.

Re: spark.executor.extraJavaOptions inside application code

Posted by Vadim Semenov <va...@datadoghq.com>.
You need to set the config before creating the session:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

val conf = new SparkConf()
// Sets a JVM system property on every executor (read it with System.getProperty("basicauth"))
conf.set("spark.executor.extraJavaOptions", "-Dbasicauth=myuser:mypassword")
// The next two are equivalent: both define an executor environment variable
// named "basicauth" (read it with System.getenv("basicauth"))
conf.set("spark.executorEnv.basicauth", "myuser:mypassword")
conf.setExecutorEnv("basicauth", "myuser:mypassword")
val spark = SparkSession.builder().config(conf).appName("…").getOrCreate()
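
Since the original application is in Java, here is a minimal sketch of the same approach in Java (the class name and the ETL body are illustrative; only the property and app name come from the thread):

import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class SolrEtl {
    public static void main(String[] args) {
        // The executor JVM options must be in the conf before the session
        // (and its SparkContext) is created; they are ignored afterwards.
        SparkConf conf = new SparkConf()
                .set("spark.executor.extraJavaOptions", "-Dbasicauth=myuser:mypassword");

        SparkSession spark = SparkSession.builder()
                .config(conf)
                .appName("Java Spark Solr ETL")
                .getOrCreate();

        // ... ETL work ...

        spark.stop();
    }
}

Setting it on spark.sparkContext().conf() after getOrCreate(), as in the original snippet, has no effect, because the executors' JVM options are fixed when the context is created.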

