Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2018/08/09 16:04:00 UTC

[jira] [Updated] (SPARK-25078) Standalone does not work with spark.authenticate.secret and deploy-mode=cluster

     [ https://issues.apache.org/jira/browse/SPARK-25078?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Imran Rashid updated SPARK-25078:
---------------------------------
    Summary: Standalone does not work with spark.authenticate.secret and deploy-mode=cluster  (was: Standalone cluster mode does not work with spark.authenticate.secret)

> Standalone does not work with spark.authenticate.secret and deploy-mode=cluster
> -------------------------------------------------------------------------------
>
>                 Key: SPARK-25078
>                 URL: https://issues.apache.org/jira/browse/SPARK-25078
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.4.0
>            Reporter: Imran Rashid
>            Priority: Major
>
> When running a Spark standalone cluster with spark.authenticate.secret configured, you cannot submit an application in cluster mode, even with the right secret.  The driver fails with:
> {noformat}
> 18/08/09 08:17:21 INFO SecurityManager: SecurityManager: authentication enabled; ui acls disabled; users  with view permissions: Set(systest); groups with view permissions: Set(); users  with modify permissions: Set(systest); groups with modify permissions: Set()
> 18/08/09 08:17:21 ERROR SparkContext: Error initializing SparkContext.
> java.lang.IllegalArgumentException: requirement failed: A secret key must be specified via the spark.authenticate.secret config.
>         at scala.Predef$.require(Predef.scala:224)
>         at org.apache.spark.SecurityManager.initializeAuth(SecurityManager.scala:361)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:238)
>         at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:175)
>         at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:257)
>         at org.apache.spark.SparkContext.<init>(SparkContext.scala:424)
> ...
> {noformat}
> but it is actually performing the wrong check in {{SecurityManager.initializeAuth()}}.  The secret is there; it is just in the {{_SPARK_AUTH_SECRET}} environment variable (so it is not visible to another process).
> *Workaround*: in your program, pass a dummy secret into your SparkConf.  Its value does not matter at all; it is ignored later, and the secret from the environment variable is used when establishing connections.  E.g.:
> {noformat}
> import org.apache.spark.{SparkConf, SparkContext}
>
> val conf = new SparkConf()
> conf.setIfMissing("spark.authenticate.secret", "doesn't matter")
> val sc = new SparkContext(conf)
> {noformat}
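
A minimal sketch of the lookup order the report argues for, assuming nothing about the actual Spark patch: prefer an explicitly configured {{spark.authenticate.secret}}, then fall back to the {{_SPARK_AUTH_SECRET}} environment variable before failing. The object and method names here ({{SecretResolution}}, {{resolveSecret}}) are illustrative, not real Spark APIs; the configuration and environment are modeled as plain maps to keep the example self-contained.

```scala
// Hypothetical sketch of the corrected check: consult the env var as a
// fallback instead of requiring the config key unconditionally.
object SecretResolution {
  val SecretConfKey = "spark.authenticate.secret"
  val SecretEnvVar  = "_SPARK_AUTH_SECRET"

  // Returns the secret from the conf if present, otherwise from the env.
  def resolveSecret(conf: Map[String, String],
                    env: Map[String, String]): Option[String] =
    conf.get(SecretConfKey).orElse(env.get(SecretEnvVar))

  // Mirrors the failing require(): only throws when neither source has it.
  def initializeAuth(conf: Map[String, String],
                     env: Map[String, String]): String =
    resolveSecret(conf, env).getOrElse(
      throw new IllegalArgumentException(
        s"requirement failed: A secret key must be specified via the $SecretConfKey config."))
}
```

With this ordering, a cluster-mode driver that only has the env var set would pass the check, and the dummy-secret workaround above would become unnecessary.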



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
