Posted to issues@spark.apache.org by "Qiyuan Gong (Jira)" <ji...@apache.org> on 2022/06/17 13:42:00 UTC

[jira] [Commented] (SPARK-33332) Errors in running spark job on K8 with RPC Authentication Secret File properties

    [ https://issues.apache.org/jira/browse/SPARK-33332?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17555616#comment-17555616 ] 

Qiyuan Gong commented on SPARK-33332:
-------------------------------------

[~surbhi04] Thanks for raising this issue and providing a hotfix. It helps. :)

 

Root cause:

After checking the executor SparkConf, I found that only the driver has "spark.master" set in its SparkConf. That means [https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SecurityManager.scala#L360] returns None on executors, so the executor always gets None from secretKeyFromFile.

 

This bug slipped past the unit tests because in [https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/SecurityManagerSuite.scala#L473] "spark.master" is always provided.

 

Another hotfix, which avoids writing the env variable, is to simply remove the "sparkConf.getOption(SparkLauncher.SPARK_MASTER)" lookup and its match cases in [https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/SecurityManager.scala#L360], leaving:
{code:java}
    sparkConf.get(authSecretFileConf).map { secretFilePath =>
      val secretFile = new File(secretFilePath)
      logDebug("K8s secretFilePath:" + secretFilePath)
      require(secretFile.isFile, s"No file found containing the secret key at $secretFilePath.")
      val base64Key = Base64.getEncoder.encodeToString(Files.readAllBytes(secretFile.toPath))
      require(base64Key.nonEmpty, s"Secret key from file located at $secretFilePath is empty.")
      base64Key
    }{code}
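For illustration only, the same file-to-secret resolution can be sketched as a standalone Java snippet (the class and method names here are hypothetical, not Spark's; the real code lives in SecurityManager.scala in Scala). The point is that the secret can be resolved purely from the configured file path, with no dependency on "spark.master":

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;
import java.util.Optional;

public class SecretFromFile {

    // Hypothetical standalone version of the hotfix logic above: resolve the
    // RPC auth secret from a file path without consulting "spark.master".
    static Optional<String> secretKeyFromFile(Optional<String> secretFilePath) throws IOException {
        if (secretFilePath.isEmpty()) {
            // No secret-file property configured; fall through to other sources.
            return Optional.empty();
        }
        Path path = Path.of(secretFilePath.get());
        if (!Files.isRegularFile(path)) {
            throw new IllegalArgumentException(
                "No file found containing the secret key at " + path + ".");
        }
        // Base64-encode the raw file contents, mirroring the Scala snippet.
        String base64Key = Base64.getEncoder().encodeToString(Files.readAllBytes(path));
        if (base64Key.isEmpty()) {
            throw new IllegalArgumentException(
                "Secret key from file located at " + path + " is empty.");
        }
        return Optional.of(base64Key);
    }

    public static void main(String[] args) throws IOException {
        // Simulate a mounted K8s secret with a temp file.
        Path tmp = Files.createTempFile("spark-auth-secret", ".token");
        Files.writeString(tmp, "my-shared-secret");
        System.out.println(secretKeyFromFile(Optional.of(tmp.toString())).get());
        Files.delete(tmp);
    }
}
```

Because this path never touches "spark.master", it behaves identically on driver and executor pods, which is exactly why the fix works.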
Hope this information helps.

> Errors in running spark job on K8 with RPC Authentication Secret File properties
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-33332
>                 URL: https://issues.apache.org/jira/browse/SPARK-33332
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.0.0
>            Reporter: Surbhi Aggarwal
>            Priority: Major
>
> I am running a Spark job on Kubernetes with the RPC authentication feature. First I mount a K8s secret to a path on both the driver and the executors, and provide the same path in the auth configuration. Below is the spark-conf I pass for RPC auth.
> {code:java}
> --conf spark.authenticate=true
> --conf spark.kubernetes.driver.secrets.spark-secret-sa=/tmp/secrets
> --conf spark.kubernetes.executor.secrets.spark-secret-sa=/tmp/secrets
> --conf spark.authenticate.secret.driver.file=/tmp/secrets/token
> --conf spark.authenticate.secret.executor.file=/tmp/secrets/token
> {code}
>  
> Mounting of the secret succeeds on both driver and executor pods, and I can see the token file written to the containers. The driver loads the secret from the file, but the executor pods fail to load it and die with the exception below:
> {code:java}
> java.lang.IllegalArgumentException: A secret key must be specified via the spark.authenticate.secret config
>  at org.apache.spark.SecurityManager.$anonfun$getSecretKey$6(SecurityManager.scala:298)
>  at scala.Option.getOrElse(Option.scala:189)
>  at org.apache.spark.SecurityManager.getSecretKey(SecurityManager.scala:297){code}
>  
> The value of the spark.authenticate.secret.executor.file property is being passed in SPARK_JAVA_OPTS:
> SPARK_JAVA_OPT_8: -Dspark.authenticate.secret.driver.file=/tmp/secrets/token
> SPARK_JAVA_OPT_9: -Dspark.authenticate.secret.executor.file=/tmp/secrets/token
> I suspect that the values from the Java options are not being read into the SparkConf, ultimately leading to the failure.
>  
> I am not sure if it's a bug or a misconfiguration on my end. Any help is greatly appreciated.
>  



--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org