Posted to issues@spark.apache.org by "Jagadeeswara Rao (Jira)" <ji...@apache.org> on 2020/08/17 14:56:00 UTC

[jira] [Commented] (SPARK-31800) Unable to disable Kerberos when submitting jobs to Kubernetes

    [ https://issues.apache.org/jira/browse/SPARK-31800?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17179044#comment-17179044 ] 

Jagadeeswara Rao commented on SPARK-31800:
------------------------------------------

[~drahkar] [~devaraj]

Looks like I am also running into the same issue. Once I set the property *spark.kubernetes.file.upload.path*, the submission gets further but then fails to launch the driver pod. Below are my observations; I am testing a Spark interactive session through Livy.
 # The Livy pod (the client) successfully uploads the jars/configs to the path set in *spark.kubernetes.file.upload.path*.
 # When the Spark driver launches, it looks for those jars/configs under *spark.kubernetes.file.upload.path*, but it cannot locate them and fails (see the sketch after this list).
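
The likely root cause: *spark.kubernetes.file.upload.path* here points at a local /tmp directory, which exists only in the Livy pod's filesystem, so the driver pod can never see the uploaded files. The Spark on Kubernetes documentation expects this path to be on a Hadoop-compatible shared filesystem (for example HDFS or S3) reachable from both the client and the driver. A minimal sketch with a hypothetical S3 staging bucket (the bucket name and credentials are placeholders, and the submitting client needs the hadoop-aws jars on its classpath):

{code:java}
# Hedged sketch: my-staging-bucket and the keys are placeholders, not values
# from this setup. The point is that the upload path must live on shared
# storage, not on a pod-local /tmp directory.
./bin/spark-submit \
  --master k8s://https://{api_hostname} \
  --deploy-mode cluster \
  --conf spark.kubernetes.file.upload.path=s3a://my-staging-bucket/spark-uploads \
  --conf spark.hadoop.fs.s3a.access.key=<access_key> \
  --conf spark.hadoop.fs.s3a.secret.key=<secret_key> \
  --class org.apache.spark.examples.SparkPi \
  /path/to/local/spark-examples_2.12-3.0.0.jar
{code}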

*Livy uploading jars*

{code:java}
20/08/17 14:12:39 INFO LineBufferedStream: 20/08/17 14:12:39 INFO KubernetesUtils: Uploading file: /opt/livy/rsc-jars/asm-5.0.4.jar to dest: /tmp/spark-upload-f9017f71-aa73-4820-a2f5-e15fd1850b8f/asm-5.0.4.jar...
{code}

*Below are the errors from the driver pod:*

{code:java}
+ CMD=("$SPARK_HOME/bin/spark-submit" --conf "spark.driver.bindAddress=$SPARK_DRIVER_BIND_ADDRESS" --deploy-mode client "$@")
+ exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=10.4.2.80 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.livy.rsc.driver.RSCDriverBootstrapper spark-internal
Setting spark.hadoop.yarn.resourcemanager.principal to livy
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-f9017f71-aa73-4820-a2f5-e15fd1850b8f/asm-5.0.4.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-66633035-69dd-4700-88ff-f3d300377025/livy-api-0.8.0-incubating-SNAPSHOT.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-5c31c92d-090b-4052-bc79-d15eeb88ea46/livy-rsc-0.8.0-incubating-SNAPSHOT.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-f2423ce3-97ba-4b08-b254-8d9ccfca74c5/minlog-1.3.0.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-3ee0f07c-2bd9-45a4-81d7-e6b2ee6d2ef4/netty-all-4.1.47.Final.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-2e5be186-40d8-496a-b068-634eb36e6b0d/objenesis-2.5.1.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-4743d490-be44-4b08-afcb-19e893135d38/reflectasm-1.11.3.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-0062da06-179b-437a-b052-b582785c21be/commons-codec-1.9.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-cb8ff9f5-9bad-41a4-a017-a2e61737b06f/livy-core_2.12-0.8.0-incubating-SNAPSHOT.jar does not exist, skipping.
20/08/17 14:12:55 WARN DependencyUtils: Local jar /tmp/spark-upload-96365437-5b71-49f5-b5ac-dc62d785dc6b/livy-repl_2.12-0.8.0-incubating-SNAPSHOT.jar does not exist, skipping.
Error: Failed to load class org.apache.livy.rsc.driver.RSCDriverBootstrapper
{code}
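
This matches the path-visibility theory: the /tmp/spark-upload-* directories were created inside the Livy pod, so from the driver pod's filesystem they do not exist, spark-submit skips every jar, and org.apache.livy.rsc.driver.RSCDriverBootstrapper is never on the classpath. A quick way to confirm (the pod names and namespace below are placeholders for this cluster):

{code:java}
# Hedged sketch: <livy-pod>, <driver-pod>, and <namespace> are placeholders.
# The upload directory should list the jars in the Livy (client) pod...
kubectl -n <namespace> exec <livy-pod> -- ls /tmp/spark-upload-f9017f71-aa73-4820-a2f5-e15fd1850b8f
# ...but the same path should fail in the driver pod, matching the
# "does not exist, skipping" warnings above.
kubectl -n <namespace> exec <driver-pod> -- ls /tmp/spark-upload-f9017f71-aa73-4820-a2f5-e15fd1850b8f
{code}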


> Unable to disable Kerberos when submitting jobs to Kubernetes
> -------------------------------------------------------------
>
>                 Key: SPARK-31800
>                 URL: https://issues.apache.org/jira/browse/SPARK-31800
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.0.0
>            Reporter: James Boylan
>            Priority: Major
>
> When you attempt to submit a job to Kubernetes using spark-submit with --master, it returns the exception:
> {code:java}
> 20/05/22 20:25:54 INFO KerberosConfDriverFeatureStep: You have not specified a krb5.conf file locally or via a ConfigMap. Make sure that you have the krb5.conf locally on the driver image.
> Exception in thread "main" org.apache.spark.SparkException: Please specify spark.kubernetes.file.upload.path property.
>         at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadFileUri(KubernetesUtils.scala:290)
>         at org.apache.spark.deploy.k8s.KubernetesUtils$.$anonfun$uploadAndTransformFileUris$1(KubernetesUtils.scala:246)
>         at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:238)
>         at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
>         at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
>         at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
>         at scala.collection.TraversableLike.map(TraversableLike.scala:238)
>         at scala.collection.TraversableLike.map$(TraversableLike.scala:231)
>         at scala.collection.AbstractTraversable.map(Traversable.scala:108)
>         at org.apache.spark.deploy.k8s.KubernetesUtils$.uploadAndTransformFileUris(KubernetesUtils.scala:245)
>         at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.$anonfun$getAdditionalPodSystemProperties$1(BasicDriverFeatureStep.scala:165)
>         at scala.collection.immutable.List.foreach(List.scala:392)
>         at org.apache.spark.deploy.k8s.features.BasicDriverFeatureStep.getAdditionalPodSystemProperties(BasicDriverFeatureStep.scala:163)
>         at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.$anonfun$buildFromFeatures$3(KubernetesDriverBuilder.scala:60)
>         at scala.collection.LinearSeqOptimized.foldLeft(LinearSeqOptimized.scala:126)
>         at scala.collection.LinearSeqOptimized.foldLeft$(LinearSeqOptimized.scala:122)
>         at scala.collection.immutable.List.foldLeft(List.scala:89)
>         at org.apache.spark.deploy.k8s.submit.KubernetesDriverBuilder.buildFromFeatures(KubernetesDriverBuilder.scala:58)
>         at org.apache.spark.deploy.k8s.submit.Client.run(KubernetesClientApplication.scala:98)
>         at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4(KubernetesClientApplication.scala:221)
>         at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.$anonfun$run$4$adapted(KubernetesClientApplication.scala:215)
>         at org.apache.spark.util.Utils$.tryWithResource(Utils.scala:2539)
>         at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:215)
>         at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:188)
>         at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
>         at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>         at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>         at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>         at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
>         at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
>         at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> 20/05/22 20:25:54 INFO ShutdownHookManager: Shutdown hook called
> 20/05/22 20:25:54 INFO ShutdownHookManager: Deleting directory /private/var/folders/p1/y24myg413wx1l1l52bsdn2hr0000gq/T/spark-c94db9c5-b8a8-414d-b01d-f6369d31c9b8 {code}
> No change in settings appears to disable Kerberos. This happens when running a simple execution of SparkPi on our lab cluster. The command being used is:
> {code:java}
> ./bin/spark-submit --master k8s://https://{api_hostname} --deploy-mode cluster --name spark-test --class org.apache.spark.examples.SparkPi --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark --conf spark.kubernetes.namespace=spark-jobs --conf spark.executor.instances=5 --conf spark.kubernetes.container.image={docker_registry}/spark:spark-3-test /opt/spark/examples/jars/spark-examples_2.12-3.0.0-preview2.jar{code}
> It is important to note that this same command, when run on Spark 2.4.5, works flawlessly once all of the RBAC accounts are properly set up in Kubernetes.
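
A note on the quoted report: in Spark 3.x cluster mode on Kubernetes, an application jar given as a plain path is treated as a client-local file that must be uploaded, which is what triggers the *spark.kubernetes.file.upload.path* error even with Kerberos out of the picture. For a jar that is already baked into the image, the local:// scheme tells spark-submit to skip the upload. A sketch of the quoted command with only that change:

{code:java}
# Hedged sketch: identical to the quoted command except the jar is referenced
# with local://, marking it as already present inside the container image.
./bin/spark-submit --master k8s://https://{api_hostname} --deploy-mode cluster \
  --name spark-test --class org.apache.spark.examples.SparkPi \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.namespace=spark-jobs \
  --conf spark.executor.instances=5 \
  --conf spark.kubernetes.container.image={docker_registry}/spark:spark-3-test \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.0.0-preview2.jar
{code}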



