Posted to issues@spark.apache.org by "aaronHadoop (Jira)" <ji...@apache.org> on 2021/11/28 00:39:00 UTC

[jira] [Comment Edited] (SPARK-33340) spark run on kubernetes has Could not load KUBERNETES classes issue

    [ https://issues.apache.org/jira/browse/SPARK-33340?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17449948#comment-17449948 ] 

aaronHadoop edited comment on SPARK-33340 at 11/28/21, 12:38 AM:
-----------------------------------------------------------------

I use Spark 3.2.0, and it runs successfully.

I solved this problem and wrote it up in this document ([https://yujianxin.blog.csdn.net/article/details/121586848])
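
For anyone hitting the same error, a quick sanity check is to confirm that the Spark installation being used actually contains the spark-kubernetes module jar. A minimal sketch, assuming SPARK_HOME points at the installation used for submission:

```
# Expect something like spark-kubernetes_2.12-3.2.0.jar; if nothing is printed,
# this copy of Spark lacks the Kubernetes module and spark-submit will fail with
# "Could not load KUBERNETES classes".
ls "$SPARK_HOME/jars" | grep spark-kubernetes
```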


was (Author: aaronhadoop):
I use Spark 3.2.0, and it runs successfully. I recorded it in this document (https://yujianxin.blog.csdn.net/article/details/121586848)

> spark run on kubernetes has Could not load KUBERNETES classes issue
> -------------------------------------------------------------------
>
>                 Key: SPARK-33340
>                 URL: https://issues.apache.org/jira/browse/SPARK-33340
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 3.0.1
>         Environment: Kubernetes 1.16
> Spark (master branch code)
>            Reporter: Xiu Juan Xiang
>            Priority: Major
>
> Hi, I am trying to run Spark on my Kubernetes cluster (it is not a minikube cluster). I followed this doc: [https://spark.apache.org/docs/latest/running-on-kubernetes.html] to create the Spark Docker image and then submit the application step by step. However, it failed, and the log of the Spark driver showed the error below:
> ```
> + exec /usr/bin/tini -s -- /opt/spark/bin/spark-submit --conf spark.driver.bindAddress=172.30.140.13 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.apache.spark.deploy.PythonRunner file:/root/Work/spark/examples/src/main/python/wordcount.py
> Exception in thread "main" org.apache.spark.SparkException: Could not load KUBERNETES classes. This copy of Spark may not have been compiled with KUBERNETES support.
>     at org.apache.spark.deploy.SparkSubmit.error(SparkSubmit.scala:942)
>     at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:265)
>     at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:877)
>     at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
>     at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
>     at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
>     at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1013)
>     at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1022)
>     at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> ```
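> This error is raised by spark-submit when it cannot load the classes from the spark-kubernetes module on its classpath. Since the failing spark-submit here is the one launched inside the driver container, a rough way to check whether the image itself is missing those jars is the sketch below (the image name is taken from the submit command further down; /opt/spark is the Spark home inside the image):
> ```
> # List the jars baked into the image; an image built with Kubernetes support
> # should contain a spark-kubernetes_* jar.
> docker run --rm --entrypoint ls docker.io/bluebosh/spark:python3 /opt/spark/jars | grep kubernetes
> ```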
> I am not sure which step I am missing. I have been blocked on this for several days. Could you please help me with it? Thanks in advance!
>  
> By the way, below are the steps I followed:
>  # Prepare a Kubernetes cluster and check that I have the appropriate permissions to list, create, edit and delete pods;
> I am sure I have all the necessary permissions.
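> A quick way to verify those permissions (along the lines of the permission check suggested in the linked doc) is:
> ```
> # Each of these should print "yes" for the user or service account doing the submission.
> kubectl auth can-i list pods
> kubectl auth can-i create pods
> kubectl auth can-i edit pods
> kubectl auth can-i delete pods
> ```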
>  # Build distribution
> ```
> ./dev/make-distribution.sh --name custom-spark --pip --r --tgz -Psparkr -Phive -Phive-thriftserver -Pmesos -Pyarn -Pkubernetes
> ```
>  # Build the Spark Docker image
> ```
> ./bin/docker-image-tool.sh spark -t latest build
> ```
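> For reference, the linked doc passes the image repo via -r and, for a PySpark application like wordcount.py, additionally builds the Python binding image; something along these lines (with <repo> as a placeholder):
> ```
> # Base JVM image
> ./bin/docker-image-tool.sh -r <repo> -t my-tag build
> # Image with the Python bindings, needed to run .py applications
> ./bin/docker-image-tool.sh -r <repo> -t my-tag -p ./kubernetes/dockerfiles/spark/bindings/python/Dockerfile build
> ```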
>  # Submit the application
> ```
> ./bin/spark-submit --master k8s://https://c7.us-south.containers.cloud.ibm.com:31937 --deploy-mode cluster --name spark-pi --class org.apache.spark.examples.SparkPi --conf spark.executor.instances=5 --conf spark.kubernetes.container.image=docker.io/bluebosh/spark:python3 examples/src/main/python/wordcount.py
> ```
> BTW, I am sure the master URL is correct, and my Docker image contains Python.
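> For debugging, the driver pod can be inspected after submission with plain kubectl (the pod name is whatever the submission reports; shown here as a placeholder):
> ```
> # Find the driver pod created for the application, then pull its full log.
> kubectl get pods
> kubectl logs <spark-pi-driver-pod-name>
> ```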



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org