Posted to issues@spark.apache.org by "Andreas Adamides (JIRA)" <ji...@apache.org> on 2019/03/05 13:13:00 UTC

[jira] [Updated] (SPARK-27059) spark-submit on kubernetes cluster does not recognise k8s --master property

     [ https://issues.apache.org/jira/browse/SPARK-27059?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andreas Adamides updated SPARK-27059:
-------------------------------------
    Description: 
I have successfully installed a Kubernetes cluster and can verify this by:


{{C:\windows\system32>kubectl cluster-info }}
{{Kubernetes master is running at https://<ip>:<port> }}
{{KubeDNS is running at https://<ip>:<port>/api/v1/namespaces/kube-system/services/kube-dns:dns/proxy}}
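For reference, the {{k8s://}} master URL that spark-submit expects can be derived straight from the kubectl configuration. This is a hedged sketch, not part of the original report; it assumes kubectl is on the PATH and that the minified kubeconfig exposes a single cluster:

```shell
# Sketch: derive the k8s:// master URL from the active kubeconfig.
# Assumes kubectl is installed and the current context points at the cluster.
if command -v kubectl >/dev/null 2>&1; then
  API_SERVER=$(kubectl config view --minify -o jsonpath='{.clusters[0].cluster.server}')
fi
# spark-submit expects the API server URL prefixed with "k8s://"
echo "k8s://${API_SERVER}"
```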

 

I am trying to run the SparkPi example with the Spark distribution downloaded from [https://spark.apache.org/downloads.html] (I tried versions 2.4.0 and 2.3.3):


{{spark-submit --master k8s://https://<ip>:<port> --deploy-mode cluster --name spark-pi --class org.apache.spark.examples.SparkPi --conf spark.executor.instances=2 --conf spark.kubernetes.container.image=gettyimages/spark c:\users\<username>\Desktop\spark-2.4.0-bin-hadoop2.7\examples\jars\spark-examples_2.11-2.4.0.jar}}

 

I am getting this error:

 

{{Error: Master must either be yarn or start with spark, mesos, local}}
{{Run with --help for usage help or --verbose for debug output}}
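That error message matches the master-URL validation in Spark releases before 2.3.0, which predate Kubernetes support. As an illustration only (this is not Spark's actual code, just a sketch of the scheme check's behaviour):

```shell
# Illustration only: rough shape of spark-submit's master-URL scheme check.
# Pre-2.3.0 releases accept only the first group; k8s:// was added in 2.3.0.
validate_master() {
  case "$1" in
    spark://*|mesos://*|yarn|local*) echo "accepted by all versions" ;;
    k8s://*)                         echo "accepted by 2.3.0+ only" ;;
    *)                               echo "rejected" ;;
  esac
}
validate_master "k8s://https://192.168.0.10:6443"   # accepted by 2.3.0+ only
```

A binary that rejects {{k8s://}} while listing only yarn, spark, mesos, and local is therefore behaving like a pre-2.3.0 spark-submit.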

 

I also tried:

 

{{spark-submit --help}}

 

to see what values the *--master* property accepts. This is what I get:

 

{{--master MASTER_URL spark://host:port, mesos://host:port, yarn, or local.}}

 

According to the documentation on running Spark workloads on Kubernetes ([https://spark.apache.org/docs/latest/running-on-kubernetes.html]), spark-submit should accept {{k8s}} as a master scheme; it is also listed among the possible Spark master URLs ([https://spark.apache.org/docs/latest/submitting-applications.html#master-urls]). Yet spark-submit does not even seem to recognise the k8s value for master.
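Since both the help text and the error message omit k8s, one possibility worth ruling out is that a different, pre-2.3.0 spark-submit earlier on the PATH is being invoked instead of the downloaded 2.4.0/2.3.3 one. A hedged diagnostic sketch:

```shell
# Diagnostic sketch: confirm which spark-submit actually resolves on PATH
# and what version it reports (k8s:// requires Spark 2.3.0 or later).
SPARK_SUBMIT=$(command -v spark-submit || true)
echo "resolved: ${SPARK_SUBMIT:-not found on PATH}"
if [ -n "$SPARK_SUBMIT" ]; then
  "$SPARK_SUBMIT" --version 2>&1 | head -n 5
fi
```

On Windows the equivalent lookup is {{where spark-submit}}; invoking the downloaded distribution's {{bin\spark-submit}} by its full path sidesteps PATH resolution entirely.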

 


> spark-submit on kubernetes cluster does not recognise k8s --master property
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-27059
>                 URL: https://issues.apache.org/jira/browse/SPARK-27059
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.3.3, 2.4.0
>            Reporter: Andreas Adamides
>            Priority: Blocker
>



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org