Posted to user@spark.apache.org by dawn breaks <20...@gmail.com> on 2019/02/13 08:21:54 UTC

Got fatal error when running spark 2.4.0 on k8s

We submit a Spark job to Kubernetes with the following command, but the driver pod
hits an error and exits. Can anybody help us solve it?

 ./bin/spark-submit \
    --master k8s://https://172.21.91.48:6443 \
    --deploy-mode cluster \
    --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.container.image=xxxRepo/spark:v2.4.0 \
    local:///opt/spark/examples/jars/spark-examples*.jar \
    5


The detailed error output is as follows:

2019-02-13 07:13:06 ERROR SparkContext:91 - Error initializing SparkContext.
org.apache.spark.SparkException: External scheduler cannot be instantiated
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2794)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:493)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2520)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:935)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:926)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:926)
        at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
        at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: io.fabric8.kubernetes.client.KubernetesClientException: An error has occurred.
        at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:62)
        at io.fabric8.kubernetes.client.KubernetesClientException.launderThrowable(KubernetesClientException.java:53)
        at io.fabric8.kubernetes.client.utils.HttpClientUtils.createHttpClient(HttpClientUtils.java:167)
        at org.apache.spark.deploy.k8s.SparkKubernetesClientFactory$.createKubernetesClient(SparkKubernetesClientFactory.scala:84)
        at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterManager.createSchedulerBackend(KubernetesClusterManager.scala:64)
        at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:2788)
        ... 20 more
Caused by: java.security.cert.CertificateException: Could not parse certificate: java.io.IOException: Empty input
        at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:110)
        at java.security.cert.CertificateFactory.generateCertificate(CertificateFactory.java:339)
        at io.fabric8.kubernetes.client.internal.CertUtils.createTrustStore(CertUtils.java:93)
        at io.fabric8.kubernetes.client.internal.CertUtils.createTrustStore(CertUtils.java:71)
        at io.fabric8.kubernetes.client.internal.SSLUtils.trustManagers(SSLUtils.java:114)
        at io.fabric8.kubernetes.client.internal.SSLUtils.trustManagers(SSLUtils.java:93)
        at io.fabric8.kubernetes.client.utils.HttpClientUtils.createHttpClient(HttpClientUtils.java:63)
        ... 23 more
Caused by: java.io.IOException: Empty input
        at sun.security.provider.X509Factory.engineGenerateCertificate(X509Factory.java:106)
        ... 29 more
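Since the failure happens in the driver pod, a first debugging step (not from the original thread) is to inspect the pod's state and log with standard kubectl commands; the `spark-role=driver` label below is the label Spark applies to driver pods, and `<driver-pod-name>` is a placeholder for your actual pod name:

```shell
# List driver pods via the spark-role label Spark puts on them
kubectl get pods -l spark-role=driver
# Event history: image pull failures, volume mount problems, restarts
kubectl describe pod <driver-pod-name>
# Driver log, which is where the stack trace above comes from
kubectl logs <driver-pod-name>
```

`kubectl describe` is the quickest way to see whether the service-account secret volume was mounted into the pod at all.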

Re: Got fatal error when running spark 2.4.0 on k8s

Posted by dawn breaks <20...@gmail.com>.
It seems that the fabric8 Kubernetes client can't parse the caCertFile at its
default location, /var/run/secrets/kubernetes.io/serviceaccount/ca.crt. Can
anybody give me some advice?
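The "Empty input" at the bottom of the trace suggests that file is empty rather than malformed. A minimal sketch of a sanity check, run inside the driver pod (e.g. via `kubectl exec`); `check_ca` is a hypothetical helper, not a Spark or fabric8 tool:

```shell
# Check that the mounted CA file is non-empty and parses as an X.509 cert.
check_ca() {
  ca="$1"
  # -s: file exists and has size > 0; openssl then confirms it parses
  if [ -s "$ca" ] && openssl x509 -in "$ca" -noout -subject >/dev/null 2>&1; then
    echo "ok"
  else
    echo "empty-or-invalid"
  fi
}

check_ca "/var/run/secrets/kubernetes.io/serviceaccount/ca.crt"
```

If this reports empty-or-invalid inside the pod, the problem is the service-account secret mount (or the cluster's CA configuration), not Spark itself.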

On Wed, 13 Feb 2019 at 16:21, dawn breaks <20...@gmail.com> wrote:


RE: Got fatal error when running spark 2.4.0 on k8s

Posted by "Sinha, Breeta (Nokia - IN/Bangalore)" <br...@nokia.com>.
Hi Dawn,

You are probably providing an incorrect image (it must be a Java-based Spark image), an incorrect master IP, or the wrong service account. Please verify the pod's permissions for the service account ('spark' in your case).
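For reference, the service-account setup from the Spark-on-Kubernetes documentation, plus a permissions probe; the `default` namespace is an assumption, so substitute your own:

```shell
# Create the 'spark' service account and grant it the built-in edit role
# (namespace 'default' is an assumption; adjust for your deployment)
kubectl create serviceaccount spark
kubectl create clusterrolebinding spark-role --clusterrole=edit \
  --serviceaccount=default:spark --namespace=default

# Probe what the account is actually allowed to do
kubectl auth can-i create pods --as=system:serviceaccount:default:spark
```

If `kubectl auth can-i` answers "no", the driver will fail long before executors start.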

I have tried executing the same program as below:

./spark-submit \
    --master k8s://https://<masterIP:port> \
    --deploy-mode cluster \
    --name spark-pi \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.executor.instances=1 \
    --conf spark.kubernetes.container.image=<Image> \
    --conf spark.kubernetes.namespace=<namespace> \
    local:///opt/spark/examples/jars/spark-examples*.jar 5

I was able to see "Pi is roughly 3.139774279548559" in the pod's output log.

Hope this will help! 😊

Regards,
Breeta


From: dawn breaks <20...@gmail.com>
Sent: Wednesday, February 13, 2019 1:52 PM
To: user@spark.apache.org
Subject: Got fatal error when running spark 2.4.0 on k8s
