Posted to issues@spark.apache.org by "nouha bergaoui (Jira)" <ji...@apache.org> on 2019/11/04 17:35:00 UTC
[jira] [Created] (SPARK-29745) Couldn't submit Spark application to Kubernetes in versions 1.15.3/1.15.4/1.15.5
nouha bergaoui created SPARK-29745:
--------------------------------------
Summary: Couldn't submit Spark application to Kubernetes in versions 1.15.3/1.15.4/1.15.5
Key: SPARK-29745
URL: https://issues.apache.org/jira/browse/SPARK-29745
Project: Spark
Issue Type: Bug
Components: Kubernetes
Affects Versions: 2.4.4, 2.4.3
Reporter: nouha bergaoui
spark-submit (cluster mode on Kubernetes) systematically fails for the spark-pi sample application, producing the log below.
The same submission works against Kubernetes 1.15.2 and earlier.
(This library appears to be responsible for the bug: [https://github.com/fabric8io/kubernetes-client])
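For reference, a typical cluster-mode submission of the spark-pi sample looks like the sketch below. The API server address, namespace, service account, and image name are placeholders, not values from this report; the failure occurs regardless of these settings once the API server runs 1.15.3 or later.

```shell
# Hypothetical spark-pi submission against a Kubernetes master (Spark 2.4.x).
# Replace the master URL, image, and service account with your own values.
./bin/spark-submit \
  --master k8s://https://kubernetes.example.com:6443 \
  --deploy-mode cluster \
  --name spark-pi \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.namespace=default \
  --conf spark.kubernetes.authenticate.driver.serviceAccountName=spark \
  --conf spark.kubernetes.container.image=spark:2.4.4 \
  local:///opt/spark/examples/jars/spark-examples_2.11-2.4.4.jar
```

With an affected API server version, this command fails before the driver pod starts, with the `Failed to start websocket` exception shown below.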
log4j:WARN No appenders could be found for logger (io.fabric8.kubernetes.client.Config).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See [http://logging.apache.org/log4j/1.2/faq.html#noconfig] for more info.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Exception in thread "main" io.fabric8.kubernetes.client.KubernetesClientException: Failed to start websocket
at io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager$2.onFailure(WatchConnectionManager.java:207)
at okhttp3.internal.ws.RealWebSocket.failWebSocket(RealWebSocket.java:543)
at okhttp3.internal.ws.RealWebSocket$2.onFailure(RealWebSocket.java:208)
at okhttp3.RealCall$AsyncCall.execute(RealCall.java:148)
at okhttp3.internal.NamedRunnable.run(NamedRunnable.java:32)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
--
This message was sent by Atlassian Jira
(v8.3.4#803005)