Posted to issues@spark.apache.org by "Andy Grove (Jira)" <ji...@apache.org> on 2019/09/01 15:03:00 UTC

[jira] [Resolved] (SPARK-28925) Update Kubernetes-client to 4.4.2 to be compatible with Kubernetes 1.13 and 1.14

     [ https://issues.apache.org/jira/browse/SPARK-28925?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andy Grove resolved SPARK-28925.
--------------------------------
    Resolution: Duplicate

> Update Kubernetes-client to 4.4.2 to be compatible with Kubernetes 1.13 and 1.14
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-28925
>                 URL: https://issues.apache.org/jira/browse/SPARK-28925
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.3.3, 2.4.3
>            Reporter: Eric
>            Priority: Minor
>
> Hello,
> If you use Spark with Kubernetes 1.13 or 1.14, you will see this error:
> {code:java}
> {"time": "2019-08-28T09:56:11.866Z", "lvl":"INFO", "logger": "org.apache.spark.internal.Logging", "thread":"kubernetes-executor-snapshots-subscribers-0","msg":"Going to request 1 executors from Kubernetes."}
> {"time": "2019-08-28T09:56:12.028Z", "lvl":"WARN", "logger": "io.fabric8.kubernetes.client.dsl.internal.WatchConnectionManager$2", "thread":"OkHttp https://kubernetes.default.svc/...","msg":"Exec Failure: HTTP 403, Status: 403 - "}
> java.net.ProtocolException: Expected HTTP 101 response but was '403 Forbidden'
> {code}
> Apparently the bug is fixed here: [https://github.com/fabric8io/kubernetes-client/pull/1669]
> We have compiled the Spark source code against kubernetes-client 4.4.2, and it works well on our cluster. We are running Kubernetes 1.13.10.
>  
> Could it be possible to update that dependency version?
>  
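> For reference, a sketch of how one might rebuild Spark against the newer client by overriding the Maven property on the command line. The property name {{kubernetes-client.version}} is an assumption here; please verify the exact name in the pom.xml of your Spark checkout before relying on it:
> {code:bash}
> # Hypothetical build sketch -- the property name kubernetes-client.version
> # is an assumption; check the pom.xml of your Spark source tree.
> ./build/mvn -Pkubernetes -Dkubernetes-client.version=4.4.2 -DskipTests clean package
> {code}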
> Thanks!



--
This message was sent by Atlassian Jira
(v8.3.2#803003)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org