Posted to issues@spark.apache.org by "Jonathan Lafleche (Jira)" <ji...@apache.org> on 2020/09/28 07:33:00 UTC

[jira] [Updated] (SPARK-33012) Upgrade fabric8 to 4.10.3 to support k8s 1.18.0

     [ https://issues.apache.org/jira/browse/SPARK-33012?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Jonathan Lafleche updated SPARK-33012:
--------------------------------------
    Description: 
According to [fabric8's compatibility matrix|https://github.com/fabric8io/kubernetes-client#compatibility-matrix], the current version (4.9.2) is not compatible with k8s 1.18.0.


In practice, we have not encountered any issues running Spark against k8s 1.18.0, but it seems reasonable to track fabric8's declared compatibility.
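For context, here is a minimal sketch (not Spark's actual scheduler-backend code; the namespace and object names are illustrative) of the kind of API-server call the fabric8 client serves in the Kubernetes backend. The proposed change only bumps the dependency version (io.fabric8:kubernetes-client 4.9.2 -> 4.10.3); the calling pattern below is unaffected.

{code:scala}
// Hypothetical smoke check, assuming kubeconfig / in-cluster config resolution
// handled by the client's default constructor.
import io.fabric8.kubernetes.client.{DefaultKubernetesClient, KubernetesClient}

object Fabric8SmokeCheck {
  def main(args: Array[String]): Unit = {
    val client: KubernetesClient = new DefaultKubernetesClient()
    try {
      // List pods in a namespace, the same style of call the Spark K8s
      // scheduler backend issues against the API server.
      val pods = client.pods().inNamespace("default").list().getItems
      println(s"API server reachable, ${pods.size()} pods in 'default'")
    } finally {
      client.close()
    }
  }
}
{code}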

  was:
According to [fabric8's compatibility matrix|[https://github.com/fabric8io/kubernetes-client#compatibility-matrix]], the current version (4.9.2) is not compatible with k8s 1.18.0.


In practice, we have not encountered any issues running spark against k8s 1.18.0, but it seems reasonable to track fabric8's declared compatibility.


> Upgrade fabric8 to 4.10.3 to support k8s 1.18.0
> -----------------------------------------------
>
>                 Key: SPARK-33012
>                 URL: https://issues.apache.org/jira/browse/SPARK-33012
>             Project: Spark
>          Issue Type: Improvement
>          Components: Kubernetes
>    Affects Versions: 3.0.1
>            Reporter: Jonathan Lafleche
>            Priority: Minor
>
> According to [fabric8's compatibility matrix|https://github.com/fabric8io/kubernetes-client#compatibility-matrix], the current version (4.9.2) is not compatible with k8s 1.18.0.
> In practice, we have not encountered any issues running Spark against k8s 1.18.0, but it seems reasonable to track fabric8's declared compatibility.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org