Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/08/10 13:39:00 UTC
[jira] [Resolved] (SPARK-32554) Update the k8s document according to the current development status
[ https://issues.apache.org/jira/browse/SPARK-32554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun resolved SPARK-32554.
-----------------------------------
Fix Version/s: 3.1.0
Resolution: Fixed
Issue resolved by pull request 29368
[https://github.com/apache/spark/pull/29368]
> Update the k8s document according to the current development status
> -------------------------------------------------------------------
>
> Key: SPARK-32554
> URL: https://issues.apache.org/jira/browse/SPARK-32554
> Project: Spark
> Issue Type: Improvement
> Components: Documentation, Kubernetes
> Affects Versions: 3.0.1
> Reporter: Takeshi Yamamuro
> Assignee: Takeshi Yamamuro
> Priority: Minor
> Fix For: 3.1.0
>
>
> To help users understand more accurately the current development status of the k8s scheduler, this ticket targets updating the k8s document in the primary branch/branch-3.0:
> BEFORE:
> {code:java}
> The Kubernetes scheduler is currently experimental. In future versions, there may be behavioral changes around
> configuration, container images and entrypoints.{code}
> AFTER:
> {code:java}
> The Kubernetes scheduler is currently experimental. The most basic parts are getting stable, but Dynamic
> Resource Allocation and External Shuffle Service need to be available before we officially announce GA for it.{code}
> This comes from a thread in the spark-dev mailing list: [http://apache-spark-developers-list.1001551.n3.nabble.com/spark-on-k8s-is-still-experimental-td29942.html]
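As context for the wording change above: Dynamic Resource Allocation on Kubernetes can already be enabled without an External Shuffle Service by using shuffle tracking, which Spark 3.0 introduced as an experimental alternative. A minimal configuration sketch (property names are real Spark 3.0 settings; the executor bounds are illustrative values, not from this ticket):
{code:java}
# spark-defaults.conf (sketch): dynamic allocation on k8s via shuffle
# tracking, since no External Shuffle Service is available on k8s yet.
spark.dynamicAllocation.enabled                   true
# Track shuffle files so executors holding shuffle data are not removed;
# this substitutes for the missing External Shuffle Service.
spark.dynamicAllocation.shuffleTracking.enabled   true
# Illustrative scaling bounds (hypothetical values for this example).
spark.dynamicAllocation.minExecutors              1
spark.dynamicAllocation.maxExecutors              10
{code}
With shuffle tracking, executors that hold live shuffle data are kept until the data is no longer needed, which is why the AFTER text treats full Dynamic Resource Allocation plus External Shuffle Service support as remaining GA blockers rather than already-solved problems.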
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org