Posted to issues@spark.apache.org by "Zach (Jira)" <ji...@apache.org> on 2022/03/01 21:55:00 UTC

[jira] [Commented] (SPARK-38380) Adding a demo/walkthrough section to Running Spark on Kubernetes

    [ https://issues.apache.org/jira/browse/SPARK-38380?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17499785#comment-17499785 ] 

Zach commented on SPARK-38380:
------------------------------

If it helps for discussion purposes, I'm happy to stage a draft PR with my idea and link it here. 

> Adding a demo/walkthrough section to Running Spark on Kubernetes
> ----------------------------------------------------------------
>
>                 Key: SPARK-38380
>                 URL: https://issues.apache.org/jira/browse/SPARK-38380
>             Project: Spark
>          Issue Type: Improvement
>          Components: Documentation
>    Affects Versions: 3.2.1
>            Reporter: Zach
>            Priority: Minor
>
> I propose adding a section to [Running Spark on Kubernetes - Spark 3.2.1 Documentation (apache.org)|https://spark.apache.org/docs/latest/running-on-kubernetes.html#configuration] that walks a user through the 'happy path' of:
>  # creating and configuring a cluster
>  # preparing an example spark job
>  # adding the JAR to the container image
>  # submitting the job to the cluster using spark-submit
>  # getting the results
> The current guide covers much of this in the abstract, but I had to do a lot of searching when setting this up on Kubernetes for the first time. I feel a concrete walkthrough would significantly improve the guide.
> The first section can be made extensible to cover local demo clusters (minikube, kind) as well as cloud providers (Amazon, Google, Azure).
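For discussion, the proposed happy path could be sketched roughly as below. This is a non-authoritative sketch, assuming a local minikube cluster, a Spark 3.2.1 binary distribution, and the bundled SparkPi example; the image tag (`demo`), resource sizes, and app name are placeholders, not anything prescribed by the docs.

```shell
# 1. Create and configure a local demo cluster (minikube assumed here;
#    Spark drivers and executors need a reasonable amount of resources).
minikube start --cpus 4 --memory 8192

# Build images into minikube's Docker daemon so no image registry is needed.
eval $(minikube docker-env)

# 2./3. Build the Spark container image from the unpacked distribution.
#    The bundled examples JAR is already inside the image, so no extra step
#    is needed for SparkPi; a custom job JAR would instead be COPY'd into a
#    derived image.
cd spark-3.2.1-bin-hadoop3.2
./bin/docker-image-tool.sh -t demo build

# 4. Submit the job with spark-submit. The k8s:// master URL is the API
#    server address (also visible via `kubectl cluster-info`), and
#    spark.kubernetes.container.image names the image built above.
#    local:// means a path inside the container, not on the client machine.
./bin/spark-submit \
  --master k8s://$(kubectl config view --minify -o jsonpath='{.clusters[0].cluster.server}') \
  --deploy-mode cluster \
  --name spark-pi-demo \
  --class org.apache.spark.examples.SparkPi \
  --conf spark.executor.instances=2 \
  --conf spark.kubernetes.container.image=spark:demo \
  local:///opt/spark/examples/jars/spark-examples_2.12-3.2.1.jar

# 5. Get the results: in cluster deploy mode the job's output ends up in
#    the driver pod's log.
kubectl logs $(kubectl get pods -l spark-role=driver -o name | head -1) | grep "Pi is roughly"
```

A walkthrough like this would only need the cluster-creation step swapped out per provider (EKS/GKE/AKS), since the image build and spark-submit steps are provider-independent.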



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org