Posted to user@spark.apache.org by bo yang <bo...@gmail.com> on 2022/02/23 04:05:45 UTC

One click to run Spark on Kubernetes

Hi Spark Community,

We built an open source tool to deploy and run Spark on Kubernetes with a
one-click command. For example, on AWS, it can automatically create an EKS
cluster, a node group, an NGINX ingress, and the Spark Operator. You can
then use curl or a CLI tool to submit a Spark application. After the
deployment, you can also install the Uber Remote Shuffle Service to enable
Dynamic Allocation on Kubernetes.
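
For illustration, a curl-style submission through the REST gateway could
look roughly like the Python sketch below. The gateway URL, endpoint path,
payload fields, and response shape are assumptions made up for this example,
not the tool's actual API.

    import requests

    # Hypothetical URL exposed by the NGINX ingress in front of the REST service.
    GATEWAY_URL = "https://spark-gateway.example.com/sparkService/submissions"

    payload = {
        "applicationName": "word-count",
        "mainApplicationFile": "s3a://my-bucket/jobs/word_count.py",
        "sparkVersion": "3.2.1",
        "driver": {"cores": 1, "memory": "2g"},
        "executor": {"instances": 2, "cores": 2, "memory": "4g"},
        # Dynamic allocation only helps once a remote shuffle service is installed.
        "sparkConf": {"spark.dynamicAllocation.enabled": "true"},
    }

    resp = requests.post(GATEWAY_URL, json=payload, timeout=30)
    resp.raise_for_status()
    print("Submission accepted:", resp.json())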

Anyone interested in using or working together on such a tool?

Thanks,
Bo

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
It uses Helm to deploy the Spark Operator and NGINX. For the other parts,
like creating the EKS cluster, IAM roles, and node groups, it uses the AWS
SDK to provision those AWS resources.
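
To make that flow concrete, here is a rough Python sketch of the same two
layers: the AWS SDK (boto3) for the EKS control plane and Helm for the Spark
Operator and NGINX ingress. It is illustrative only; the project's own
provisioning code is separate, and the role ARN, subnet IDs, and chart
coordinates below are assumptions.

    import subprocess
    import boto3

    eks = boto3.client("eks", region_name="us-west-2")

    # Provision the EKS control plane with the AWS SDK (placeholder role/subnets).
    eks.create_cluster(
        name="spark-on-k8s",
        roleArn="arn:aws:iam::123456789012:role/eksClusterRole",
        resourcesVpcConfig={"subnetIds": ["subnet-aaa", "subnet-bbb"]},
    )

    # Deploy the Spark Operator and the NGINX ingress controller with Helm.
    subprocess.run(["helm", "repo", "add", "spark-operator",
                    "https://googlecloudplatform.github.io/spark-on-k8s-operator"],
                   check=True)
    subprocess.run(["helm", "repo", "add", "ingress-nginx",
                    "https://kubernetes.github.io/ingress-nginx"], check=True)
    subprocess.run(["helm", "upgrade", "--install", "spark-operator",
                    "spark-operator/spark-operator",
                    "--namespace", "spark-operator", "--create-namespace"], check=True)
    subprocess.run(["helm", "upgrade", "--install", "ingress-nginx",
                    "ingress-nginx/ingress-nginx",
                    "--namespace", "ingress-nginx", "--create-namespace"], check=True)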

On Wed, Feb 23, 2022 at 11:28 AM Bjørn Jørgensen <bj...@gmail.com>
wrote:

> So if I get this right you will make a Helm <https://helm.sh> chart to
> deploy Spark and some other stuff on K8S?
>
> On Wed, Feb 23, 2022 at 17:49, bo yang <bo...@gmail.com> wrote:
>
>> Hi Sarath, let's follow up offline on this.
>>
>> On Wed, Feb 23, 2022 at 8:32 AM Sarath Annareddy <
>> sarath.annareddy@gmail.com> wrote:
>>
>>> Hi bo
>>>
>>> How do we start?
>>>
>>> Is there a plan? Onboarding, Arch/design diagram, tasks lined up etc
>>>
>>>
>>> Thanks
>>> Sarath
>>>
>>>
>>> Sent from my iPhone
>>>
>>> On Feb 23, 2022, at 10:27 AM, bo yang <bo...@gmail.com> wrote:
>>>
>>> 
>>> Hi Sarath, thanks for your interest and willing to contribute! The
>>> project supports local development using MiniKube. Similarly there is a one
>>> click command with one extra argument to deploy all components in MiniKube,
>>> and people could use that to develop on their local MacBook.
>>>
>>>
>>> On Wed, Feb 23, 2022 at 7:41 AM Sarath Annareddy <
>>> sarath.annareddy@gmail.com> wrote:
>>>
>>>> Hi bo
>>>>
>>>> I am interested to contribute.
>>>> But I don’t have free access to any cloud provider. Not sure how I can
>>>> get free access. I know Google, aws, azure only provides temp free access,
>>>> it may not be sufficient.
>>>>
>>>> Guidance is appreciated.
>>>>
>>>> Sarath
>>>>
>>>> Sent from my iPhone
>>>>
>>>> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
>>>>
>>>> 
>>>>
>>>> Right, normally people start with simple script, then add more stuff,
>>>> like permission and more components. After some time, people want to run
>>>> the script consistently in different environments. Things will become
>>>> complex.
>>>>
>>>> That is why we want to see whether people have interest for such a "one
>>>> click" tool to make things easy.
>>>>
>>>>
>>>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <
>>>> mich.talebzadeh@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> There are two distinct actions here; namely Deploy and Run.
>>>>>
>>>>> Deployment can be done by command line script with autoscaling. In the
>>>>> newer versions of Kubernetes you don't even need to specify the node
>>>>> types, you can leave it to the Kubernetes cluster  to scale up and down and
>>>>> decide on node type.
>>>>>
>>>>> The second point is the running spark that you will need to submit.
>>>>> However, that depends on setting up access permission, use of service
>>>>> accounts, pulling the correct dockerfiles for the driver and the executors.
>>>>> Those details add to the complexity.
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>>
>>>>>    view my Linkedin profile
>>>>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>>>>
>>>>>
>>>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>>>
>>>>>
>>>>>
>>>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>>>> any loss, damage or destruction of data or any other property which may
>>>>> arise from relying on this email's technical content is explicitly
>>>>> disclaimed. The author will in no case be liable for any monetary damages
>>>>> arising from such loss, damage or destruction.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>>>>
>>>>>> Hi Spark Community,
>>>>>>
>>>>>> We built an open source tool to deploy and run Spark on Kubernetes
>>>>>> with a one click command. For example, on AWS, it could automatically
>>>>>> create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then
>>>>>> you will be able to use curl or a CLI tool to submit Spark application.
>>>>>> After the deployment, you could also install Uber Remote Shuffle Service to
>>>>>> enable Dynamic Allocation on Kubernetes.
>>>>>>
>>>>>> Anyone interested in using or working together on such a tool?
>>>>>>
>>>>>> Thanks,
>>>>>> Bo
>>>>>>
>>>>>>
>
> --
> Bjørn Jørgensen
> Vestre Aspehaug 4, 6010 Ålesund
> Norge
>
> +47 480 94 297
>

Re: One click to run Spark on Kubernetes

Posted by Bjørn Jørgensen <bj...@gmail.com>.
So, if I get this right, you will make a Helm <https://helm.sh> chart to
deploy Spark and some other components on K8s?

On Wed, Feb 23, 2022 at 17:49, bo yang <bo...@gmail.com> wrote:

> Hi Sarath, let's follow up offline on this.
>
> On Wed, Feb 23, 2022 at 8:32 AM Sarath Annareddy <
> sarath.annareddy@gmail.com> wrote:
>
>> Hi bo
>>
>> How do we start?
>>
>> Is there a plan? Onboarding, Arch/design diagram, tasks lined up etc
>>
>>
>> Thanks
>> Sarath
>>
>>
>> Sent from my iPhone
>>
>> On Feb 23, 2022, at 10:27 AM, bo yang <bo...@gmail.com> wrote:
>>
>> 
>> Hi Sarath, thanks for your interest and willing to contribute! The
>> project supports local development using MiniKube. Similarly there is a one
>> click command with one extra argument to deploy all components in MiniKube,
>> and people could use that to develop on their local MacBook.
>>
>>
>> On Wed, Feb 23, 2022 at 7:41 AM Sarath Annareddy <
>> sarath.annareddy@gmail.com> wrote:
>>
>>> Hi bo
>>>
>>> I am interested to contribute.
>>> But I don’t have free access to any cloud provider. Not sure how I can
>>> get free access. I know Google, aws, azure only provides temp free access,
>>> it may not be sufficient.
>>>
>>> Guidance is appreciated.
>>>
>>> Sarath
>>>
>>> Sent from my iPhone
>>>
>>> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
>>>
>>> 
>>>
>>> Right, normally people start with simple script, then add more stuff,
>>> like permission and more components. After some time, people want to run
>>> the script consistently in different environments. Things will become
>>> complex.
>>>
>>> That is why we want to see whether people have interest for such a "one
>>> click" tool to make things easy.
>>>
>>>
>>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <
>>> mich.talebzadeh@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> There are two distinct actions here; namely Deploy and Run.
>>>>
>>>> Deployment can be done by command line script with autoscaling. In the
>>>> newer versions of Kubernetes you don't even need to specify the node
>>>> types, you can leave it to the Kubernetes cluster  to scale up and down and
>>>> decide on node type.
>>>>
>>>> The second point is the running spark that you will need to submit.
>>>> However, that depends on setting up access permission, use of service
>>>> accounts, pulling the correct dockerfiles for the driver and the executors.
>>>> Those details add to the complexity.
>>>>
>>>> Thanks
>>>>
>>>>
>>>>
>>>>    view my Linkedin profile
>>>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>>>
>>>>
>>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>>
>>>>
>>>>
>>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>>> any loss, damage or destruction of data or any other property which may
>>>> arise from relying on this email's technical content is explicitly
>>>> disclaimed. The author will in no case be liable for any monetary damages
>>>> arising from such loss, damage or destruction.
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>>>
>>>>> Hi Spark Community,
>>>>>
>>>>> We built an open source tool to deploy and run Spark on Kubernetes
>>>>> with a one click command. For example, on AWS, it could automatically
>>>>> create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then
>>>>> you will be able to use curl or a CLI tool to submit Spark application.
>>>>> After the deployment, you could also install Uber Remote Shuffle Service to
>>>>> enable Dynamic Allocation on Kubernetes.
>>>>>
>>>>> Anyone interested in using or working together on such a tool?
>>>>>
>>>>> Thanks,
>>>>> Bo
>>>>>
>>>>>

-- 
Bjørn Jørgensen
Vestre Aspehaug 4, 6010 Ålesund
Norge

+47 480 94 297

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
Hi Sarath, let's follow up offline on this.

On Wed, Feb 23, 2022 at 8:32 AM Sarath Annareddy <sa...@gmail.com>
wrote:

> Hi bo
>
> How do we start?
>
> Is there a plan? Onboarding, Arch/design diagram, tasks lined up etc
>
>
> Thanks
> Sarath
>
>
> Sent from my iPhone
>
> On Feb 23, 2022, at 10:27 AM, bo yang <bo...@gmail.com> wrote:
>
> 
> Hi Sarath, thanks for your interest and willing to contribute! The project
> supports local development using MiniKube. Similarly there is a one click
> command with one extra argument to deploy all components in MiniKube, and
> people could use that to develop on their local MacBook.
>
>
> On Wed, Feb 23, 2022 at 7:41 AM Sarath Annareddy <
> sarath.annareddy@gmail.com> wrote:
>
>> Hi bo
>>
>> I am interested to contribute.
>> But I don’t have free access to any cloud provider. Not sure how I can
>> get free access. I know Google, aws, azure only provides temp free access,
>> it may not be sufficient.
>>
>> Guidance is appreciated.
>>
>> Sarath
>>
>> Sent from my iPhone
>>
>> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
>>
>> 
>>
>> Right, normally people start with simple script, then add more stuff,
>> like permission and more components. After some time, people want to run
>> the script consistently in different environments. Things will become
>> complex.
>>
>> That is why we want to see whether people have interest for such a "one
>> click" tool to make things easy.
>>
>>
>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <
>> mich.talebzadeh@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> There are two distinct actions here; namely Deploy and Run.
>>>
>>> Deployment can be done by command line script with autoscaling. In the
>>> newer versions of Kubernetes you don't even need to specify the node
>>> types, you can leave it to the Kubernetes cluster  to scale up and down and
>>> decide on node type.
>>>
>>> The second point is the running spark that you will need to submit.
>>> However, that depends on setting up access permission, use of service
>>> accounts, pulling the correct dockerfiles for the driver and the executors.
>>> Those details add to the complexity.
>>>
>>> Thanks
>>>
>>>
>>>
>>>    view my Linkedin profile
>>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>>
>>>
>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>
>>>
>>>
>>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>>> any loss, damage or destruction of data or any other property which may
>>> arise from relying on this email's technical content is explicitly
>>> disclaimed. The author will in no case be liable for any monetary damages
>>> arising from such loss, damage or destruction.
>>>
>>>
>>>
>>>
>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>>
>>>> Hi Spark Community,
>>>>
>>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>>> a one click command. For example, on AWS, it could automatically create an
>>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>>> be able to use curl or a CLI tool to submit Spark application. After the
>>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>>> Dynamic Allocation on Kubernetes.
>>>>
>>>> Anyone interested in using or working together on such a tool?
>>>>
>>>> Thanks,
>>>> Bo
>>>>
>>>>

Re: One click to run Spark on Kubernetes

Posted by Sarath Annareddy <sa...@gmail.com>.
Hi bo

How do we start?

Is there a plan? Onboarding, an architecture/design diagram, tasks lined up, etc.?


Thanks 
Sarath 


Sent from my iPhone

> On Feb 23, 2022, at 10:27 AM, bo yang <bo...@gmail.com> wrote:
> 
> 
> Hi Sarath, thanks for your interest and willing to contribute! The project supports local development using MiniKube. Similarly there is a one click command with one extra argument to deploy all components in MiniKube, and people could use that to develop on their local MacBook.
> 
> 
>> On Wed, Feb 23, 2022 at 7:41 AM Sarath Annareddy <sa...@gmail.com> wrote:
>> Hi bo
>> 
>> I am interested to contribute. 
>> But I don’t have free access to any cloud provider. Not sure how I can get free access. I know Google, aws, azure only provides temp free access, it may not be sufficient.
>> 
>> Guidance is appreciated.
>> 
>> Sarath 
>> 
>> Sent from my iPhone
>> 
>>>> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
>>>> 
>>> 
>> 
>>> Right, normally people start with simple script, then add more stuff, like permission and more components. After some time, people want to run the script consistently in different environments. Things will become complex.
>>> 
>>> That is why we want to see whether people have interest for such a "one click" tool to make things easy.
>>> 
>>> 
>>>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <mi...@gmail.com> wrote:
>>>> Hi,
>>>> 
>>>> There are two distinct actions here; namely Deploy and Run.
>>>> 
>>>> Deployment can be done by command line script with autoscaling. In the newer versions of Kubernetes you don't even need to specify the node types, you can leave it to the Kubernetes cluster to scale up and down and decide on node type.
>>>> 
>>>> The second point is the running spark that you will need to submit. However, that depends on setting up access permission, use of service accounts, pulling the correct dockerfiles for the driver and the executors. Those details add to the complexity.
>>>> 
>>>> Thanks
>>>> 
>>>> 
>>>>    view my Linkedin profile
>>>> 
>>>> 
>>>> 
>>>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>>> 
>>>>  
>>>> Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.
>>>>  
>>>> 
>>>> 
>>>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>>>> Hi Spark Community,
>>>>> 
>>>>> We built an open source tool to deploy and run Spark on Kubernetes with a one click command. For example, on AWS, it could automatically create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will be able to use curl or a CLI tool to submit Spark application. After the deployment, you could also install Uber Remote Shuffle Service to enable Dynamic Allocation on Kubernetes.
>>>>> 
>>>>> Anyone interested in using or working together on such a tool?
>>>>> 
>>>>> Thanks,
>>>>> Bo
>>>>> 

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
Hi Sarath, thanks for your interest and willingness to contribute! The
project supports local development using Minikube. Similarly, there is a
one-click command with one extra argument that deploys all components in
Minikube, and people can use that to develop on their local MacBook.


On Wed, Feb 23, 2022 at 7:41 AM Sarath Annareddy <sa...@gmail.com>
wrote:

> Hi bo
>
> I am interested to contribute.
> But I don’t have free access to any cloud provider. Not sure how I can get
> free access. I know Google, aws, azure only provides temp free access, it
> may not be sufficient.
>
> Guidance is appreciated.
>
> Sarath
>
> Sent from my iPhone
>
> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
>
> 
>
> Right, normally people start with simple script, then add more stuff, like
> permission and more components. After some time, people want to run the
> script consistently in different environments. Things will become complex.
>
> That is why we want to see whether people have interest for such a "one
> click" tool to make things easy.
>
>
> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <
> mich.talebzadeh@gmail.com> wrote:
>
>> Hi,
>>
>> There are two distinct actions here; namely Deploy and Run.
>>
>> Deployment can be done by command line script with autoscaling. In the
>> newer versions of Kubernetes you don't even need to specify the node
>> types, you can leave it to the Kubernetes cluster  to scale up and down and
>> decide on node type.
>>
>> The second point is the running spark that you will need to submit.
>> However, that depends on setting up access permission, use of service
>> accounts, pulling the correct dockerfiles for the driver and the executors.
>> Those details add to the complexity.
>>
>> Thanks
>>
>>
>>
>>    view my Linkedin profile
>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>
>>
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>>
>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>
>>> Hi Spark Community,
>>>
>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>> a one click command. For example, on AWS, it could automatically create an
>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>> be able to use curl or a CLI tool to submit Spark application. After the
>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>> Dynamic Allocation on Kubernetes.
>>>
>>> Anyone interested in using or working together on such a tool?
>>>
>>> Thanks,
>>> Bo
>>>
>>>

Re: One click to run Spark on Kubernetes

Posted by Sarath Annareddy <sa...@gmail.com>.
Hi bo

I am interested in contributing, but I don’t have free access to any cloud
provider and am not sure how to get it. I know Google, AWS, and Azure only
provide temporary free tiers, which may not be sufficient.

Guidance is appreciated.

Sarath 

Sent from my iPhone

> On Feb 23, 2022, at 2:01 AM, bo yang <bo...@gmail.com> wrote:
> 
> 
> Right, normally people start with simple script, then add more stuff, like permission and more components. After some time, people want to run the script consistently in different environments. Things will become complex.
> 
> That is why we want to see whether people have interest for such a "one click" tool to make things easy.
> 
> 
>> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <mi...@gmail.com> wrote:
>> Hi,
>> 
>> There are two distinct actions here; namely Deploy and Run.
>> 
>> Deployment can be done by command line script with autoscaling. In the newer versions of Kubernetes you don't even need to specify the node types, you can leave it to the Kubernetes cluster to scale up and down and decide on node type.
>> 
>> The second point is the running spark that you will need to submit. However, that depends on setting up access permission, use of service accounts, pulling the correct dockerfiles for the driver and the executors. Those details add to the complexity.
>> 
>> Thanks
>> 
>> 
>>    view my Linkedin profile
>> 
>> 
>> 
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>> 
>>  
>> Disclaimer: Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.
>>  
>> 
>> 
>>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>> Hi Spark Community,
>>> 
>>> We built an open source tool to deploy and run Spark on Kubernetes with a one click command. For example, on AWS, it could automatically create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will be able to use curl or a CLI tool to submit Spark application. After the deployment, you could also install Uber Remote Shuffle Service to enable Dynamic Allocation on Kubernetes.
>>> 
>>> Anyone interested in using or working together on such a tool?
>>> 
>>> Thanks,
>>> Bo
>>> 

Re: One click to run Spark on Kubernetes

Posted by Bitfox <bi...@bitfox.top>.
From my viewpoint, if there were such a pay-as-you-go service, I would like
to use it. Otherwise I have to deploy a regular Spark cluster on GCP/AWS
etc., and the cost is not low.

Thanks.

On Wed, Feb 23, 2022 at 4:00 PM bo yang <bo...@gmail.com> wrote:

> Right, normally people start with simple script, then add more stuff, like
> permission and more components. After some time, people want to run the
> script consistently in different environments. Things will become complex.
>
> That is why we want to see whether people have interest for such a "one
> click" tool to make things easy.
>
>
> On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <
> mich.talebzadeh@gmail.com> wrote:
>
>> Hi,
>>
>> There are two distinct actions here; namely Deploy and Run.
>>
>> Deployment can be done by command line script with autoscaling. In the
>> newer versions of Kubernetes you don't even need to specify the node
>> types, you can leave it to the Kubernetes cluster  to scale up and down and
>> decide on node type.
>>
>> The second point is the running spark that you will need to submit.
>> However, that depends on setting up access permission, use of service
>> accounts, pulling the correct dockerfiles for the driver and the executors.
>> Those details add to the complexity.
>>
>> Thanks
>>
>>
>>
>>    view my Linkedin profile
>> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>>
>>
>>  https://en.everybodywiki.com/Mich_Talebzadeh
>>
>>
>>
>> *Disclaimer:* Use it at your own risk. Any and all responsibility for
>> any loss, damage or destruction of data or any other property which may
>> arise from relying on this email's technical content is explicitly
>> disclaimed. The author will in no case be liable for any monetary damages
>> arising from such loss, damage or destruction.
>>
>>
>>
>>
>> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>>
>>> Hi Spark Community,
>>>
>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>> a one click command. For example, on AWS, it could automatically create an
>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>> be able to use curl or a CLI tool to submit Spark application. After the
>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>> Dynamic Allocation on Kubernetes.
>>>
>>> Anyone interested in using or working together on such a tool?
>>>
>>> Thanks,
>>> Bo
>>>
>>>

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
Right, normally people start with a simple script, then add more stuff, like
permissions and more components. After some time, people want to run the
script consistently in different environments, and things become complex.

That is why we want to see whether people have interest in such a "one
click" tool to make things easy.


On Tue, Feb 22, 2022 at 11:31 PM Mich Talebzadeh <mi...@gmail.com>
wrote:

> Hi,
>
> There are two distinct actions here; namely Deploy and Run.
>
> Deployment can be done by command line script with autoscaling. In the
> newer versions of Kubernetes you don't even need to specify the node
> types, you can leave it to the Kubernetes cluster  to scale up and down and
> decide on node type.
>
> The second point is the running spark that you will need to submit.
> However, that depends on setting up access permission, use of service
> accounts, pulling the correct dockerfiles for the driver and the executors.
> Those details add to the complexity.
>
> Thanks
>
>
>
>    view my Linkedin profile
> <https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>
>
>
>  https://en.everybodywiki.com/Mich_Talebzadeh
>
>
>
> *Disclaimer:* Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>
>
> On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:
>
>> Hi Spark Community,
>>
>> We built an open source tool to deploy and run Spark on Kubernetes with a
>> one click command. For example, on AWS, it could automatically create an
>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>> be able to use curl or a CLI tool to submit Spark application. After the
>> deployment, you could also install Uber Remote Shuffle Service to enable
>> Dynamic Allocation on Kubernetes.
>>
>> Anyone interested in using or working together on such a tool?
>>
>> Thanks,
>> Bo
>>
>>

Re: One click to run Spark on Kubernetes

Posted by Mich Talebzadeh <mi...@gmail.com>.
Hi,

There are two distinct actions here, namely deploy and run.

Deployment can be done by a command-line script with autoscaling. In newer
versions of Kubernetes you don't even need to specify the node types; you
can leave it to the Kubernetes cluster to scale up and down and decide on
the node type.

The second point is running the Spark application that you will need to
submit. That depends on setting up access permissions, using service
accounts, and pulling the correct Docker images for the driver and the
executors. Those details add to the complexity.
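
For a sense of what those details look like in practice, here is a hedged
PySpark sketch with placeholder values for the namespace, container image,
and service account; the same keys can be passed to spark-submit instead,
and a real setup needs the cluster URL, image, and RBAC to exist first.

    from pyspark.sql import SparkSession

    # Placeholder cluster URL, namespace, image, and service account name.
    spark = (
        SparkSession.builder
        .master("k8s://https://my-eks-api-server.example.com:443")
        .appName("smoke-test")
        .config("spark.kubernetes.namespace", "spark-apps")
        .config("spark.kubernetes.container.image", "myrepo/spark-py:3.2.1")
        .config("spark.kubernetes.authenticate.driver.serviceAccountName", "spark")
        .config("spark.executor.instances", "2")
        .getOrCreate()
    )

    print(spark.range(1000).count())  # trivial job to confirm executors come up
    spark.stop()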

Thanks



   view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:

> Hi Spark Community,
>
> We built an open source tool to deploy and run Spark on Kubernetes with a
> one click command. For example, on AWS, it could automatically create an
> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
> be able to use curl or a CLI tool to submit Spark application. After the
> deployment, you could also install Uber Remote Shuffle Service to enable
> Dynamic Allocation on Kubernetes.
>
> Anyone interested in using or working together on such a tool?
>
> Thanks,
> Bo
>
>

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
Merging in another email from Prasad: it could co-exist with Livy. Livy is
similar to the REST Service + Spark Operator combination. Unfortunately,
Livy is not very active right now.

To Amihay, the link is: https://github.com/datapunchorg/punch.
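
As a sketch of how the REST Service + Spark Operator path works under the
hood (described further down this thread): the service creates a
SparkApplication custom resource, and the operator turns it into driver and
executor pods. The manifest below follows the spark-on-k8s-operator v1beta2
API; the image, file path, and namespace are placeholders.

    from kubernetes import client, config

    config.load_kube_config()  # or config.load_incluster_config() when in-cluster

    spark_app = {
        "apiVersion": "sparkoperator.k8s.io/v1beta2",
        "kind": "SparkApplication",
        "metadata": {"name": "word-count", "namespace": "spark-apps"},
        "spec": {
            "type": "Python",
            "mode": "cluster",
            "image": "myrepo/spark-py:3.2.1",
            "mainApplicationFile": "s3a://my-bucket/jobs/word_count.py",
            "sparkVersion": "3.2.1",
            "driver": {"cores": 1, "memory": "2g", "serviceAccount": "spark"},
            "executor": {"instances": 2, "cores": 2, "memory": "4g"},
        },
    }

    # The Spark Operator watches this resource and launches the application.
    client.CustomObjectsApi().create_namespaced_custom_object(
        group="sparkoperator.k8s.io",
        version="v1beta2",
        namespace="spark-apps",
        plural="sparkapplications",
        body=spark_app,
    )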

On Tue, Feb 22, 2022 at 8:53 PM amihay gonen <ag...@gmail.com> wrote:

> Can you share link to the source?
>
> On Wed, Feb 23, 2022 at 6:52, bo yang <bo...@gmail.com> wrote:
>
>> We do not have SaaS yet. Now it is an open source project we build in our
>> part time , and we welcome more people working together on that.
>>
>> You could specify cluster size (EC2 instance type and number of
>> instances) and run it for 1 hour. Then you could run one click command to
>> destroy the cluster. It is possible to merge these steps as well, and
>> provide a "serverless" experience. That is in our TODO list :)
>>
>>
>> On Tue, Feb 22, 2022 at 8:36 PM Bitfox <bi...@bitfox.top> wrote:
>>
>>> How can I specify the cluster memory and cores?
>>> For instance, I want to run a job with 16 cores and 300 GB memory for
>>> about 1 hour. Do you have the SaaS solution for this? I can pay as I did.
>>>
>>> Thanks
>>>
>>> On Wed, Feb 23, 2022 at 12:21 PM bo yang <bo...@gmail.com> wrote:
>>>
>>>> It is not a standalone spark cluster. In some details, it deploys a
>>>> Spark Operator (
>>>> https://github.com/GoogleCloudPlatform/spark-on-k8s-operator) and an
>>>> extra REST Service. When people submit Spark application to that REST
>>>> Service, the REST Service will create a CRD inside the Kubernetes cluster.
>>>> Then Spark Operator will pick up the CRD and launch the Spark application.
>>>> The one click tool intends to hide these details, so people could just
>>>> submit Spark and do not need to deal with too many deployment details.
>>>>
>>>> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:
>>>>
>>>>> Can it be a cluster installation of spark? or just the standalone node?
>>>>>
>>>>> Thanks
>>>>>
>>>>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>>>>>
>>>>>> Hi Spark Community,
>>>>>>
>>>>>> We built an open source tool to deploy and run Spark on Kubernetes
>>>>>> with a one click command. For example, on AWS, it could automatically
>>>>>> create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then
>>>>>> you will be able to use curl or a CLI tool to submit Spark application.
>>>>>> After the deployment, you could also install Uber Remote Shuffle Service to
>>>>>> enable Dynamic Allocation on Kubernetes.
>>>>>>
>>>>>> Anyone interested in using or working together on such a tool?
>>>>>>
>>>>>> Thanks,
>>>>>> Bo
>>>>>>
>>>>>>

Re: One click to run Spark on Kubernetes

Posted by amihay gonen <ag...@gmail.com>.
Can you share a link to the source?

On Wed, Feb 23, 2022 at 6:52, bo yang <bo...@gmail.com> wrote:

> We do not have SaaS yet. Now it is an open source project we build in our
> part time , and we welcome more people working together on that.
>
> You could specify cluster size (EC2 instance type and number of instances)
> and run it for 1 hour. Then you could run one click command to destroy the
> cluster. It is possible to merge these steps as well, and provide a
> "serverless" experience. That is in our TODO list :)
>
>
> On Tue, Feb 22, 2022 at 8:36 PM Bitfox <bi...@bitfox.top> wrote:
>
>> How can I specify the cluster memory and cores?
>> For instance, I want to run a job with 16 cores and 300 GB memory for
>> about 1 hour. Do you have the SaaS solution for this? I can pay as I did.
>>
>> Thanks
>>
>> On Wed, Feb 23, 2022 at 12:21 PM bo yang <bo...@gmail.com> wrote:
>>
>>> It is not a standalone spark cluster. In some details, it deploys a
>>> Spark Operator (
>>> https://github.com/GoogleCloudPlatform/spark-on-k8s-operator) and an
>>> extra REST Service. When people submit Spark application to that REST
>>> Service, the REST Service will create a CRD inside the Kubernetes cluster.
>>> Then Spark Operator will pick up the CRD and launch the Spark application.
>>> The one click tool intends to hide these details, so people could just
>>> submit Spark and do not need to deal with too many deployment details.
>>>
>>> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:
>>>
>>>> Can it be a cluster installation of spark? or just the standalone node?
>>>>
>>>> Thanks
>>>>
>>>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>>>>
>>>>> Hi Spark Community,
>>>>>
>>>>> We built an open source tool to deploy and run Spark on Kubernetes
>>>>> with a one click command. For example, on AWS, it could automatically
>>>>> create an EKS cluster, node group, NGINX ingress, and Spark Operator. Then
>>>>> you will be able to use curl or a CLI tool to submit Spark application.
>>>>> After the deployment, you could also install Uber Remote Shuffle Service to
>>>>> enable Dynamic Allocation on Kuberentes.
>>>>>
>>>>> Anyone interested in using or working together on such a tool?
>>>>>
>>>>> Thanks,
>>>>> Bo
>>>>>
>>>>>

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
We do not have a SaaS offering yet. For now it is an open source project we
build in our spare time, and we welcome more people working together on it.

You could specify the cluster size (EC2 instance type and number of
instances) and run it for 1 hour, then run a one-click command to destroy
the cluster. It is also possible to merge these steps and provide a
"serverless" experience. That is on our TODO list :)


On Tue, Feb 22, 2022 at 8:36 PM Bitfox <bi...@bitfox.top> wrote:

> How can I specify the cluster memory and cores?
> For instance, I want to run a job with 16 cores and 300 GB memory for
> about 1 hour. Do you have the SaaS solution for this? I can pay as I did.
>
> Thanks
>
> On Wed, Feb 23, 2022 at 12:21 PM bo yang <bo...@gmail.com> wrote:
>
>> It is not a standalone spark cluster. In some details, it deploys a Spark
>> Operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator)
>> and an extra REST Service. When people submit Spark application to that
>> REST Service, the REST Service will create a CRD inside the
>> Kubernetes cluster. Then Spark Operator will pick up the CRD and launch the
>> Spark application. The one click tool intends to hide these details, so
>> people could just submit Spark and do not need to deal with too many
>> deployment details.
>>
>> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:
>>
>>> Can it be a cluster installation of spark? or just the standalone node?
>>>
>>> Thanks
>>>
>>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>>>
>>>> Hi Spark Community,
>>>>
>>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>>> a one click command. For example, on AWS, it could automatically create an
>>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>>> be able to use curl or a CLI tool to submit Spark application. After the
>>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>>> Dynamic Allocation on Kuberentes.
>>>>
>>>> Anyone interested in using or working together on such a tool?
>>>>
>>>> Thanks,
>>>> Bo
>>>>
>>>>

Re: One click to run Spark on Kubernetes

Posted by Bitfox <bi...@bitfox.top>.
How can I specify the cluster memory and cores?
For instance, I want to run a job with 16 cores and 300 GB of memory for
about 1 hour. Do you have a SaaS solution for this? I can pay for it, as I
have done before.

Thanks

On Wed, Feb 23, 2022 at 12:21 PM bo yang <bo...@gmail.com> wrote:

> It is not a standalone spark cluster. In some details, it deploys a Spark
> Operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator)
> and an extra REST Service. When people submit Spark application to that
> REST Service, the REST Service will create a CRD inside the
> Kubernetes cluster. Then Spark Operator will pick up the CRD and launch the
> Spark application. The one click tool intends to hide these details, so
> people could just submit Spark and do not need to deal with too many
> deployment details.
>
> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:
>
>> Can it be a cluster installation of spark? or just the standalone node?
>>
>> Thanks
>>
>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>>
>>> Hi Spark Community,
>>>
>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>> a one click command. For example, on AWS, it could automatically create an
>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>> be able to use curl or a CLI tool to submit Spark application. After the
>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>> Dynamic Allocation on Kuberentes.
>>>
>>> Anyone interested in using or working together on such a tool?
>>>
>>> Thanks,
>>> Bo
>>>
>>>

Re: One click to run Spark on Kubernetes

Posted by Prasad Paravatha <pr...@gmail.com>.
Hi Bo Yang,
Would it be something along the lines of Apache Livy?

Thanks,
Prasad


On Tue, Feb 22, 2022 at 10:22 PM bo yang <bo...@gmail.com> wrote:

> It is not a standalone spark cluster. In some details, it deploys a Spark
> Operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator)
> and an extra REST Service. When people submit Spark application to that
> REST Service, the REST Service will create a CRD inside the
> Kubernetes cluster. Then Spark Operator will pick up the CRD and launch the
> Spark application. The one click tool intends to hide these details, so
> people could just submit Spark and do not need to deal with too many
> deployment details.
>
> On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:
>
>> Can it be a cluster installation of spark? or just the standalone node?
>>
>> Thanks
>>
>> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>>
>>> Hi Spark Community,
>>>
>>> We built an open source tool to deploy and run Spark on Kubernetes with
>>> a one click command. For example, on AWS, it could automatically create an
>>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>>> be able to use curl or a CLI tool to submit Spark application. After the
>>> deployment, you could also install Uber Remote Shuffle Service to enable
>>> Dynamic Allocation on Kuberentes.
>>>
>>> Anyone interested in using or working together on such a tool?
>>>
>>> Thanks,
>>> Bo
>>>
>>>

-- 
Regards,
Prasad Paravatha

Re: One click to run Spark on Kubernetes

Posted by bo yang <bo...@gmail.com>.
It is not a standalone Spark cluster. In more detail, it deploys the Spark
Operator (https://github.com/GoogleCloudPlatform/spark-on-k8s-operator) and
an extra REST service. When people submit a Spark application to that REST
service, it creates a CRD (custom resource) inside the Kubernetes cluster.
The Spark Operator then picks up the CRD and launches the Spark application.
The one-click tool intends to hide these details, so people can just submit
Spark applications without dealing with too many deployment details.
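
For anyone curious about what happens under the hood: the custom resource
is the Spark Operator's SparkApplication kind. A minimal sketch of what the
REST service creates is below (field values are illustrative placeholders
taken from the operator's examples, not our exact defaults):

from kubernetes import client, config

config.load_kube_config()  # use load_incluster_config() inside the cluster

# Illustrative SparkApplication custom resource.
spark_app = {
    "apiVersion": "sparkoperator.k8s.io/v1beta2",
    "kind": "SparkApplication",
    "metadata": {"name": "spark-pi", "namespace": "spark"},
    "spec": {
        "type": "Scala",
        "mode": "cluster",
        "image": "gcr.io/spark-operator/spark:v3.1.1",
        "mainClass": "org.apache.spark.examples.SparkPi",
        "mainApplicationFile":
            "local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar",
        "sparkVersion": "3.1.1",
        "driver": {"cores": 1, "memory": "1g", "serviceAccount": "spark"},
        "executor": {"cores": 1, "instances": 2, "memory": "1g"},
    },
}

# The Spark Operator watches this resource and launches the driver and
# executor pods for it.
client.CustomObjectsApi().create_namespaced_custom_object(
    group="sparkoperator.k8s.io",
    version="v1beta2",
    namespace="spark",
    plural="sparkapplications",
    body=spark_app,
)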

On Tue, Feb 22, 2022 at 8:09 PM Bitfox <bi...@bitfox.top> wrote:

> Can it be a cluster installation of spark? or just the standalone node?
>
> Thanks
>
> On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:
>
>> Hi Spark Community,
>>
>> We built an open source tool to deploy and run Spark on Kubernetes with a
>> one click command. For example, on AWS, it could automatically create an
>> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
>> be able to use curl or a CLI tool to submit Spark application. After the
>> deployment, you could also install Uber Remote Shuffle Service to enable
>> Dynamic Allocation on Kuberentes.
>>
>> Anyone interested in using or working together on such a tool?
>>
>> Thanks,
>> Bo
>>
>>

Re: One click to run Spark on Kubernetes

Posted by Bitfox <bi...@bitfox.top>.
Can it be a cluster installation of Spark, or just a standalone node?

Thanks

On Wed, Feb 23, 2022 at 12:06 PM bo yang <bo...@gmail.com> wrote:

> Hi Spark Community,
>
> We built an open source tool to deploy and run Spark on Kubernetes with a
> one click command. For example, on AWS, it could automatically create an
> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
> be able to use curl or a CLI tool to submit Spark application. After the
> deployment, you could also install Uber Remote Shuffle Service to enable
> Dynamic Allocation on Kuberentes.
>
> Anyone interested in using or working together on such a tool?
>
> Thanks,
> Bo
>
>

Re: One click to run Spark on Kubernetes

Posted by Mich Talebzadeh <mi...@gmail.com>.
Hi,

There are two distinct actions here; namely Deploy and Run.

Deployment can be done by a command line script with autoscaling. In newer
versions of Kubernetes you don't even need to specify the node types; you
can leave it to the Kubernetes cluster to scale up and down and decide on
the node type.

The second point is running the Spark application that you will need to
submit. However, that depends on setting up access permissions, the use of
service accounts, and pulling the correct Docker images for the driver and
the executors. Those details add to the complexity.
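
For example, a bare spark-submit against Kubernetes already has to spell
out several of those settings. A rough sketch (the API server address,
namespace, service account and image name are placeholders for whatever the
target cluster uses):

import subprocess

# Placeholder values; these depend entirely on the target cluster.
cmd = [
    "spark-submit",
    "--master", "k8s://https://<k8s-api-server>:6443",
    "--deploy-mode", "cluster",
    "--name", "spark-pi",
    "--class", "org.apache.spark.examples.SparkPi",
    "--conf", "spark.kubernetes.namespace=spark",
    "--conf", "spark.kubernetes.authenticate.driver.serviceAccountName=spark",
    "--conf", "spark.kubernetes.container.image=<registry>/spark:3.1.1",
    "--conf", "spark.executor.instances=2",
    "local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar",
]
subprocess.run(cmd, check=True)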

Thanks



   view my Linkedin profile
<https://www.linkedin.com/in/mich-talebzadeh-ph-d-5205b2/>


 https://en.everybodywiki.com/Mich_Talebzadeh



*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Wed, 23 Feb 2022 at 04:06, bo yang <bo...@gmail.com> wrote:

> Hi Spark Community,
>
> We built an open source tool to deploy and run Spark on Kubernetes with a
> one click command. For example, on AWS, it could automatically create an
> EKS cluster, node group, NGINX ingress, and Spark Operator. Then you will
> be able to use curl or a CLI tool to submit Spark application. After the
> deployment, you could also install Uber Remote Shuffle Service to enable
> Dynamic Allocation on Kuberentes.
>
> Anyone interested in using or working together on such a tool?
>
> Thanks,
> Bo
>
>