Posted to dev@griffin.apache.org by Lionel Liu <li...@apache.org> on 2018/05/01 07:33:32 UTC

Re: [DISCUSS] Support CLI launch for spark jobs

I agree that users would be blocked if they don't install Livy. Livy makes
Spark job submission easy, but we also need some other approach, like
submitting via shell (e.g. spark-submit) or using the YARN APIs.
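
For illustration, here's a rough sketch of what a shell-style submission could
look like, wrapped in Python; the jar path, main class and config files below
are placeholders, not necessarily the actual Griffin artifacts:

    import subprocess

    # Placeholder jar, entry class and config paths -- adjust to the real
    # measure artifacts in your deployment.
    cmd = [
        "spark-submit",
        "--master", "yarn",
        "--deploy-mode", "cluster",
        "--class", "org.apache.griffin.measure.Application",  # assumed class
        "/path/to/griffin-measure.jar",
        "/path/to/env.json",
        "/path/to/dq.json",
    ]

    # spark-submit only gives us the launcher's exit code; tracking the actual
    # application state on YARN afterwards needs extra work.
    result = subprocess.run(cmd)
    print("launcher exit code:", result.returncode)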

However, there are more spikes needed to adapt to different Spark cluster
managers, like YARN or Mesos, and that is not the problem we want to solve in
the data quality domain.

In my opinion, Livy focuses on connecting to the Spark cluster and works well,
so it could be the default solution.
We can also provide a simple way to submit Spark jobs, covering the basic
functions only, to support such specific environments; that should be enough.
If we also want to get the job state and retry the job when it fails, it will
cost more effort.
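
For comparison, a minimal sketch of how Livy's batch REST API already covers
submission plus job-state tracking (the host, jar and class names are
assumptions for illustration only):

    import time
    import requests

    LIVY_URL = "http://livy-host:8998"  # assumed Livy endpoint

    # Submit a batch job via Livy's POST /batches API.
    payload = {
        "file": "hdfs:///griffin/griffin-measure.jar",           # placeholder
        "className": "org.apache.griffin.measure.Application",   # assumed
        "args": ["hdfs:///griffin/env.json", "hdfs:///griffin/dq.json"],
    }
    batch = requests.post(LIVY_URL + "/batches", json=payload).json()

    # Poll the batch state until it finishes; a plain CLI submission would
    # have to re-implement this kind of tracking on top of YARN.
    state = "starting"
    while state not in ("success", "dead", "killed"):
        time.sleep(10)
        resp = requests.get(f"{LIVY_URL}/batches/{batch['id']}/state").json()
        state = resp["state"]
    print("final state:", state)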

Thanks,
Lionel

On Mon, Apr 30, 2018 at 10:47 PM, William Guo <gu...@apache.org> wrote:

> hi all,
>
> In some cluster environments, the administrator doesn't install Livy in the
> cluster.
>
> How can we support these users? Should we support them by launching Spark
> jobs via the CLI?
>
>
> What are your comments for this issue?
>
>
> Thanks,
> William
>