Posted to dev@griffin.apache.org by Vikram Jain <vi...@enquero.com> on 2018/12/14 12:18:19 UTC

Griffin Job Status displayed as FOUND

Hello,

I'm using Apache Griffin which in turn uses Livy to communicate with the Spark Cluster.

While submitting Spark jobs from the UI, there have been a few instances where the job status is shown as FOUND. This is intermittent rather than recurring: a job will occasionally go to the FOUND state, while on the next run instance it is usually submitted and executed normally.

Can someone here help me understand the reason, and the system state, under which the job status becomes FOUND? What are the factors that make a job go to the FOUND state?



Thanks in advance,

Vikram


Re: Griffin Job Status displayed as FOUND

Posted by Kevin Yao <ah...@gmail.com>.
Hi Vikram,
The normal result returned by Livy when a job is submitted to Spark looks like
{"id":1,"state":"starting","appId":null,"appInfo":{"driverLogUrl":null,"sparkUiUrl":null},"log":[]}
We read this result and set the corresponding job status.

However, if the result returned by Livy is null, we mark the job status as FOUND,
which probably means the job failed or something went wrong. The FOUND status means
that the predicate done-file check succeeded but there is no return state associated
with the job.
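To illustrate the behavior described above, here is a minimal sketch in Java. All class, enum, and method names are hypothetical (illustrative only, not Griffin's actual code); it only shows the idea of falling back to a FOUND-style status when Livy's response carries no usable state.

```java
import java.util.Map;

public class LivyStatusSketch {
    // Hypothetical job states; Griffin's real state model may differ.
    enum JobState { STARTING, RUNNING, SUCCESS, DEAD, FOUND }

    // `result` stands for the parsed JSON body of a Livy batch response,
    // e.g. {"id":1,"state":"starting",...}; null models a missing response.
    static JobState resolveState(Map<String, Object> result) {
        if (result == null || result.get("state") == null) {
            // No return state from Livy: fall back to FOUND,
            // as described in the explanation above.
            return JobState.FOUND;
        }
        return JobState.valueOf(((String) result.get("state")).toUpperCase());
    }

    public static void main(String[] args) {
        System.out.println(resolveState(null));                       // FOUND
        System.out.println(resolveState(Map.of("state", "running"))); // RUNNING
    }
}
```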

From the return result alone, I'm not sure where it goes wrong. Could you provide
the relevant logs?

Thanks,
Kevin


On Fri, Dec 14, 2018 at 8:18 PM Vikram Jain <vi...@enquero.com> wrote:

> Hello,
>
> I'm using Apache Griffin which in turn uses Livy to communicate with the
> Spark Cluster.
>
> While submitting SPARK jobs from UI, there have been few instances where
> the job status is shown as FOUND. This is intermittent/random and is not
> permanent/recurring. This means that only a few times the job will go to
> FOUND state and many a times at the next run instance, the job is submitted
> and executed.
>
> Can someone here help me understand the reason and system state under
> which a the job status is FOUND. What are the reasons and factors that make
> a job go to FOUND state?
>
>
>
> Thanks in advance,
>
> Vikram
>
>
>
