Posted to user@livy.apache.org by Saisai Shao <sa...@gmail.com> on 2017/08/11 06:55:13 UTC

Re: Propagating pyspark errors to Livy

I think you should check the Spark application log to see the details; it is
hard for Livy to get the actual error from Spark.
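One way to act on this advice programmatically: Livy's REST API does expose the batch's captured log tail via GET /batches/{batchId}/log, which returns a JSON payload with a "log" array of lines. The sketch below (endpoint shape per the Livy REST docs; the sample payload and the ZeroDivisionError job are hypothetical) parses such a payload offline and surfaces any Python traceback it contains, so a calling program can report more than "the batch failed":

```python
# Sketch: surface a Python traceback from a Livy batch log payload.
# The payload shape ({"id", "from", "total", "log": [...]}) follows the
# response of GET /batches/{batchId}/log in the Livy REST API docs;
# this helper only parses the JSON, so it runs without a live server.

def extract_traceback(log_payload):
    """Return the traceback lines from a /batches/{id}/log response, or []."""
    lines = log_payload.get("log", [])
    for i, line in enumerate(lines):
        if line.startswith("Traceback (most recent call last):"):
            # Everything from the traceback header onward is the error detail.
            return lines[i:]
    return []

# Hypothetical payload for a pyspark batch that died on a division by zero.
sample = {
    "id": 4,
    "from": 0,
    "total": 4,
    "log": [
        "17/08/11 06:55:13 INFO SparkContext: Running Spark version 2.1.1",
        "Traceback (most recent call last):",
        '  File "job.py", line 12, in <module>',
        "ZeroDivisionError: division by zero",
    ],
}

print("\n".join(extract_traceback(sample)))
```

Note the log endpoint only returns what Livy captured from the driver's stdout/stderr; for executor-side failures the full Spark application log (e.g. via YARN) is still the authoritative source, as suggested above.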

On Fri, Aug 11, 2017 at 12:03 PM, Vivek <vi...@yahoo.co.in> wrote:

> Hi,
>
> Is there any way to propagate errors from pyspark back to the calling
> program via Livy?
> Currently the Livy logs only tell me that the batch job has failed. How
> do I get the actual error on the Spark side?
>
> Regards
> Vivek
>
>
> Sent from my iPhone
>

Re: Propagating pyspark errors to Livy

Posted by Bikas Saha <bi...@apache.org>.
That seems the best option as of now, since the URL can be used to access the actual Spark UI, which should have the most information.


Bikas
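For reference, the tracking URL discussed here is available directly from Livy's batch info: GET /batches/{batchId} returns an "appInfo" object carrying "sparkUiUrl" and "driverLogUrl" (field names per the Livy REST docs). A minimal sketch of picking them out, using a hypothetical sample response:

```python
# Sketch: extract the Spark UI and driver log URLs from the batch info
# returned by GET /batches/{batchId}. Field names ("appInfo",
# "sparkUiUrl", "driverLogUrl") follow the Livy REST API docs; the
# sample response below is hypothetical.

def tracking_urls(batch_info):
    """Return the Spark UI and driver-log URLs from a batch-info response."""
    app_info = batch_info.get("appInfo") or {}
    return {
        "sparkUiUrl": app_info.get("sparkUiUrl"),
        "driverLogUrl": app_info.get("driverLogUrl"),
    }

# Hypothetical batch-info response for a failed (state "dead") batch.
sample = {
    "id": 4,
    "state": "dead",
    "appId": "application_1502434257857_0004",
    "appInfo": {
        "driverLogUrl": "http://node1:8042/node/containerlogs/container_e01_0004_01_000001/livy",
        "sparkUiUrl": "http://master:8088/proxy/application_1502434257857_0004/",
    },
}

print(tracking_urls(sample)["sparkUiUrl"])
```

Handing these URLs back to the calling program, as described below, lets a user jump straight to the Spark UI or driver container logs for the real failure detail.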

________________________________
From: Vivek Suvarna <vi...@gmail.com>
Sent: Friday, August 11, 2017 3:20:46 AM
To: user@livy.incubator.apache.org
Subject: Re: Propagating pyspark errors to Livy

I'm currently getting the tracking URL and giving it back to the calling program.
Guess that's the best option then.

Thanks


Sent from my iPhone

On 11 Aug 2017, at 2:55 PM, Saisai Shao <sa...@gmail.com> wrote:

I think you should check the Spark application log to see the details; it is hard for Livy to get the actual error from Spark.

On Fri, Aug 11, 2017 at 12:03 PM, Vivek <vi...@yahoo.co.in> wrote:
Hi,

Is there any way to propagate errors from pyspark back to the calling program via Livy?
Currently the Livy logs only tell me that the batch job has failed. How do I get the actual error on the Spark side?

Regards
Vivek


Sent from my iPhone


Re: Propagating pyspark errors to Livy

Posted by Vivek Suvarna <vi...@gmail.com>.
I'm currently getting the tracking URL and giving it back to the calling program.
Guess that's the best option then.

Thanks


Sent from my iPhone

> On 11 Aug 2017, at 2:55 PM, Saisai Shao <sa...@gmail.com> wrote:
> 
> I think you should check the Spark application log to see the details; it is hard for Livy to get the actual error from Spark.
> 
>> On Fri, Aug 11, 2017 at 12:03 PM, Vivek <vi...@yahoo.co.in> wrote:
>> Hi,
>> 
>> Is there any way to propagate errors from pyspark back to the calling program via Livy?
>> Currently the Livy logs only tell me that the batch job has failed. How do I get the actual error on the Spark side?
>> 
>> Regards
>> Vivek
>> 
>> 
>> Sent from my iPhone
>