Posted to user@spark.apache.org by Hagai <ha...@akamai.com> on 2016/09/12 15:44:53 UTC

Debugging a Spark application in a non-lazy mode

Hi guys,
Lately I have been looking for a way to debug my Spark application locally.

However, since all transformations are only actually executed when an
action is encountered, I have no way to look at the data after each
transformation. Does Spark support a non-lazy mode that executes the
transformations locally after each statement?
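For example, in a rough sketch like the one below (the input path, names, and
parsing are just placeholders, not my real job), no work is done until the
count() at the very end:

  import org.apache.spark.{SparkConf, SparkContext}

  val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("lazy-demo"))

  // Transformations only build up a lineage; no data is read yet.
  val lines  = sc.textFile("/tmp/events.log")            // placeholder input path
  val fields = lines.map(_.split('\t'))
  val errors = fields.filter(_.headOption.contains("ERROR"))

  // Only this action actually triggers execution of the whole chain.
  val numErrors = errors.count()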

Thanks,
Hagai.




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Debugging-a-spark-application-in-a-none-lazy-mode-tp27695.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Debugging a Spark application in a non-lazy mode

Posted by Takeshi Yamamuro <li...@gmail.com>.
It seems to me that all you can do is inject `collect` calls map-by-map, like:

`df.map(x => do something...).collect`  // check intermediate results in maps

This only works for small datasets though.
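For instance, a fuller toy sketch (made-up data and names; each collect() pulls
that step's result back to the driver so you can print it or set a breakpoint):

  import org.apache.spark.{SparkConf, SparkContext}

  val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("debug-collect"))

  val raw = sc.parallelize(Seq("1,a", "2,b", "3,c"))      // toy input

  // Materialize each step eagerly by collecting its result to the driver.
  val split = raw.map(_.split(','))
  split.collect().foreach(a => println(a.mkString("|")))  // inspect step 1

  val keyed = split.map(a => (a(0), a(1)))
  keyed.collect().foreach(println)                        // inspect step 2

  val filtered = keyed.filter(_._1.toInt > 1)
  filtered.collect().foreach(println)                     // inspect step 3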

// maropu



-- 
---
Takeshi Yamamuro

Re: Debugging a Spark application in a non-lazy mode

Posted by "Attias, Hagai" <ha...@akamai.com>.
Hi,
I'm not sure what you mean; can you give an example?

Hagai.


Re: Debugging a Spark application in a non-lazy mode

Posted by Takeshi Yamamuro <li...@gmail.com>.
Hi,

Spark does not have such a mode.
How about pulling the data back as local arrays with `collect` for debugging?
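For example (a toy sketch, not your actual job), collect() returns a plain
local array that you can print or step through in a debugger:

  import org.apache.spark.{SparkConf, SparkContext}

  val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("collect-debug"))

  val words   = sc.parallelize(Seq("spark", "debug", "lazy"))  // toy input
  val lengths = words.map(w => (w, w.length))

  // collect() brings the RDD's contents back to the driver as a local Array.
  val local: Array[(String, Int)] = lengths.collect()
  local.foreach(println)   // or inspect `local` under a breakpoint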

// maropu



-- 
---
Takeshi Yamamuro