Posted to user@spark.apache.org by Jacek Laskowski <ja...@japila.pl> on 2016/06/18 09:53:29 UTC

How to cause a stage to fail (using spark-shell)?

Hi,

I'm trying to see some stats about failing stages in the web UI and
want to "create" a few failed stages. Is this possible using
spark-shell at all? Which setup of Spark/spark-shell would allow for
such a scenario?

I could write some Scala code if that's the only way to have failing stages.

Please guide. Thanks.
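
As for seeing the stats themselves, something like this rough,
untested sketch could complement the web UI. It uses only the public
SparkStatusTracker API; printStageStats is just a made-up helper name,
and it only covers jobs/stages the status tracker still retains:

  // throwaway helper: print attempt id and failed-task counts per stage
  def printStageStats(): Unit = {
    val tracker = sc.statusTracker
    for {
      jobId   <- tracker.getJobIdsForGroup(null)  // jobs with no job group
      jobInfo <- tracker.getJobInfo(jobId)
      stageId <- jobInfo.stageIds()
      stage   <- tracker.getStageInfo(stageId)
    } println(
        s"job $jobId, stage $stageId (attempt ${stage.currentAttemptId()}): " +
        s"${stage.numFailedTasks()} failed tasks, " +
        s"${stage.numCompletedTasks()}/${stage.numTasks()} completed")
  }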

/me on to reviewing the Spark code...

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: How to cause a stage to fail (using spark-shell)?

Posted by Jacek Laskowski <ja...@japila.pl>.
Mind sharing code? I think only shuffle failures lead to stage failures and
retries.

Jacek
On 19 Jun 2016 4:35 p.m., "Ted Yu" <yu...@gmail.com> wrote:

> You can use a counter in external storage (e.g. a NoSQL store).
> When the counter reaches 2, stop throwing the exception so that the
> task passes.
>
> FYI
>
> On Sun, Jun 19, 2016 at 3:22 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>
>> Hi,
>>
>> Thanks Burak for the idea, but that *only* fails the tasks, which
>> eventually fails the entire job; it doesn't fail a particular stage
>> (just once or twice) before the job finishes successfully. The idea
>> is to see the stage attempts in the web UI, as there's special
>> handling for cases where a stage fails once or twice before finishing
>> up properly.
>>
>> Any ideas? I've got one, but it requires quite an extensive cluster
>> setup, which I'd like to avoid if possible. I'm after something I
>> could use during workshops or demos that others could reproduce
>> easily to learn Spark's internals.
>>
>> Regards,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sun, Jun 19, 2016 at 5:25 AM, Burak Yavuz <br...@gmail.com> wrote:
>> > Hi Jacek,
>> >
>> > Can't you simply have a mapPartitions task throw an exception or
>> something?
>> > Are you trying to do something more esoteric?
>> >
>> > Best,
>> > Burak
>> >
>> > On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>> >>
>> >> Hi,
>> >>
>> >> Following up on this question, is a stage considered failed only
>> >> when there is a FetchFailed exception? Can I have a failed stage in
>> >> a job with only a single stage?
>> >>
>> >> I'd appreciate any help on this... (as my family doesn't like me
>> >> spending the weekend with Spark :))
>> >>
>> >> Regards,
>> >> Jacek Laskowski
>> >> ----
>> >> https://medium.com/@jaceklaskowski/
>> >> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> >> Follow me at https://twitter.com/jaceklaskowski
>> >>
>> >>
>> >> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl>
>> wrote:
>> >> > Hi,
>> >> >
>> >> > I'm trying to see some stats about failing stages in the web UI
>> >> > and want to "create" a few failed stages. Is this possible using
>> >> > spark-shell at all? Which setup of Spark/spark-shell would allow
>> >> > for such a scenario?
>> >> >
>> >> > I could write some Scala code if that's the only way to have
>> >> > failing stages.
>> >> >
>> >> > Please guide. Thanks.
>> >> >
>> >> > /me on to reviewing the Spark code...
>> >> >
>> >> > Regards,
>> >> > Jacek Laskowski
>> >> > ----
>> >> > https://medium.com/@jaceklaskowski/
>> >> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> >> > Follow me at https://twitter.com/jaceklaskowski
>> >>
>> >>
>> >
>>
>>
>>
>

Re: How to cause a stage to fail (using spark-shell)?

Posted by Ted Yu <yu...@gmail.com>.
You can use a counter in external storage (e.g. a NoSQL store).
When the counter reaches 2, stop throwing the exception so that the task passes.

FYI
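
Something along these lines, simplified: instead of an external counter
it keys off the task's own attempt number (TaskContext.attemptNumber),
so the first two attempts of each task fail and the third one passes.
Plain local mode doesn't retry tasks, so use e.g. --master "local[2,4]"
or a real cluster, where spark.task.maxFailures defaults to 4:

  import org.apache.spark.TaskContext

  sc.parallelize(1 to 100, 2).mapPartitions { iter =>
    val ctx = TaskContext.get()
    // fail the first two attempts of this task, let the third one pass
    if (ctx.attemptNumber() < 2) {
      sys.error(s"boom: partition ${ctx.partitionId()}, attempt ${ctx.attemptNumber()}")
    }
    iter
  }.count()

The job should still complete, with the failed task attempts visible on
the stage page in the web UI. Note this gives you retried tasks, not a
retried stage.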

On Sun, Jun 19, 2016 at 3:22 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Thanks Burak for the idea, but that *only* fails the tasks, which
> eventually fails the entire job; it doesn't fail a particular stage
> (just once or twice) before the job finishes successfully. The idea
> is to see the stage attempts in the web UI, as there's special
> handling for cases where a stage fails once or twice before finishing
> up properly.
>
> Any ideas? I've got one, but it requires quite an extensive cluster
> setup, which I'd like to avoid if possible. I'm after something I
> could use during workshops or demos and others could reproduce
> easily to learn Spark's internals.
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sun, Jun 19, 2016 at 5:25 AM, Burak Yavuz <br...@gmail.com> wrote:
> > Hi Jacek,
> >
> > Can't you simply have a mapPartitions task throw an exception or
> something?
> > Are you trying to do something more esoteric?
> >
> > Best,
> > Burak
> >
> > On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl>
> wrote:
> >>
> >> Hi,
> >>
> >> Following up on this question, is a stage considered failed only
> >> when there is a FetchFailed exception? Can I have a failed stage in
> >> a job with only a single stage?
> >>
> >> I'd appreciate any help on this... (as my family doesn't like me
> >> spending the weekend with Spark :))
> >>
> >> Regards,
> >> Jacek Laskowski
> >> ----
> >> https://medium.com/@jaceklaskowski/
> >> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> >> Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl>
> wrote:
> >> > Hi,
> >> >
> >> > I'm trying to see some stats about failing stages in the web UI
> >> > and want to "create" a few failed stages. Is this possible using
> >> > spark-shell at all? Which setup of Spark/spark-shell would allow
> >> > for such a scenario?
> >> >
> >> > I could write some Scala code if that's the only way to have
> >> > failing stages.
> >> >
> >> > Please guide. Thanks.
> >> >
> >> > /me on to reviewing the Spark code...
> >> >
> >> > Regards,
> >> > Jacek Laskowski
> >> > ----
> >> > https://medium.com/@jaceklaskowski/
> >> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
> >> > Follow me at https://twitter.com/jaceklaskowski
> >>
> >>
> >
>
>
>

Re: How to cause a stage to fail (using spark-shell)?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Thanks Burak for the idea, but that *only* fails the tasks, which
eventually fails the entire job; it doesn't fail a particular stage
(just once or twice) before the job finishes successfully. The idea
is to see the stage attempts in the web UI, as there's special
handling for cases where a stage fails once or twice before finishing
up properly.

Any ideas? I've got one, but it requires quite an extensive cluster
setup, which I'd like to avoid if possible. I'm after something I
could use during workshops or demos and others could reproduce
easily to learn Spark's internals.

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sun, Jun 19, 2016 at 5:25 AM, Burak Yavuz <br...@gmail.com> wrote:
> Hi Jacek,
>
> Can't you simply have a mapPartitions task throw an exception or something?
> Are you trying to do something more esoteric?
>
> Best,
> Burak
>
> On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>>
>> Hi,
>>
>> Following up on this question, is a stage considered failed only when
>> there is a FetchFailed exception? Can I have a failed stage in a job
>> with only a single stage?
>>
>> I'd appreciate any help on this... (as my family doesn't like me
>> spending the weekend with Spark :))
>>
>> Regards,
>> Jacek Laskowski
>> ----
>> https://medium.com/@jaceklaskowski/
>> Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> Follow me at https://twitter.com/jaceklaskowski
>>
>>
>> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl> wrote:
>> > Hi,
>> >
>> > I'm trying to see some stats about failing stages in the web UI and
>> > want to "create" a few failed stages. Is this possible using
>> > spark-shell at all? Which setup of Spark/spark-shell would allow for
>> > such a scenario?
>> >
>> > I could write some Scala code if that's the only way to have failing
>> > stages.
>> >
>> > Please guide. Thanks.
>> >
>> > /me on to reviewing the Spark code...
>> >
>> > Regards,
>> > Jacek Laskowski
>> > ----
>> > https://medium.com/@jaceklaskowski/
>> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
>> > Follow me at https://twitter.com/jaceklaskowski
>>
>>
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: How to cause a stage to fail (using spark-shell)?

Posted by Burak Yavuz <br...@gmail.com>.
Hi Jacek,

Can't you simply have a mapPartitions task throw an exception or something?
Are you trying to do something more esoteric?
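
Something like this, for example. Every attempt of every task throws,
so once spark.task.maxFailures is exhausted the stage is aborted and
the job fails, and the failed stage shows up in the web UI:

  sc.parallelize(1 to 100, 4).mapPartitions { iter =>
    sys.error("boom")  // every task attempt fails
    iter
  }.count()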

Best,
Burak

On Sat, Jun 18, 2016 at 5:35 AM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> Following up on this question, is a stage considered failed only when
> there is a FetchFailed exception? Can I have a failed stage in a job
> with only a single stage?
>
> I'd appreciate any help on this... (as my family doesn't like me
> spending the weekend with Spark :))
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> > Hi,
> >
> > I'm trying to see some stats about failing stages in the web UI and
> > want to "create" a few failed stages. Is this possible using
> > spark-shell at all? Which setup of Spark/spark-shell would allow for
> > such a scenario?
> >
> > I could write some Scala code if that's the only way to have failing
> > stages.
> >
> > Please guide. Thanks.
> >
> > /me on to reviewing the Spark code...
> >
> > Regards,
> > Jacek Laskowski
> > ----
> > https://medium.com/@jaceklaskowski/
> > Mastering Apache Spark http://bit.ly/mastering-apache-spark
> > Follow me at https://twitter.com/jaceklaskowski
>
>
>

Re: How to cause a stage to fail (using spark-shell)?

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

Following up on this question, is a stage considered failed only when
there is a FetchFailed exception? Can I have a failed stage in a job
with only a single stage?

I'd appreciate any help on this... (as my family doesn't like me
spending the weekend with Spark :))
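
For the second question, this is the kind of single-stage job I mean;
every task attempt throws, so once spark.task.maxFailures is exhausted
I'd expect the stage to be aborted and listed under Failed Stages even
though no FetchFailed is involved:

  // a single stage (no shuffle); all attempts of all tasks fail
  sc.parallelize(1 to 10, 2).foreach { _ => sys.error("boom") }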

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark http://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Sat, Jun 18, 2016 at 11:53 AM, Jacek Laskowski <ja...@japila.pl> wrote:
> Hi,
>
> I'm trying to see some stats about failing stages in the web UI and
> want to "create" a few failed stages. Is this possible using
> spark-shell at all? Which setup of Spark/spark-shell would allow for
> such a scenario?
>
> I could write some Scala code if that's the only way to have failing
> stages.
>
> Please guide. Thanks.
>
> /me on to reviewing the Spark code...
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark http://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org