Posted to user@spark.apache.org by Shashank Mandil <ma...@gmail.com> on 2017/02/03 19:06:58 UTC

Spark submit on yarn does not return with exit code 1 on exception

Hi All,

I wrote a test script which always throws an exception, as below:

import org.apache.spark.SparkConf

object Test {

  def main(args: Array[String]) {
    try {
      // Building a SparkConf is purely local; it does not touch YARN.
      val conf =
        new SparkConf()
          .setAppName("Test")

      // Always fail so the exit code can be observed.
      throw new RuntimeException("Some Exception")

      println("all done!") // unreachable: the throw above always fires
    } catch {
      case e: RuntimeException =>
        println("There is an exception in the script exiting with status 1")
        System.exit(1)
    }
  }
}

When I run this code using spark-submit I am expecting to get an exit code of 1;
however, I keep getting exit code 0.

Any ideas how I can force spark-submit to return with exit code 1?

Thanks,
Shashank

Re: Spark submit on yarn does not return with exit code 1 on exception

Posted by Shashank Mandil <ma...@gmail.com>.
I may have found my problem. We have a Scala wrapper on top of spark-submit
that runs the shell command from Scala, and that wrapper was swallowing the
exit code from spark-submit.
When I stripped the wrapper away and looked at the actual exit code, I got 1.
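
For anyone else hitting this, a rough sketch of what such a wrapper should do,
using scala.sys.process (the spark-submit arguments below are hypothetical):

import scala.sys.process._

object SubmitWrapper {
  def main(args: Array[String]): Unit = {
    // Hypothetical invocation; substitute the real master/class/jar.
    val cmd = Seq("spark-submit", "--master", "yarn", "--class", "Test", "test.jar")

    // `!` runs the command and returns its exit status.
    val status = cmd.!

    // Propagate the child's exit code instead of swallowing it,
    // so callers (the shell, a scheduler, etc.) can see failures.
    sys.exit(status)
  }
}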

So I think spark-submit is behaving okay in my case.

Thank you for all the help.

Thanks,
Shashank

On Fri, Feb 3, 2017 at 1:56 PM, Jacek Laskowski <ja...@japila.pl> wrote:

> Hi,
>
> ➜  spark git:(master) ✗ ./bin/spark-submit whatever || echo $?
> Error: Cannot load main class from JAR file:/Users/jacek/dev/oss/spark/whatever
> Run with --help for usage help or --verbose for debug output
> 1
>
> I see exit code 1, and there are other error cases that return 1 too.
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Fri, Feb 3, 2017 at 10:46 PM, Ali Gouta <al...@gmail.com> wrote:
> > Hello,
> >
> > +1, I have exactly the same issue. I need the exit code so that Oozie can
> > decide which action to execute next. spark-submit always returns 0 even when
> > the application catches the exception and exits with status 1. From Spark 1.5
> > to 1.6.x I still see the same issue... It would be great to fix it, or to know
> > if there is a workaround.
> >
> > Ali Gouta.
> >
> > On 3 Feb 2017 at 22:24, "Jacek Laskowski" <ja...@japila.pl> wrote:
> >
> > Hi,
> >
> > An interesting case. You don't use Spark resources whatsoever.
> > Creating a SparkConf does not use YARN...yet. I think any run mode
> > would have the same effect. So, although spark-submit could have
> > returned exit code 1, the use case touches Spark very little.
> >
> > What version is that? Do you see "There is an exception in the script
> > exiting with status 1" printed out to stdout?
> >
> > Regards,
> > Jacek Laskowski
> > ----
> > https://medium.com/@jaceklaskowski/
> > Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
> > Follow me at https://twitter.com/jaceklaskowski
> >
> >
> > On Fri, Feb 3, 2017 at 8:06 PM, Shashank Mandil
> > <ma...@gmail.com> wrote:
> >> Hi All,
> >>
> >> I wrote a test script which always throws an exception as below :
> >>
> >> object Test {
> >>
> >>
> >>   def main(args: Array[String]) {
> >>     try {
> >>
> >>       val conf =
> >>         new SparkConf()
> >>           .setAppName("Test")
> >>
> >>       throw new RuntimeException("Some Exception")
> >>
> >>       println("all done!")
> >>     } catch {
> >>       case e: RuntimeException => {
> >>         println("There is an exception in the script exiting with status 1")
> >>         System.exit(1)
> >>       }
> >>     }
> >> }
> >>
> >> When I run this code using spark-submit I am expecting to get an exit code
> >> of 1,
> >> however I keep getting exit code 0.
> >>
> >> Any ideas how I can force spark-submit to return with code 1 ?
> >>
> >> Thanks,
> >> Shashank
> >>
> >>
> >>
> >

Re: Spark submit on yarn does not return with exit code 1 on exception

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

➜  spark git:(master) ✗ ./bin/spark-submit whatever || echo $?
Error: Cannot load main class from JAR file:/Users/jacek/dev/oss/spark/whatever
Run with --help for usage help or --verbose for debug output
1

I see exit code 1, and there are other error cases that return 1 too.

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Feb 3, 2017 at 10:46 PM, Ali Gouta <al...@gmail.com> wrote:
> Hello,
>
> +1, I have exactly the same issue. I need the exit code so that Oozie can
> decide which action to execute next. spark-submit always returns 0 even when
> the application catches the exception and exits with status 1. From Spark 1.5
> to 1.6.x I still see the same issue... It would be great to fix it, or to know
> if there is a workaround.
>
> Ali Gouta.
>
> On 3 Feb 2017 at 22:24, "Jacek Laskowski" <ja...@japila.pl> wrote:
>
> Hi,
>
> An interesting case. You don't use Spark resources whatsoever.
> Creating a SparkConf does not use YARN...yet. I think any run mode
> would have the same effect. So, although spark-submit could have
> returned exit code 1, the use case touches Spark very little.
>
> What version is that? Do you see "There is an exception in the script
> exiting with status 1" printed out to stdout?
>
> Regards,
> Jacek Laskowski
> ----
> https://medium.com/@jaceklaskowski/
> Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
> Follow me at https://twitter.com/jaceklaskowski
>
>
> On Fri, Feb 3, 2017 at 8:06 PM, Shashank Mandil
> <ma...@gmail.com> wrote:
>> Hi All,
>>
>> I wrote a test script which always throws an exception as below :
>>
>> object Test {
>>
>>
>>   def main(args: Array[String]) {
>>     try {
>>
>>       val conf =
>>         new SparkConf()
>>           .setAppName("Test")
>>
>>       throw new RuntimeException("Some Exception")
>>
>>       println("all done!")
>>     } catch {
>>       case e: RuntimeException => {
>>         println("There is an exception in the script exiting with status 1")
>>         System.exit(1)
>>       }
>>     }
>> }
>>
>> When I run this code using spark-submit I am expecting to get an exit code
>> of 1,
>> however I keep getting exit code 0.
>>
>> Any ideas how I can force spark-submit to return with code 1 ?
>>
>> Thanks,
>> Shashank
>>
>>
>>
>



Re: Spark submit on yarn does not return with exit code 1 on exception

Posted by Ali Gouta <al...@gmail.com>.
Hello,

+1, I have exactly the same issue. I need the exit code so that Oozie can
decide which action to execute next. spark-submit always returns 0 even when
the application catches the exception and exits with status 1. From Spark 1.5
to 1.6.x I still see the same issue... It would be great to fix it, or to know
if there is a workaround.
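
In the meantime, a quick way to check what spark-submit actually returns, as a
small Scala script (the submit arguments below are hypothetical placeholders):

import scala.sys.process._

// Run spark-submit and print its exit status; `!` returns the status code.
val status = Seq("spark-submit", "--master", "yarn", "--class", "Test", "test.jar").!
println(s"spark-submit exited with status $status")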

Ali Gouta.

On 3 Feb 2017 at 22:24, "Jacek Laskowski" <ja...@japila.pl> wrote:

Hi,

An interesting case. You don't use Spark resources whatsoever.
Creating a SparkConf does not use YARN...yet. I think any run mode
would have the same effect. So, although spark-submit could have
returned exit code 1, the use case touches Spark very little.

What version is that? Do you see "There is an exception in the script
exiting with status 1" printed out to stdout?

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Feb 3, 2017 at 8:06 PM, Shashank Mandil
<ma...@gmail.com> wrote:
> Hi All,
>
> I wrote a test script which always throws an exception as below :
>
> object Test {
>
>
>   def main(args: Array[String]) {
>     try {
>
>       val conf =
>         new SparkConf()
>           .setAppName("Test")
>
>       throw new RuntimeException("Some Exception")
>
>       println("all done!")
>     } catch {
>       case e: RuntimeException => {
>         println("There is an exception in the script exiting with status 1")
>         System.exit(1)
>       }
>     }
> }
>
> When I run this code using spark-submit I am expecting to get an exit code
> of 1,
> however I keep getting exit code 0.
>
> Any ideas how I can force spark-submit to return with code 1 ?
>
> Thanks,
> Shashank
>
>
>


Re: Spark submit on yarn does not return with exit code 1 on exception

Posted by Jacek Laskowski <ja...@japila.pl>.
Hi,

An interesting case. You don't use Spark resources whatsoever.
Creating a SparkConf does not use YARN...yet. I think any run mode
would have the same effect. So, although spark-submit could have
returned exit code 1, the use case touches Spark very little.
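
To make that concrete, a minimal sketch (assuming yarn mode; nothing before
the SparkContext line contacts the cluster):

import org.apache.spark.{SparkConf, SparkContext}

object ConfOnly {
  def main(args: Array[String]): Unit = {
    // A SparkConf is just a local bag of settings; no YARN interaction yet.
    val conf = new SparkConf().setAppName("Test")

    // Only here does the application talk to the cluster manager
    // (YARN in yarn mode) and request resources.
    val sc = new SparkContext(conf)

    sc.stop()
  }
}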

What version is that? Do you see "There is an exception in the script
exiting with status 1" printed out to stdout?

Regards,
Jacek Laskowski
----
https://medium.com/@jaceklaskowski/
Mastering Apache Spark 2.0 https://bit.ly/mastering-apache-spark
Follow me at https://twitter.com/jaceklaskowski


On Fri, Feb 3, 2017 at 8:06 PM, Shashank Mandil
<ma...@gmail.com> wrote:
> Hi All,
>
> I wrote a test script which always throws an exception as below :
>
> object Test {
>
>
>   def main(args: Array[String]) {
>     try {
>
>       val conf =
>         new SparkConf()
>           .setAppName("Test")
>
>       throw new RuntimeException("Some Exception")
>
>       println("all done!")
>     } catch {
>       case e: RuntimeException => {
>         println("There is an exception in the script exiting with status 1")
>         System.exit(1)
>       }
>     }
> }
>
> When I run this code using spark-submit I am expecting to get an exit code
> of 1,
> however I keep getting exit code 0.
>
> Any ideas how I can force spark-submit to return with code 1 ?
>
> Thanks,
> Shashank
>
>
>
