Posted to user@spark.apache.org by LinQili <li...@outlook.com> on 2014/12/05 09:55:37 UTC

Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode

Hi, all:
According to https://github.com/apache/spark/pull/2732, when a Spark job fails or exits nonzero in yarn-cluster mode, spark-submit should get the corresponding return code of the job. But when I tried on a Spark 1.1.1 Yarn cluster, spark-submit returned zero anyway.
Here is my spark code:
    try {
      val dropTable = s"drop table $DB.$tableName"
      hiveContext.hql(dropTable)
      val createTbl =  do some thing...
      hiveContext.hql(createTbl)
    } catch {
      case ex: Exception => {
        Util.printLog("ERROR", s"create db error.")
        exit(-1)
      }
    }
Maybe I did something wrong. Is there any hint? Thanks.
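A side note on the `exit(-1)` above: Unix exit statuses are only eight bits wide, so even in client mode a driver that calls `exit(-1)` is observed by its parent as status 255, never -1. A quick way to see this, with a Python child process standing in for the driver JVM:

```python
import subprocess
import sys

# Child process standing in for a driver that calls exit(-1); the OS keeps
# only the low 8 bits of the status, so the parent observes 255.
proc = subprocess.run([sys.executable, "-c", "import sys; sys.exit(-1)"])
print(proc.returncode)
```

On a POSIX system this prints 255.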

Re: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode

Posted by Shixiong Zhu <zs...@gmail.com>.
> There were two `exit` calls in this code. If the args were wrong,
> spark-submit got return code 101; but if the args were correct,
> spark-submit could not get the second return code, 100. What's the
> difference between these two `exit` calls? I was so confused.

I'm also confused. When I tried your code, spark-submit returned 1 in both
cases. That's expected: in yarn-cluster mode, the driver runs in the
ApplicationMaster, so the exit code of the driver is also the exit code of
the ApplicationMaster. However, for now, Spark cannot get the exit code of
the ApplicationMaster from Yarn, because Yarn does not send it back to the
client. spark-submit just returns 1 whenever Yarn reports that the
ApplicationMaster failed.
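Since spark-submit in yarn-cluster mode only surfaces a generic failure status, a wrapper around it can treat the exit code as pass/fail but cannot recover the driver's own code. A minimal sketch of such a wrapper (the `sh -c 'exit 1'` child is a stand-in for the real spark-submit command line):

```python
import subprocess

# Stand-in for the real spark-submit invocation; on Spark 1.1.x in
# yarn-cluster mode, a failed ApplicationMaster surfaces here only as a
# generic exit code 1, not as the driver's own exit code.
proc = subprocess.run(["sh", "-c", "exit 1"])

if proc.returncode != 0:
    # All we can conclude is that Yarn reported the application as failed.
    print("job failed, spark-submit exit code:", proc.returncode)
else:
    print("job succeeded")
```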

Best Regards,
Shixiong Zhu

2014-12-06 1:59 GMT+08:00 LinQili <li...@outlook.com>:

> You mean the localhost:4040 or the application master web ui?
>
> Sent from my iPhone
>
> On Dec 5, 2014, at 17:26, Shixiong Zhu <zs...@gmail.com> wrote:
>
> What's the status of this application in the yarn web UI?
>
> Best Regards,
> Shixiong Zhu
>
> 2014-12-05 17:22 GMT+08:00 LinQili <li...@outlook.com>:
>
>> I tried another test code:
>>  def main(args: Array[String]) {
>>     if (args.length != 1) {
>>       Util.printLog("ERROR", "Args error - arg1: BASE_DIR")
>>       exit(101)
>>     }
>>     val currentFile = args(0).toString
>>     val DB = "test_spark"
>>     val tableName = "src"
>>
>>     val sparkConf = new SparkConf().setAppName(s"HiveFromSpark")
>>     val sc = new SparkContext(sparkConf)
>>     val hiveContext = new HiveContext(sc)
>>
>>     // Before exit
>>     Util.printLog("INFO", "Exit")
>>     exit(100)
>> }
>>
>> There were two `exit` calls in this code. If the args were wrong,
>> spark-submit got return code 101; but if the args were correct,
>> spark-submit could not get the second return code, 100. What's the
>> difference between these two `exit` calls? I was so confused.
>>
>> ------------------------------
>> From: lin_qili@outlook.com
>> To: user@spark.incubator.apache.org
>> Subject: RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit
>> in yarn-cluster mode
>> Date: Fri, 5 Dec 2014 17:11:39 +0800
>>
>>
>> I tried spark client mode: spark-submit got the correct return code
>> from the spark job. But in yarn-cluster mode, it failed.
>>
>> ------------------------------
>> From: lin_qili@outlook.com
>> To: user@spark.incubator.apache.org
>> Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in
>> yarn-cluster mode
>> Date: Fri, 5 Dec 2014 16:55:37 +0800
>>
>> Hi, all:
>>
>> According to https://github.com/apache/spark/pull/2732, when a Spark job
>> fails or exits nonzero in yarn-cluster mode, spark-submit should get the
>> corresponding return code of the job. But when I tried on a Spark 1.1.1
>> Yarn cluster, spark-submit returned zero anyway.
>>
>> Here is my spark code:
>>
>>     try {
>>       val dropTable = s"drop table $DB.$tableName"
>>       hiveContext.hql(dropTable)
>>       val createTbl =  do some thing...
>>       hiveContext.hql(createTbl)
>>     } catch {
>>       case ex: Exception => {
>>         Util.printLog("ERROR", s"create db error.")
>>         exit(-1)
>>       }
>>     }
>>
>> Maybe I did something wrong. Is there any hint? Thanks.
>>
>
>

Re: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode

Posted by Shixiong Zhu <zs...@gmail.com>.
What's the status of this application in the yarn web UI?

Best Regards,
Shixiong Zhu

2014-12-05 17:22 GMT+08:00 LinQili <li...@outlook.com>:

> I tried another test code:
>  def main(args: Array[String]) {
>     if (args.length != 1) {
>       Util.printLog("ERROR", "Args error - arg1: BASE_DIR")
>       exit(101)
>     }
>     val currentFile = args(0).toString
>     val DB = "test_spark"
>     val tableName = "src"
>
>     val sparkConf = new SparkConf().setAppName(s"HiveFromSpark")
>     val sc = new SparkContext(sparkConf)
>     val hiveContext = new HiveContext(sc)
>
>     // Before exit
>     Util.printLog("INFO", "Exit")
>     exit(100)
> }
>
> There were two `exit` calls in this code. If the args were wrong,
> spark-submit got return code 101; but if the args were correct,
> spark-submit could not get the second return code, 100. What's the
> difference between these two `exit` calls? I was so confused.
>
> ------------------------------
> From: lin_qili@outlook.com
> To: user@spark.incubator.apache.org
> Subject: RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit
> in yarn-cluster mode
> Date: Fri, 5 Dec 2014 17:11:39 +0800
>
>
> I tried spark client mode: spark-submit got the correct return code
> from the spark job. But in yarn-cluster mode, it failed.
>
> ------------------------------
> From: lin_qili@outlook.com
> To: user@spark.incubator.apache.org
> Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in
> yarn-cluster mode
> Date: Fri, 5 Dec 2014 16:55:37 +0800
>
> Hi, all:
>
> According to https://github.com/apache/spark/pull/2732, when a Spark job
> fails or exits nonzero in yarn-cluster mode, spark-submit should get the
> corresponding return code of the job. But when I tried on a Spark 1.1.1
> Yarn cluster, spark-submit returned zero anyway.
>
> Here is my spark code:
>
>     try {
>       val dropTable = s"drop table $DB.$tableName"
>       hiveContext.hql(dropTable)
>       val createTbl =  do some thing...
>       hiveContext.hql(createTbl)
>     } catch {
>       case ex: Exception => {
>         Util.printLog("ERROR", s"create db error.")
>         exit(-1)
>       }
>     }
>
> Maybe I did something wrong. Is there any hint? Thanks.
>

RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode

Posted by LinQili <li...@outlook.com>.
I tried another test code:

    def main(args: Array[String]) {
      if (args.length != 1) {
        Util.printLog("ERROR", "Args error - arg1: BASE_DIR")
        exit(101)
      }
      val currentFile = args(0).toString
      val DB = "test_spark"
      val tableName = "src"

      val sparkConf = new SparkConf().setAppName(s"HiveFromSpark")
      val sc = new SparkContext(sparkConf)
      val hiveContext = new HiveContext(sc)

      // Before exit
      Util.printLog("INFO", "Exit")
      exit(100)
    }
There were two `exit` calls in this code. If the args were wrong, spark-submit got return code 101; but if the args were correct, spark-submit could not get the second return code, 100. What's the difference between these two `exit` calls? I was so confused.
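For what it's worth, both `exit` calls behave identically as far as the operating system is concerned: run locally, a process exiting with 101 or 100 reports exactly that code to its parent, so any difference observed through spark-submit must come from how Yarn relays the ApplicationMaster's exit. A quick local check, with Python child processes standing in for the driver:

```python
import subprocess
import sys

# Two children standing in for the two exit() calls in the test program;
# run locally, both codes come back to the parent intact.
codes = [
    subprocess.run([sys.executable, "-c", f"import sys; sys.exit({c})"]).returncode
    for c in (101, 100)
]
print(codes)
```

This prints `[101, 100]`.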
From: lin_qili@outlook.com
To: user@spark.incubator.apache.org
Subject: RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode
Date: Fri, 5 Dec 2014 17:11:39 +0800




I tried spark client mode: spark-submit got the correct return code from the spark job. But in yarn-cluster mode, it failed.

From: lin_qili@outlook.com
To: user@spark.incubator.apache.org
Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode
Date: Fri, 5 Dec 2014 16:55:37 +0800




Hi, all:
According to https://github.com/apache/spark/pull/2732, when a Spark job fails or exits nonzero in yarn-cluster mode, spark-submit should get the corresponding return code of the job. But when I tried on a Spark 1.1.1 Yarn cluster, spark-submit returned zero anyway.
Here is my spark code:
    try {
      val dropTable = s"drop table $DB.$tableName"
      hiveContext.hql(dropTable)
      val createTbl =  do some thing...
      hiveContext.hql(createTbl)
    } catch {
      case ex: Exception => {
        Util.printLog("ERROR", s"create db error.")
        exit(-1)
      }
    }
Maybe I did something wrong. Is there any hint? Thanks.

RE: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode

Posted by LinQili <li...@outlook.com>.
I tried spark client mode: spark-submit got the correct return code from the spark job. But in yarn-cluster mode, it failed.

From: lin_qili@outlook.com
To: user@spark.incubator.apache.org
Subject: Issue on [SPARK-3877][YARN]: Return code of the spark-submit in yarn-cluster mode
Date: Fri, 5 Dec 2014 16:55:37 +0800




Hi, all:
According to https://github.com/apache/spark/pull/2732, when a Spark job fails or exits nonzero in yarn-cluster mode, spark-submit should get the corresponding return code of the job. But when I tried on a Spark 1.1.1 Yarn cluster, spark-submit returned zero anyway.
Here is my spark code:
    try {
      val dropTable = s"drop table $DB.$tableName"
      hiveContext.hql(dropTable)
      val createTbl =  do some thing...
      hiveContext.hql(createTbl)
    } catch {
      case ex: Exception => {
        Util.printLog("ERROR", s"create db error.")
        exit(-1)
      }
    }
Maybe I did something wrong. Is there any hint? Thanks.