Posted to issues@spark.apache.org by "xinzhang (JIRA)" <ji...@apache.org> on 2017/10/11 08:05:00 UTC
[jira] [Resolved] (SPARK-22244) spark-sql succeeded on YARN but only
completed some of its jobs
[ https://issues.apache.org/jira/browse/SPARK-22244?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
xinzhang resolved SPARK-22244.
------------------------------
Resolution: Not A Problem
Caused by the client session being closed.
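Since the resolution points at the closed client session, one possible workaround is to detach the spark-sql client from the terminal so the driver survives a disconnect. This is only a sketch based on the reporter's original command; the paths, queue expression, jar, and SQL file are theirs, and the pid-file location is an assumption.

```shell
# Minimal sketch, assuming the jobs died because the interactive client
# session closed. nohup detaches the client from the terminal, and the
# trailing & backgrounds it, so a dropped session no longer kills the run.
SQL_FILE=/opt/app/scheduler-tomcat/temp/10945011_spark_dwd_user_url_detail_d5.sql
LOG=/data1/tools/logs/etl_log/2017-10-11/10945011.log

nohup /opt/spark/spark-bin/bin/spark-sql --master yarn --queue "$(id -g -n)" \
  --jars /opt/spark/spark-bin/jars/hive-udf-sw.jar \
  -f "$SQL_FILE" >> "$LOG" 2>&1 &

# Record the client PID (hypothetical location) so the job can be
# monitored or killed deliberately later.
echo $! > /tmp/spark_sql_10945011.pid
```

An alternative with the same effect is to submit the SQL from a scheduler (cron, Oozie, the tomcat scheduler already in the path) rather than an interactive shell, so no terminal session is involved at all.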
> spark-sql succeeded on YARN but only completed some of its jobs
> ----------------------------------------------------------------
>
> Key: SPARK-22244
> URL: https://issues.apache.org/jira/browse/SPARK-22244
> Project: Spark
> Issue Type: Bug
> Components: Spark Core, Spark Shell
> Affects Versions: 2.1.0
> Reporter: xinzhang
>
> Shell : /opt/spark/spark-bin/bin/spark-sql --master yarn --queue `id -g -n` --jars /opt/spark/spark-bin/jars/hive-udf-sw.jar -f /opt/app/scheduler-tomcat/temp/10945011_spark_dwd_user_url_detail_d5.sql >> /data1/tools/logs/etl_log/2017-10-11/10945011.log 2>&1
> Description:
> It is very weird. The screenshots below show the strange phenomenon.
> On YARN, the application's status shows SUCCEEDED:
> !https://user-images.githubusercontent.com/8244097/31427768-0505f3b0-ae9b-11e7-9a52-557b6259e030.png!
> *{color:red}On the Spark History web UI, the application has been moved to the completed list, but in fact it did not complete all of its jobs. The still-active jobs should have completed. The details show:
> {color}*
> !https://user-images.githubusercontent.com/8244097/31427786-1a2c6f6c-ae9b-11e7-8560-555d81271d8b.png!
> !https://user-images.githubusercontent.com/8244097/31428411-33c80056-ae9d-11e7-9a7e-35169d472a86.png!
> The log stopped:
> !https://user-images.githubusercontent.com/8244097/31428025-f319b6f4-ae9b-11e7-8780-e88e75bca14f.png!
> *{color:red}What is the bug? How should I track down the problem? Any suggestions would be helpful.{color}*
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org