Posted to issues@spark.apache.org by "Henry Yu (JIRA)" <ji...@apache.org> on 2019/06/10 05:48:00 UTC
[jira] [Commented] (SPARK-27697) KubernetesClientApplication always exits with 0
[ https://issues.apache.org/jira/browse/SPARK-27697?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16859717#comment-16859717 ]
Henry Yu commented on SPARK-27697:
----------------------------------
[~dongjoon] I fixed it by adding a pod phase check: if the driver pod exits in any phase other than Succeeded, I throw a SparkException in org.apache.spark.deploy.k8s.submit.LoggingPodStatusWatcherImpl#awaitCompletion.
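A minimal sketch of the idea described above, in plain Java (this is not the actual Spark patch; the class, enum, and method names here are hypothetical, and the real fix throws a SparkException from awaitCompletion rather than returning an exit code directly): map the driver pod's terminal phase to the submission process exit status, so that only a Succeeded pod yields exit code 0.

```java
// Hypothetical illustration of the pod-phase check (not Spark source code).
public class DriverPhaseCheck {

    // Kubernetes pod phases relevant when the driver pod terminates.
    public enum PodPhase { SUCCEEDED, FAILED, UNKNOWN }

    // Exit code the submission process should report:
    // 0 only when the driver pod reached the Succeeded phase.
    public static int exitCodeFor(PodPhase phase) {
        if (phase == PodPhase.SUCCEEDED) {
            return 0;
        }
        // In the fix described above, this branch is where awaitCompletion
        // would throw a SparkException instead of completing normally.
        return 1;
    }

    public static void main(String[] args) {
        System.out.println(exitCodeFor(PodPhase.FAILED));
        System.out.println(exitCodeFor(PodPhase.SUCCEEDED));
    }
}
```

With this check in place, a workflow engine that inspects the spark-submit exit code can distinguish a failed driver pod from a successful one, which is the behavior the issue asks for.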
> KubernetesClientApplication always exits with 0
> ---------------------------------------------
>
> Key: SPARK-27697
> URL: https://issues.apache.org/jira/browse/SPARK-27697
> Project: Spark
> Issue Type: Bug
> Components: Kubernetes
> Affects Versions: 2.4.0
> Reporter: Henry Yu
> Priority: Minor
>
> When submitting a Spark job to Kubernetes, workflows try to determine job status from the submission process exit code.
> yarnClient throws a SparkException when the application fails, but the Kubernetes client always exits with 0.
> I have fixed this in our in-house Spark version. I can make a PR for this issue.
>
>
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org