Posted to issues@hive.apache.org by "Rui Li (JIRA)" <ji...@apache.org> on 2018/03/12 14:04:00 UTC

[jira] [Commented] (HIVE-18916) SparkClientImpl doesn't error out if spark-submit fails

    [ https://issues.apache.org/jira/browse/HIVE-18916?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16395269#comment-16395269 ] 

Rui Li commented on HIVE-18916:
-------------------------------

[~stakiar], is this issue easy to reproduce? When the thread monitoring {{spark-submit}} sees it return a non-zero exit code, the thread calls {{rpcServer.cancelClient}}, which should cancel the wait for the client to connect rather than letting it run until the timeout.
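To illustrate the mechanism being discussed, here is a minimal, self-contained sketch (not Hive's actual code; {{SubmitMonitor}}, the {{cancelClient}} hook, and the latch are hypothetical stand-ins for the real {{rpcServer.cancelClient}} call and the client-connection wait): a monitor thread that propagates a non-zero exit code by cancelling the pending wait instead of only logging it.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.function.Consumer;

public class SubmitMonitor {
  public static void main(String[] args) throws Exception {
    // Stands in for the RPC server's wait for the remote client to connect.
    CountDownLatch clientConnected = new CountDownLatch(1);

    // Hypothetical cancel hook, analogous in spirit to rpcServer.cancelClient(...).
    Consumer<String> cancelClient = reason -> {
      System.out.println("cancelClient: " + reason);
      clientConnected.countDown(); // wake the waiter instead of letting it time out
    };

    int simulatedExitCode = 1; // pretend spark-submit failed

    // The monitor thread: on a non-zero exit code, cancel the pending client
    // instead of merely logging the code and moving on.
    Thread monitor = new Thread(() -> {
      if (simulatedExitCode != 0) {
        cancelClient.accept("spark-submit exited with code " + simulatedExitCode);
      }
    });
    monitor.start();
    monitor.join();

    // Without the cancel, this wait would run until the connection timeout and
    // surface the misleading "Timed out waiting for client connection" error.
    boolean wokenEarly = clientConnected.await(1, TimeUnit.SECONDS);
    System.out.println("woken early: " + wokenEarly);
  }
}
```

If the cancel path fires as intended, the waiter wakes immediately with the real failure reason; the bug report suggests that in practice the wait is still running to the timeout.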

> SparkClientImpl doesn't error out if spark-submit fails
> -------------------------------------------------------
>
>                 Key: HIVE-18916
>                 URL: https://issues.apache.org/jira/browse/HIVE-18916
>             Project: Hive
>          Issue Type: Sub-task
>          Components: Spark
>            Reporter: Sahil Takiar
>            Priority: Major
>
> If {{spark-submit}} returns a non-zero exit code, {{SparkClientImpl}} will simply log the exit code, but won't throw an error. Eventually, the connection timeout will get triggered and an exception like {{Timed out waiting for client connection}} will be logged, which is pretty misleading.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)