Posted to issues@beam.apache.org by "Beam JIRA Bot (Jira)" <ji...@apache.org> on 2020/08/10 17:07:17 UTC
[jira] [Commented] (BEAM-8983) Spark uber jar job server: query exceptions from master
[ https://issues.apache.org/jira/browse/BEAM-8983?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17174572#comment-17174572 ]
Beam JIRA Bot commented on BEAM-8983:
-------------------------------------
This issue is P2 but has been unassigned without any comment for 60 days so it has been labeled "stale-P2". If this issue is still affecting you, we care! Please comment and remove the label. Otherwise, in 14 days the issue will be moved to P3.
Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed explanation of what these priorities mean.
> Spark uber jar job server: query exceptions from master
> -------------------------------------------------------
>
> Key: BEAM-8983
> URL: https://issues.apache.org/jira/browse/BEAM-8983
> Project: Beam
> Issue Type: Improvement
> Components: runner-spark
> Reporter: Kyle Weaver
> Priority: P2
> Labels: portability-spark, stale-P2
>
> As far as I know, the Spark REST API does not return exceptions from the cluster after a jar is actually run. While these exceptions can be viewed in Spark's web UI, ideally they would also be visible in Beam's output.
> To do this, we will need to find a REST endpoint that does return those exceptions, and then map the submissionId to its corresponding job id.
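One candidate is Spark's monitoring REST API (served by the application UI, typically on port 4040, or by the history server, typically on port 18080), whose stage listing includes a `failureReason` field for failed stages. The sketch below is a minimal illustration of that approach, not Beam's implementation: the base URL and application id are assumed inputs, and the submissionId-to-appId mapping the description asks for is left unsolved.

```python
import json
import urllib.request


def failed_stage_reasons(stages):
    """Collect failure reasons from a list of stage dicts, in the shape
    returned by Spark's monitoring REST API endpoint
    /api/v1/applications/{app-id}/stages."""
    return [
        s.get("failureReason", "unknown failure")
        for s in stages
        if s.get("status") == "FAILED"
    ]


def fetch_failed_stage_reasons(base_url, app_id):
    """Query a Spark UI or history server for an application's stages and
    return the failure reasons of any FAILED stages.

    base_url is assumed to point at a live Spark UI or history server,
    e.g. "http://localhost:18080"."""
    url = f"{base_url}/api/v1/applications/{app_id}/stages"
    with urllib.request.urlopen(url) as resp:
        stages = json.load(resp)
    return failed_stage_reasons(stages)
```

This only surfaces stage-level failure reasons after the application id is known; wiring it into the job server would still require resolving the submissionId returned by the submission endpoint to the application id the monitoring API expects.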
--
This message was sent by Atlassian Jira
(v8.3.4#803005)