Posted to issues@beam.apache.org by "Jozef Vilcek (JIRA)" <ji...@apache.org> on 2019/01/22 21:10:00 UTC
[jira] [Created] (BEAM-6484) Spark's pipeline result throws exceptions
Jozef Vilcek created BEAM-6484:
----------------------------------
Summary: Spark's pipeline result throws exceptions
Key: BEAM-6484
URL: https://issues.apache.org/jira/browse/BEAM-6484
Project: Beam
Issue Type: Bug
Components: runner-spark
Reporter: Jozef Vilcek
Assignee: Amit Sela
Spark's pipeline result can throw exceptions from its `waitUntilFinish()` methods.
[https://github.com/apache/beam/blob/master/runners/spark/src/main/java/org/apache/beam/runners/spark/SparkPipelineResult.java#L101]
The documentation on PipelineResult does not mention this possibility. As a result, when I run pipelines from Scio, it swallows such exceptions because it does not expect them:
[https://github.com/spotify/scio/issues/1617]
Should the Spark runner just log those exceptions and update its state, or is the documentation for `PipelineResult.waitUntilFinish()` out of date?
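For context, until the contract is clarified, callers can guard against this behavior by wrapping `waitUntilFinish()` and mapping an escaped exception to a failed state. The sketch below uses a hypothetical stand-in interface instead of Beam's real `PipelineResult` (the names `Result`, `waitSafely`, and the `State` enum are illustrative, not Beam API); only the try/catch pattern is the point.

```java
import java.util.concurrent.atomic.AtomicReference;

// Hypothetical stand-in for Beam's PipelineResult, for illustration only.
interface Result {
    enum State { RUNNING, DONE, FAILED }
    State waitUntilFinish(); // SparkPipelineResult may throw from here (BEAM-6484)
}

public class WaitDefensively {
    // Wraps waitUntilFinish so the caller always receives a terminal State,
    // even when the runner surfaces the failure as an exception. The cause
    // is captured so it can be logged or rethrown later.
    static Result.State waitSafely(Result result, AtomicReference<Throwable> cause) {
        try {
            return result.waitUntilFinish();
        } catch (RuntimeException e) {
            cause.set(e);
            return Result.State.FAILED;
        }
    }

    public static void main(String[] args) {
        Result failing = () -> { throw new RuntimeException("job failed on Spark"); };
        AtomicReference<Throwable> cause = new AtomicReference<>();
        Result.State state = waitSafely(failing, cause);
        System.out.println(state + ": " + cause.get().getMessage());
    }
}
```

This is roughly what a caller like Scio would have to do today to avoid losing the failure; if the runner instead only logged the exception and set its own state, the wrapper would be unnecessary.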
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)