Posted to issues@spark.apache.org by "Scott Taylor (JIRA)" <ji...@apache.org> on 2015/05/19 17:25:00 UTC
[jira] [Created] (SPARK-7735) Raise Exception on non-zero exit from pyspark pipe commands
Scott Taylor created SPARK-7735:
-----------------------------------
Summary: Raise Exception on non-zero exit from pyspark pipe commands
Key: SPARK-7735
URL: https://issues.apache.org/jira/browse/SPARK-7735
Project: Spark
Issue Type: Bug
Components: PySpark
Affects Versions: 1.3.1, 1.3.0
Reporter: Scott Taylor
Priority: Minor
In PySpark, errors are ignored when using the rdd.pipe function. This differs from the Scala behaviour, where an abnormal exit of the piped command raises an exception. I have submitted a pull request on GitHub which I believe brings the PySpark behaviour closer to the Scala behaviour.
A simple case where this bug is problematic is using a network-backed shell utility to perform computations on an RDD. Currently, network errors are ignored and blank results returned, when it would be more desirable to raise an exception so that Spark can retry the failed task.
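The failure mode can be reproduced outside Spark with a plain subprocess call: a command that exits non-zero still yields whatever it managed to write to stdout, so the failure passes silently unless the exit status is checked. Below is a minimal sketch of the kind of check being proposed; the function name and error message are illustrative, not Spark's actual internals:

```python
import subprocess

def pipe_lines(lines, command):
    """Pipe lines through a shell command; raise if it exits non-zero."""
    proc = subprocess.Popen(
        command, shell=True,
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
    )
    out, _ = proc.communicate("\n".join(lines) + "\n")
    result = out.splitlines()
    # Without this check, a failed command silently yields blank or
    # partial output -- the behaviour this issue describes.
    if proc.returncode != 0:
        raise RuntimeError(
            "Pipe command %r exited with code %d" % (command, proc.returncode)
        )
    return result

# A succeeding command returns its output...
print(pipe_lines(["a", "b"], "cat"))  # → ['a', 'b']

# ...while a failing one now raises instead of returning blank results.
try:
    pipe_lines(["a", "b"], "false")
except RuntimeError as e:
    print("raised:", e)
```

With the check in place, a transient failure (e.g. a network error in the piped utility) surfaces as a task failure that Spark's scheduler can retry, rather than being silently recorded as an empty partition.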
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)