Posted to issues@spark.apache.org by "Imran Rashid (JIRA)" <ji...@apache.org> on 2015/08/26 16:59:45 UTC
[jira] [Updated] (SPARK-10248) DAGSchedulerSuite should check there were no errors in EventProcessLoop
[ https://issues.apache.org/jira/browse/SPARK-10248?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Imran Rashid updated SPARK-10248:
---------------------------------
Description:
If an exception is thrown inside {{DAGSchedulerEventProcessLoop}}, it is just logged, so it's hard to check for it directly in tests. (In fact, the scheduler isn't even stopped, because the tests don't use the {{DAGScheduler}} that is known by the {{SparkContext}}.)
We should update the test framework so we can check if there is an error in the event loop.
was:
If an exception is thrown inside {{DAGSchedulerEventProcessLoop}}, it is just logged, so it's hard to check for it directly in tests. (In fact, the scheduler isn't even stopped, because the tests don't use the {{DAGScheduler}} that is known by the {{SparkContext}}.)
We should update the test framework so we can check if there is an error in the event loop.
> DAGSchedulerSuite should check there were no errors in EventProcessLoop
> -----------------------------------------------------------------------
>
> Key: SPARK-10248
> URL: https://issues.apache.org/jira/browse/SPARK-10248
> Project: Spark
> Issue Type: Test
> Components: Spark Core
> Affects Versions: 1.5.0
> Reporter: Imran Rashid
>
> If an exception is thrown inside {{DAGSchedulerEventProcessLoop}}, it is just logged, so it's hard to check for it directly in tests. (In fact, the scheduler isn't even stopped, because the tests don't use the {{DAGScheduler}} that is known by the {{SparkContext}}.)
> We should update the test framework so we can check if there is an error in the event loop.
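One way the test framework could expose event-loop errors is to record every exception the error hook sees, instead of only logging it, and let the test assert at teardown that the list is empty. The sketch below is not Spark's actual API: {{RecordingEventLoop}} and {{DummyLoop}} are hypothetical names, and the real {{DAGSchedulerEventProcessLoop}} dispatches events on a dedicated thread, which this synchronous sketch deliberately omits to keep the assertion deterministic.

```scala
import scala.collection.mutable.ArrayBuffer
import scala.util.control.NonFatal

// Hypothetical sketch: an event loop that records handler exceptions
// in a buffer so a test can check them, rather than only logging them.
abstract class RecordingEventLoop[E] {
  val errors = ArrayBuffer.empty[Throwable]

  // Deliver an event to the handler; capture any non-fatal failure.
  def post(event: E): Unit =
    try onReceive(event)
    catch { case NonFatal(e) => errors += e }

  protected def onReceive(event: E): Unit
}

// Example handler that fails on one particular event.
class DummyLoop extends RecordingEventLoop[String] {
  override protected def onReceive(event: String): Unit =
    if (event == "bad") throw new IllegalStateException("failure in event loop")
}

object Demo extends App {
  val loop = new DummyLoop
  loop.post("ok")
  loop.post("bad")
  // A test's afterEach could simply assert(loop.errors.isEmpty) to fail
  // any test whose event handling threw.
  assert(loop.errors.length == 1)
  assert(loop.errors.head.getMessage == "failure in event loop")
}
```

In a suite, the check would live in the common teardown so every test in {{DAGSchedulerSuite}} gets it for free.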
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org