Posted to issues@spark.apache.org by "Kousuke Saruta (JIRA)" <ji...@apache.org> on 2014/08/19 13:33:18 UTC
[jira] [Updated] (SPARK-3106) *Race Condition Issue* Fix the order
of closing resources when Connection is closed
[ https://issues.apache.org/jira/browse/SPARK-3106?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Kousuke Saruta updated SPARK-3106:
----------------------------------
Description:
Currently, when we run a Spark application, error messages appear in the driver's log.
The error messages include the following:
* messages caused by ClosedChannelException
* messages caused by CancelledKeyException
* "Corresponding SendingConnectionManagerId not found"
These are mainly caused by a race condition at the time a Connection is closed.
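The issue text does not include the fix itself; as a rough java.nio sketch of the failure mode only (the class and method names below are hypothetical, not from the Spark patch): closing a SelectableChannel implicitly cancels its SelectionKey, so any code that still touches the key afterwards hits CancelledKeyException. This is why the order in which a Connection's resources are closed matters.

```java
import java.nio.channels.*;

public class CloseOrderDemo {
    // Returns the name of the exception seen when the SelectionKey is used
    // after its channel has already been closed, or "ok" if none was thrown.
    static String touchKeyAfterClose() throws Exception {
        Selector selector = Selector.open();
        // A listening socket channel stands in for Spark's Connection channel.
        ServerSocketChannel ch = ServerSocketChannel.open();
        ch.configureBlocking(false);
        SelectionKey key = ch.register(selector, SelectionKey.OP_ACCEPT);

        // Wrong order: closing the channel first implicitly cancels the key...
        ch.close();
        try {
            key.interestOps(0); // ...so touching the key now throws
            return "ok";
        } catch (CancelledKeyException e) {
            return "CancelledKeyException";
        } finally {
            selector.close();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(touchKeyAfterClose());
    }
}
```

A cleanup sequence that cancels the key (and lets the selector finish any in-flight processing) before closing the channel avoids this particular exception; the "Corresponding SendingConnectionManagerId not found" message is a separate symptom of the same race and is not reproduced here.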
was:
Currently, when we run a Spark application, error messages appear in the driver's log.
The error messages include the following:
* messages caused by ClosedChannelException
* messages caused by CancelledKeyException
* "Corresponding SendingConnectionManagerId not found"
These are mainly caused by the race condition issue in
> *Race Condition Issue* Fix the order of closing resources when Connection is closed
> -----------------------------------------------------------------------------------
>
> Key: SPARK-3106
> URL: https://issues.apache.org/jira/browse/SPARK-3106
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.1.0
> Reporter: Kousuke Saruta
>
> Currently, when we run a Spark application, error messages appear in the driver's log.
> The error messages include the following:
> * messages caused by ClosedChannelException
> * messages caused by CancelledKeyException
> * "Corresponding SendingConnectionManagerId not found"
> These are mainly caused by a race condition at the time a Connection is closed.
--
This message was sent by Atlassian JIRA
(v6.2#6252)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org