Posted to issues@spark.apache.org by "Liang-Chi Hsieh (JIRA)" <ji...@apache.org> on 2016/01/25 08:17:39 UTC
[jira] [Comment Edited] (SPARK-4878) driverPropsFetcher causes spurious Akka disassociate errors
[ https://issues.apache.org/jira/browse/SPARK-4878?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15114861#comment-15114861 ]
Liang-Chi Hsieh edited comment on SPARK-4878 at 1/25/16 7:17 AM:
-----------------------------------------------------------------
I think it is still alive and used. The above code sends the {{RetrieveSparkProps}} message to {{CoarseGrainedSchedulerBackend.DriverEndpoint}} and receives the Spark properties back.
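For reference, the fetch sequence being described looks roughly like the following in {{CoarseGrainedExecutorBackend.run}} (a from-memory sketch against the 1.x RPC API, with argument lists abbreviated; not verbatim source):

```scala
// Stand up a short-lived RpcEnv solely to ask the driver for its config.
val fetcher = RpcEnv.create("driverPropsFetcher", hostname, port, executorConf,
  new SecurityManager(executorConf), clientMode = true)
val driver = fetcher.setupEndpointRefByURI(driverUrl)
// Synchronous ask: send RetrieveSparkProps, block for the Seq of properties.
val props = driver.askWithRetry[Seq[(String, String)]](RetrieveSparkProps)
// The immediate shutdown here is what produces the spurious
// disassociate / dead-letter noise this ticket is about.
fetcher.shutdown()
```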
was (Author: viirya):
I think it is still alive and used. The above code sends the message {code}RetrieveSparkProps{code} to {code}CoarseGrainedSchedulerBackend.DriverEndpoint{code} and receives spark properties back.
> driverPropsFetcher causes spurious Akka disassociate errors
> -----------------------------------------------------------
>
> Key: SPARK-4878
> URL: https://issues.apache.org/jira/browse/SPARK-4878
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.2.0
> Reporter: Stephen Haberman
> Priority: Minor
>
> The dedicated Akka system for fetching driver properties seems fine, but it leads to very misleading "AssociationHandle$Disassociated", dead-letter, etc. messages that can lead the user to believe something is wrong with the cluster.
> (E.g. personally I thought it was a Spark -rc1/-rc2 bug and spent a while poking around until I saw in the code that driverPropsFetcher is purposefully/immediately shut down.)
> Is there any way to cleanly shut down that initial Akka system so that the driver doesn't log these errors?
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org