Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/05/07 11:08:59 UTC
[jira] [Commented] (SPARK-7436) Cannot implement nor use custom StandaloneRecoveryModeFactory implementations
[ https://issues.apache.org/jira/browse/SPARK-7436?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14532295#comment-14532295 ]
Apache Spark commented on SPARK-7436:
-------------------------------------
User 'jacek-lewandowski' has created a pull request for this issue:
https://github.com/apache/spark/pull/5975
> Cannot implement nor use custom StandaloneRecoveryModeFactory implementations
> -----------------------------------------------------------------------------
>
> Key: SPARK-7436
> URL: https://issues.apache.org/jira/browse/SPARK-7436
> Project: Spark
> Issue Type: Bug
> Affects Versions: 1.3.1
> Reporter: Jacek Lewandowski
>
> At least, this code fragment is buggy ({{Master.scala}}):
> {code}
> case "CUSTOM" =>
>   val clazz = Class.forName(conf.get("spark.deploy.recoveryMode.factory"))
>   val factory = clazz.getConstructor(conf.getClass, Serialization.getClass)
>     .newInstance(conf, SerializationExtension(context.system))
>     .asInstanceOf[StandaloneRecoveryModeFactory]
>   (factory.createPersistenceEngine(), factory.createLeaderElectionAgent(this))
> {code}
> The problem is in {{val factory = clazz.getConstructor(conf.getClass, Serialization.getClass)}}: it looks up a constructor that accepts an {{org.apache.spark.SparkConf}} and the class of the companion object of {{akka.serialization.Serialization}} ({{Serialization.getClass}} returns the {{Serialization$}} class, not {{classOf[Serialization]}}). The following {{newInstance(conf, SerializationExtension(context.system))}} call then passes an instance of {{SparkConf}} and an instance of the {{Serialization}} class - not the companion object - so the requested constructor does not match the arguments.
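> A minimal sketch of how the lookup could be written so that the requested constructor signature matches the arguments actually passed to {{newInstance}} (this assumes the custom factory exposes a {{(SparkConf, Serialization)}} constructor; the real change for this issue is in the pull request linked above):
> {code}
> case "CUSTOM" =>
>   val clazz = Class.forName(conf.get("spark.deploy.recoveryMode.factory"))
>   // Request the constructor by the classes of the values passed below:
>   // classOf[SparkConf] and classOf[Serialization], not the companion-object classes.
>   val factory = clazz.getConstructor(classOf[SparkConf], classOf[Serialization])
>     .newInstance(conf, SerializationExtension(context.system))
>     .asInstanceOf[StandaloneRecoveryModeFactory]
>   (factory.createPersistenceEngine(), factory.createLeaderElectionAgent(this))
> {code}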
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)