Posted to user@spark.apache.org by yaoxin <ya...@gmail.com> on 2014/06/03 15:52:16 UTC

Prepare Spark executor

Hi

Is there any way to prepare a Spark executor? Like what we do in MapReduce,
where we implement setup and cleanup methods.

In my case, I need this prepare method to initialize StaticParser based on the
environment (dev or production). Then I can use StaticParser directly on the
executors, like this:

    object StaticParser {
        def init(env: Env): Unit = ...    // set up internal state for dev/production
        def parse(record: String): Result = ...
    }

    // wished-for API: register a method that runs once on every executor
    sc.setPrepareExecutorMethod(() => StaticParser.init(dev))

    rdd.map(StaticParser.parse)
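
Since no such hook exists, a rough sketch of a workaround (my own, with a
placeholder parse body and an assumed `rdd`): make init idempotent and call it
from inside the task, so the expensive part runs at most once per executor JVM:

    object StaticParser {
      @volatile private var initialized = false

      // Idempotent, thread-safe init: the body runs at most once per JVM,
      // i.e. at most once per executor process.
      def init(env: String): Unit = synchronized {
        if (!initialized) {
          // ... expensive setup based on env ("dev" / "production") ...
          initialized = true
        }
      }

      def parse(line: String): String = line.trim  // placeholder parse logic
    }

    // On the driver: env is a plain local value, captured by the closure
    // below and shipped to the executors with each task.
    val env = "dev"
    rdd.map { line =>
      StaticParser.init(env)   // no-op after the first call on each executor
      StaticParser.parse(line)
    }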

So far I have found two ways:
    1. Prepare the executors before any actions are taken. The drawback is that
if new executors are added later, subsequent jobs will fail.
            sc.parallelize(1 to 10000, 10000).foreach(_ => setupExecutor())

    2. Change StaticParser to a class, create an instance on the driver, and
let Spark ship it inside the closure to the executors.
            val shippedParser = new StaticParser(dev)  // init in the constructor
            rdd.map(shippedParser.parse)
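
A third option worth considering (my own sketch, not from the post): run the
setup through mapPartitions, so initialization happens wherever the task
actually runs. Executors added later are covered automatically, because each
new task re-checks the init; this assumes `StaticParser.init` is idempotent
and `env` is a value captured from the driver:

            val env = "dev"
            rdd.mapPartitions { iter =>
              StaticParser.init(env)      // once per task/partition, cheap after the first call
              iter.map(StaticParser.parse)
            }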






--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Prepare-spark-executor-tp6804.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.