Posted to dev@spark.apache.org by Niranda Perera <ni...@gmail.com> on 2015/03/03 11:43:13 UTC

Deploying master and worker programmatically in Java

Hi,

I want to start a Spark standalone cluster programmatically in Java.

I have been looking at these classes:
- org.apache.spark.deploy.master.Master
- org.apache.spark.deploy.worker.Worker

I successfully started a master with this simple main method:

import org.apache.spark.SparkConf;
import org.apache.spark.deploy.master.Master;

public static void main(String[] args) {
    SparkConf conf = new SparkConf();
    // host, port, web UI port, SparkConf
    Master.startSystemAndActor("localhost", 4500, 8080, conf);
}
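
For reference, to keep the JVM alive after that call I follow what
Master.main does and wait on the returned Akka actor system (a sketch;
it assumes the first element of the returned tuple is the ActorSystem):

    // instead of ignoring the return value above:
    Master.startSystemAndActor("localhost", 4500, 8080, conf)
            ._1()                  // the Akka ActorSystem
            .awaitTermination();   // keeps the master process running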


But I'm finding it hard to take the same approach for the worker.

Can anyone give an example of how to pass a value for the workerNumber
parameter of the Worker.startSystemAndActor method from Java?
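
For concreteness, here is roughly the call I am trying to write, based
on my reading of the signature, i.e. (host, port, webUiPort, cores,
memory, masterUrls, workDir, workerNumber, conf); the port, core, and
memory values below are just placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.deploy.worker.Worker;
import scala.Option;

public class StartWorker {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf();
        // workerNumber is a Scala Option[Int]; from Java it is seen as
        // Option<Object>, so presumably None is Option.<Object>empty()
        // and Some(1) would be Option.<Object>apply(1).
        Worker.startSystemAndActor(
                "localhost",                               // host
                4501,                                      // port (placeholder)
                8081,                                      // web UI port (placeholder)
                2,                                         // cores
                1024,                                      // memory in MB
                new String[] { "spark://localhost:4500" }, // master URL(s)
                null,                                      // workDir; null for the default
                Option.<Object>empty(),                    // workerNumber = None
                conf);
    }
}

Is Option.<Object>empty() the right way to get the None default from
Java, or is there a cleaner approach?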

Cheers
-- 
Niranda