Posted to user@spark.apache.org by Nipun Arora <ni...@gmail.com> on 2017/05/17 20:47:03 UTC

Spark Launch programmatically - Basics!

Hi,

I am trying to get a simple Spark application to run programmatically. I
looked at
http://spark.apache.org/docs/2.1.0/api/java/index.html?org/apache/spark/launcher/package-summary.html
and tried the following code.

   public class MyLauncher {
     public static void main(String[] args) throws Exception {
       SparkAppHandle handle = new SparkLauncher()
         .setAppResource("/my/app.jar")
         .setMainClass("my.spark.app.Main")
         .setMaster("local")
         .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
         .startApplication();
       // Use handle API to monitor / control application.
     }
   }


This runs without errors for my application, but I am running Spark in local
mode and the launcher class exits immediately after this call returns. Are we
supposed to wait for the process state, etc.?

Is there a more detailed example of how to monitor input streams, etc.? Any
GitHub link or blog post would help.

Thanks
Nipun

Re: Spark Launch programmatically - Basics!

Posted by vimal dinakaran <vi...@gmail.com>.
We are using the code below for an integration test. You need to wait for
the process state.
.startApplication(
  new Listener {
    override def infoChanged(handle: SparkAppHandle): Unit = {
      println("******* info changed ***** ", handle.getAppId, handle.getState)
    }

    override def stateChanged(handle: SparkAppHandle): Unit = {
      println("*********** state changed *********", handle.getAppId, handle.getState)
    }
  })

    // The initial state is UNKNOWN, so sleep before polling,
    // otherwise the UNKNOWN check in the loop below would exit immediately.
    Thread.sleep(10000)
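
If you want to avoid the fixed sleep, one alternative (just a sketch, not
something from the Spark docs) is to block on a CountDownLatch that the
listener releases once the handle reports its first non-UNKNOWN state. The
jar path and main class below are placeholders taken from your example:

    import java.util.concurrent.{CountDownLatch, TimeUnit}
    import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

    // Sketch: wait for the first non-UNKNOWN state instead of a fixed 10s sleep.
    val firstRealState = new CountDownLatch(1)

    val handle = new SparkLauncher()
      .setAppResource("/my/app.jar")          // placeholder from the original post
      .setMainClass("my.spark.app.Main")      // placeholder
      .setMaster("local")
      .startApplication(new SparkAppHandle.Listener {
        override def stateChanged(h: SparkAppHandle): Unit =
          if (h.getState != SparkAppHandle.State.UNKNOWN) firstRealState.countDown()
        override def infoChanged(h: SparkAppHandle): Unit = ()
      })

    // Block (up to 60s) until the launcher reports CONNECTED, SUBMITTED, RUNNING, ...
    firstRealState.await(60, TimeUnit.SECONDS)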

def waitTillComplete(handler: SparkAppHandle): Unit = {
  while (!handler.getState.isFinal && handler.getState != SparkAppHandle.State.UNKNOWN) {
    println("State :%s".format(handler.getState()))
    Thread.sleep(5000)
  }
}
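
Putting it together, wiring the launcher to waitTillComplete could look
roughly like this sketch (again, the jar path and main class are placeholders
carried over from your example):

    import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

    // Sketch: launch the app, then block until Spark reports a final state.
    val handle: SparkAppHandle = new SparkLauncher()
      .setAppResource("/my/app.jar")          // placeholder path
      .setMainClass("my.spark.app.Main")      // placeholder class
      .setMaster("local")
      .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
      .startApplication()

    Thread.sleep(10000)        // let the state move past UNKNOWN, as above
    waitTillComplete(handle)   // polls every 5s until FINISHED / FAILED / KILLED

    println("Final state: " + handle.getState)

For the input/output stream monitoring part of your question: SparkLauncher
also has redirect methods (redirectOutput, redirectError, redirectToLog), and
with startApplication the child output is, as far as I know, forwarded to the
launcher's own logging by default.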
