Posted to user@spark.apache.org by aditya barve <ad...@gmail.com> on 2016/09/07 06:42:58 UTC

Spark Streaming is not working with SparkLauncher

Hello Team,

I am new to Spark. I created a sample application and submitted it to
Spark:

D:\spark-1.6.2-bin-hadoop2.6\bin\spark-submit --class "main.java.SimpleApp"
--master local[4] target/simple-project-1.0-jar-with-dependencies.jar
server1 1234

It worked fine. Here server1 is my netcat server and 1234 is the port. I
am able to receive data from the netcat server and process it further.
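
For context, the streaming part of SimpleApp is roughly the following.
This is only a minimal sketch of a socket-listening app in the Spark 1.6
Java API, not my exact code:

  import org.apache.spark.SparkConf;
  import org.apache.spark.streaming.Durations;
  import org.apache.spark.streaming.api.java.JavaDStream;
  import org.apache.spark.streaming.api.java.JavaStreamingContext;

  public class SimpleApp {
    public static void main(String[] args) throws Exception {
      // Master is supplied by spark-submit / SparkLauncher, not hard-coded.
      SparkConf conf = new SparkConf().setAppName("SimpleApp");
      JavaStreamingContext jssc =
          new JavaStreamingContext(conf, Durations.seconds(1));

      // args[0] = netcat host (server1), args[1] = port (1234)
      JavaDStream<String> lines =
          jssc.socketTextStream(args[0], Integer.parseInt(args[1]));
      lines.print();   // placeholder for the real processing

      jssc.start();
      jssc.awaitTermination();
    }
  }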

Now I want to look at the deployment side of Spark, since I can't submit
the jar from the command line every time. After a little research I found
the SparkLauncher class and tried the following code:


  import org.apache.spark.launcher.SparkLauncher;

  public class AppLauncher {
    public static void main(String[] args) throws Exception {
      Process spark = new SparkLauncher()
          .setAppResource("D:\\aditya_barve\\SimpleApp\\target\\simple-project-1.0-jar-with-dependencies.jar")
          .setSparkHome("D:\\spark-1.6.2-bin-hadoop2.6")
          .setMainClass("main.java.SimpleApp")
          .setMaster("local")   // note: the spark-submit run above used local[4]
          .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
          .addAppArgs("server1", "1234")   // netcat host and port
          .launch();

      System.out.println("Waiting for finish...");
      int exitCode = spark.waitFor();
      System.out.println("Finished! Exit code: " + exitCode);
      // Use the handle API to monitor / control the application.
    }
  }

The application starts successfully, but it is not receiving streaming
data from the netcat server. Everything works perfectly when I submit the
jar file manually from the command line.
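
In case it is relevant: the snippet above never reads the launched
process's stdout/stderr, and a java.lang.Process can block once the OS
pipe buffer fills up. A minimal sketch of draining both streams (the
drain helper is only illustrative, not part of my current code):

  import java.io.BufferedReader;
  import java.io.IOException;
  import java.io.InputStream;
  import java.io.InputStreamReader;
  import java.io.PrintStream;

  // Copy a child process stream to the parent console on a background
  // thread so the child cannot block on a full pipe buffer.
  static void drain(final InputStream in, final PrintStream out) {
    new Thread(new Runnable() {
      public void run() {
        try (BufferedReader reader =
                 new BufferedReader(new InputStreamReader(in))) {
          String line;
          while ((line = reader.readLine()) != null) {
            out.println(line);
          }
        } catch (IOException e) {
          e.printStackTrace();
        }
      }
    }).start();
  }

  // Usage, right after launch():
  //   drain(spark.getInputStream(), System.out);   // child's stdout
  //   drain(spark.getErrorStream(), System.err);   // child's stderr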

It looks like I am missing something.
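
For reference, the handle API mentioned in the code comment is
SparkLauncher.startApplication(), available from Spark 1.6; it returns a
SparkAppHandle for monitoring the application instead of a raw Process.
A minimal sketch reusing the same paths and arguments (the local[4]
master simply mirrors the working spark-submit run):

  import org.apache.spark.launcher.SparkAppHandle;
  import org.apache.spark.launcher.SparkLauncher;

  public class AppLauncherWithHandle {
    public static void main(String[] args) throws Exception {
      SparkAppHandle handle = new SparkLauncher()
          .setAppResource("D:\\aditya_barve\\SimpleApp\\target\\simple-project-1.0-jar-with-dependencies.jar")
          .setSparkHome("D:\\spark-1.6.2-bin-hadoop2.6")
          .setMainClass("main.java.SimpleApp")
          .setMaster("local[4]")   // same master as the working spark-submit run
          .setConf(SparkLauncher.DRIVER_MEMORY, "2g")
          .addAppArgs("server1", "1234")
          .startApplication(new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle h) {
              System.out.println("State: " + h.getState());
            }
            @Override
            public void infoChanged(SparkAppHandle h) {
              // application info (e.g. the app id) was updated
            }
          });

      // Poll until the application reaches a terminal state.
      while (!handle.getState().isFinal()) {
        Thread.sleep(1000);
      }
      System.out.println("Final state: " + handle.getState());
    }
  }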

Any help is highly appreciated.

Thanks,
--
Aditya