Posted to user@spark.apache.org by Arko Provo Mukherjee <ar...@gmail.com> on 2016/02/20 02:56:34 UTC
Submitting Jobs Programmatically
Hello,
I am trying to submit a Spark job programmatically.
When I run it, I receive the following error:
Exception in thread "Thread-1" java.lang.NoClassDefFoundError:
org/apache/spark/launcher/SparkLauncher
at Spark.SparkConnector.run(MySpark.scala:33)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException:
org.apache.spark.launcher.SparkLauncher
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 2 more
It seems it cannot find the SparkLauncher class. Any clue as to what I am
doing wrong?
Thanks & regards
Arko
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org
RE: Submitting Jobs Programmatically
Posted by Patrick Mi <pa...@touchpointgroup.com>.
Hi there,
I had a similar problem in Java with a standalone cluster on Linux, but got
it working by passing the following option:
-Dspark.jars=file:/path/to/sparkapp.jar
where sparkapp.jar contains the launch application.
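For context, an option passed that way is just a JVM system property on the process that embeds the launcher. A minimal JDK-only sketch (the property value is a placeholder path) of how to confirm the flag actually reached the JVM:

```java
public class JarsPropCheck {
    public static void main(String[] args) {
        // When started with -Dspark.jars=file:/path/to/sparkapp.jar, the
        // value is visible here; null means the flag never reached this JVM.
        String jars = System.getProperty("spark.jars");
        System.out.println(jars == null ? "spark.jars not set" : "spark.jars = " + jars);
    }
}
```

Running it with and without the -D flag shows whether any wrapper script is swallowing the option.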
Hope that helps.
Regards,
Patrick
-----Original Message-----
From: Arko Provo Mukherjee [mailto:arkoprovomukherjee@gmail.com]
Sent: Saturday, 20 February 2016 4:27 p.m.
To: Ted Yu
Cc: Holden Karau; user
Subject: Re: Submitting Jobs Programmatically
Re: Submitting Jobs Programmatically
Posted by Arko Provo Mukherjee <ar...@gmail.com>.
Hello,
Thanks much. I could start the service.
When I run my program, the launcher is not able to find the app class:
java.lang.ClassNotFoundException: SparkSubmitter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:274)
at org.apache.spark.util.Utils$.classForName(Utils.scala:173)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:639)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Spark job complete. Exit code: 101
My launch code is as follows:
val spark = new SparkLauncher()
  .setSparkHome("C:\\spark-1.5.1-bin-hadoop2.6")
  .setAppResource("C:\\SparkService\\Scala\\RequestSubmitter\\target\\scala-2.10\\spark-submitter_2.10-0.0.1.jar")
  .setMainClass("SparkSubmitter")
  .addAppArgs(inputQuery)
  .setMaster("spark://157.54.189.70:7077")
  .launch()
spark.waitFor()
I added spark-submitter_2.10-0.0.1.jar to the classpath as well,
but that didn't help.
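One possible cause worth ruling out: spark-submit resolves the main class with Class.forName, which needs the fully qualified name, so if SparkSubmitter was compiled inside a package, setMainClass("SparkSubmitter") would fail with exactly this trace. A JDK-only sketch of the difference (java.lang.String stands in for the app class, which isn't available here):

```java
public class MainClassNameCheck {
    public static void main(String[] args) throws Exception {
        try {
            // an unqualified name fails, just as "SparkSubmitter" would
            // if the class actually lives in a package
            Class.forName("String");
        } catch (ClassNotFoundException e) {
            System.out.println("unqualified name not found");
        }
        // the fully qualified name resolves
        System.out.println("found " + Class.forName("java.lang.String").getName());
    }
}
```

Listing the jar's entries (for example with jar tf on the app jar) shows the path of the compiled class file; that path, with dots for slashes, is what setMainClass needs.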
Thanks & regards
Arko
On Fri, Feb 19, 2016 at 6:49 PM, Ted Yu <yu...@gmail.com> wrote:
> Cycling old bits:
>
> http://search-hadoop.com/m/q3RTtHrxMj2abwOk2
Re: Submitting Jobs Programmatically
Posted by Ted Yu <yu...@gmail.com>.
Cycling old bits:
http://search-hadoop.com/m/q3RTtHrxMj2abwOk2
On Fri, Feb 19, 2016 at 6:40 PM, Arko Provo Mukherjee <
arkoprovomukherjee@gmail.com> wrote:
> Hi,
>
> Thanks for your response. Is there a similar link for Windows? I am
> not sure the .sh scripts would run on windows.
Re: Submitting Jobs Programmatically
Posted by Arko Provo Mukherjee <ar...@gmail.com>.
Hi,
Thanks for your response. Is there a similar link for Windows? I am
not sure the .sh scripts would run on Windows.
By default, start-all.sh doesn't work, and I don't see anything at
localhost:8080.
I will do some more investigation and come back.
Thanks again for all your help!
Thanks & regards
Arko
On Fri, Feb 19, 2016 at 6:35 PM, Ted Yu <yu...@gmail.com> wrote:
> Please see https://spark.apache.org/docs/latest/spark-standalone.html
Re: Submitting Jobs Programmatically
Posted by Ted Yu <yu...@gmail.com>.
Please see https://spark.apache.org/docs/latest/spark-standalone.html
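For Windows, where the sbin .sh scripts don't apply, the "starting a cluster manually" section of that page amounts to launching the master and worker classes directly through spark-class (spark-class.cmd in the bin listing elsewhere in this thread). Roughly, with host and port as placeholders:

```shell
# from SPARK_HOME; on Windows use bin\spark-class.cmd instead
./bin/spark-class org.apache.spark.deploy.master.Master
# in a second terminal, point a worker at the URL the master prints
./bin/spark-class org.apache.spark.deploy.worker.Worker spark://YOUR_HOST:7077
```

Once both are up, the master's web UI at localhost:8080 should show the worker, and that spark:// URL is what setMaster expects.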
On Fri, Feb 19, 2016 at 6:27 PM, Arko Provo Mukherjee <
arkoprovomukherjee@gmail.com> wrote:
> Hi,
>
> Thanks for your response, that really helped.
Re: Submitting Jobs Programmatically
Posted by Arko Provo Mukherjee <ar...@gmail.com>.
Hi,
Thanks for your response, that really helped.
However, I don't believe the job is actually being submitted. When I run
Spark from the shell, I don't need to start anything up explicitly. Do I
need to start up Spark on my machine before running this program?
I see the following in the SPARK_HOME\bin directory:
Name
----
beeline.cmd
load-spark-env.cmd
pyspark.cmd
pyspark2.cmd
run-example.cmd
run-example2.cmd
spark-class.cmd
spark-class2.cmd
spark-shell.cmd
spark-shell2.cmd
spark-submit.cmd
spark-submit2.cmd
sparkR.cmd
sparkR2.cmd
Do I need to run any one of them before submitting the job via the program?
Thanks & regards
Arko
On Fri, Feb 19, 2016 at 6:01 PM, Holden Karau <ho...@pigscanfly.ca> wrote:
> How are you trying to launch your application? Do you have the Spark jars on
> your class path?
Re: Submitting Jobs Programmatically
Posted by Holden Karau <ho...@pigscanfly.ca>.
How are you trying to launch your application? Do you have the Spark jars
on your class path?
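That question pinpoints it: a NoClassDefFoundError at MySpark.scala:33 means the spark-launcher jar is missing from the classpath of the submitting program itself; the cluster's jars are irrelevant at that point. A minimal JDK-only probe, as a sketch:

```java
public class LauncherOnClasspath {
    public static void main(String[] args) {
        try {
            // resolves only if the spark-launcher jar is on this JVM's classpath
            Class.forName("org.apache.spark.launcher.SparkLauncher");
            System.out.println("spark-launcher is on the classpath");
        } catch (ClassNotFoundException e) {
            System.out.println("spark-launcher is missing");
        }
    }
}
```

If it reports missing, adding the launcher artifact for the matching Spark version to the build (for the Spark 1.5.1 install mentioned in this thread, org.apache.spark:spark-launcher_2.10:1.5.1) should make the error go away.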
On Friday, February 19, 2016, Arko Provo Mukherjee <
arkoprovomukherjee@gmail.com> wrote:
> Hello,
>
> I am trying to submit a spark job via a program.
--
Cell : 425-233-8271
Twitter: https://twitter.com/holdenkarau