Posted to user@spark.apache.org by boci <bo...@gmail.com> on 2014/07/19 15:30:49 UTC

Uber jar with SBT

Hi Guys,

I'm trying to create a Spark uber jar with sbt, but I'm running into a lot of
problems. I want to use the following:
- Spark Streaming
- Kafka
- Elasticsearch
- HBase

The current jar is about 60 MB and it's not working:
- When I deploy with spark-submit, it runs and exits without any error.
- When I try to start it in local[*] mode, it says:
  Exception in thread "main" java.lang.NoClassDefFoundError:
  org/apache/spark/Logging
  => even though I start it with
  java -cp /.../spark-assembly-1.0.1-hadoop2.2.0.jar -jar my.jar

Any idea how to solve this? (Which libraries need to be marked as provided and
which are required at runtime? Later I want to run this jar on a YARN cluster.)
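
For reference, here is a minimal build.sbt sketch of the usual split (the
sbt-assembly wiring, artifact names, and versions below are assumptions for
illustration, not taken from this thread): the Spark artifacts are marked
provided, because the cluster already ships them in the Spark assembly jar,
while everything the cluster does not provide gets bundled into the uber jar.

// build.sbt: sketch only; assumes an sbt-assembly 0.11.x-era setup, e.g.
// project/plugins.sbt: addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
import AssemblyKeys._

assemblySettings

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Spark is provided by the cluster, so keep it out of the uber jar:
  "org.apache.spark" %% "spark-core"      % "1.0.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.0.1" % "provided",
  // Libraries the cluster does not ship stay in compile scope and get bundled
  // (artifact names and versions here are placeholders):
  "org.apache.spark" %% "spark-streaming-kafka" % "1.0.1",
  "org.elasticsearch" % "elasticsearch"          % "1.1.1",
  "org.apache.hbase"  % "hbase-client"           % "0.98.3-hadoop2"
)

// Duplicate META-INF entries are a common reason the assembled jar breaks:
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old => {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x                             => old(x)
}}

Running "sbt assembly" then produces the uber jar; at runtime the Spark classes
still have to come from spark-submit (or from the Spark assembly jar on the
classpath), since they are no longer inside your jar.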

b0c1
----------------------------------------------------------------------------------------------------------------------------------
Skype: boci13, Hangout: boci.boci@gmail.com

Re: Uber jar with SBT

Posted by Tathagata Das <ta...@gmail.com>.
Just to confirm: are you interested in submitting the Spark job inside the
cluster in Spark standalone mode (that is, with one of the workers running the
driver)? spark-submit does not support that fully yet. You can probably use the
instructions from the Spark 0.9.1 docs to do that.

Regarding spark-submit's behavior: that is expected. spark-submit waits for the
driver program to terminate.

TD


On Sat, Jul 19, 2014 at 7:34 AM, boci <bo...@gmail.com> wrote:

> Hi!
>
> I'm using Java 7, and I found the problem: I wasn't calling start() and
> awaitTermination() on the streaming context. Now it works, BUT
> spark-submit never returns (it runs in the foreground and keeps receiving
> the Kafka streams)... what am I missing?
> (I want to send the job to a standalone cluster worker process.)
>
> b0c1
>
>
> ----------------------------------------------------------------------------------------------------------------------------------
> Skype: boci13, Hangout: boci.boci@gmail.com
>
>
> On Sat, Jul 19, 2014 at 3:32 PM, Sean Owen <so...@cloudera.com> wrote:
>
>> Are you building / running with Java 6? I imagine your .jar file has
>> more than 65536 entries, and Java 6 has various issues with jars this
>> large. If possible, use Java 7 everywhere.
>>
>> https://issues.apache.org/jira/browse/SPARK-1520
>>
>> On Sat, Jul 19, 2014 at 2:30 PM, boci <bo...@gmail.com> wrote:
>> > Hi Guys,
>> >
>> > I'm trying to create a Spark uber jar with sbt, but I'm running into a
>> > lot of problems. I want to use the following:
>> > - Spark Streaming
>> > - Kafka
>> > - Elasticsearch
>> > - HBase
>> >
>> > The current jar is about 60 MB and it's not working:
>> > - When I deploy with spark-submit, it runs and exits without any error.
>> > - When I try to start it in local[*] mode, it says:
>> >   Exception in thread "main" java.lang.NoClassDefFoundError:
>> > org/apache/spark/Logging
>> > => even though I start it with
>> > java -cp /.../spark-assembly-1.0.1-hadoop2.2.0.jar -jar my.jar
>> >
>> > Any idea how to solve this? (Which libraries need to be marked as
>> > provided and which are required at runtime? Later I want to run this
>> > jar on a YARN cluster.)
>> >
>> > b0c1
>> > ----------------------------------------------------------------------------------------------------------------------------------
>> > Skype: boci13, Hangout: boci.boci@gmail.com
>>
>
>

Re: Uber jar with SBT

Posted by boci <bo...@gmail.com>.
Hi!

I'm using Java 7, and I found the problem: I wasn't calling start() and
awaitTermination() on the streaming context. Now it works, BUT spark-submit
never returns (it runs in the foreground and keeps receiving the Kafka
streams)... what am I missing?
(I want to send the job to a standalone cluster worker process.)
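
For anyone else hitting this, here is a minimal sketch of the driver skeleton
(the object name, topic, group, and ZooKeeper address are placeholders, not
from this thread): nothing runs until start() is called, and awaitTermination()
then blocks the driver, which is also why spark-submit in client mode stays in
the foreground until the streaming job is stopped.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Hypothetical driver, only to show where start()/awaitTermination() belong.
object MyStreamingJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("my-streaming-job")
    val ssc  = new StreamingContext(conf, Seconds(10))

    // Placeholder Kafka receiver: (zkQuorum, groupId, topic -> partitions)
    val stream = KafkaUtils.createStream(ssc, "zk-host:2181", "my-group", Map("my-topic" -> 1))
    stream.map(_._2).print()

    ssc.start()             // nothing is processed until the context is started
    ssc.awaitTermination()  // blocks here; the driver (and spark-submit) stays up
  }
}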

b0c1

----------------------------------------------------------------------------------------------------------------------------------
Skype: boci13, Hangout: boci.boci@gmail.com


On Sat, Jul 19, 2014 at 3:32 PM, Sean Owen <so...@cloudera.com> wrote:

> Are you building / running with Java 6? I imagine your .jar file has
> more than 65536 entries, and Java 6 has various issues with jars this
> large. If possible, use Java 7 everywhere.
>
> https://issues.apache.org/jira/browse/SPARK-1520
>
> On Sat, Jul 19, 2014 at 2:30 PM, boci <bo...@gmail.com> wrote:
> > Hi Guys,
> >
> > I'm trying to create a Spark uber jar with sbt, but I'm running into a
> > lot of problems. I want to use the following:
> > - Spark Streaming
> > - Kafka
> > - Elasticsearch
> > - HBase
> >
> > The current jar is about 60 MB and it's not working:
> > - When I deploy with spark-submit, it runs and exits without any error.
> > - When I try to start it in local[*] mode, it says:
> >   Exception in thread "main" java.lang.NoClassDefFoundError:
> > org/apache/spark/Logging
> > => even though I start it with
> > java -cp /.../spark-assembly-1.0.1-hadoop2.2.0.jar -jar my.jar
> >
> > Any idea how to solve this? (Which libraries need to be marked as
> > provided and which are required at runtime? Later I want to run this
> > jar on a YARN cluster.)
> >
> > b0c1
> > ----------------------------------------------------------------------------------------------------------------------------------
> > Skype: boci13, Hangout: boci.boci@gmail.com
>

Re: Uber jar with SBT

Posted by Sean Owen <so...@cloudera.com>.
Are you building / running with Java 6? I imagine your .jar file has
more than 65536 entries, and Java 6 has various issues with jars this
large. If possible, use Java 7 everywhere.

https://issues.apache.org/jira/browse/SPARK-1520
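
A quick way to check is to count the entries in the assembled jar;
java.util.zip.ZipFile reports the entry count directly (sketch only, the jar
path is a placeholder):

import java.util.zip.ZipFile

object CountJarEntries {
  def main(args: Array[String]): Unit = {
    val jar = new ZipFile("target/scala-2.10/my-assembly.jar") // placeholder path
    println("entries: " + jar.size())  // Java 6 zip handling misbehaves past 65536 entries
    jar.close()
  }
}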

On Sat, Jul 19, 2014 at 2:30 PM, boci <bo...@gmail.com> wrote:
> Hi Guys,
>
> I'm trying to create a Spark uber jar with sbt, but I'm running into a
> lot of problems. I want to use the following:
> - Spark Streaming
> - Kafka
> - Elasticsearch
> - HBase
>
> The current jar is about 60 MB and it's not working:
> - When I deploy with spark-submit, it runs and exits without any error.
> - When I try to start it in local[*] mode, it says:
>   Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/spark/Logging
> => even though I start it with
> java -cp /.../spark-assembly-1.0.1-hadoop2.2.0.jar -jar my.jar
>
> Any idea how to solve this? (Which libraries need to be marked as
> provided and which are required at runtime? Later I want to run this
> jar on a YARN cluster.)
>
> b0c1
> ----------------------------------------------------------------------------------------------------------------------------------
> Skype: boci13, Hangout: boci.boci@gmail.com