Posted to user@spark.apache.org by Tomoya Igarashi <to...@gmail.com> on 2014/12/15 03:12:17 UTC

Run Spark job on Playframework + Spark Master/Worker in one Mac

Hi all,

I am trying to run a Spark job on Playframework + Spark Master/Worker on one
Mac.
When the job ran, I encountered java.lang.ClassNotFoundException.
Could you tell me how to solve it?

Here is my code on GitHub.
https://github.com/TomoyaIgarashi/spark_cluster_sample

* Environments:
Mac 10.9.5
Java 1.7.0_71
Playframework 2.2.3
Spark 1.1.1

* Setup history:
> cd ~
> git clone git@github.com:apache/spark.git
> cd spark
> git checkout -b v1.1.1 v1.1.1
> sbt/sbt assembly
> vi ~/.bashrc
export SPARK_HOME=/Users/tomoya/spark
> . ~/.bashrc
> hostname
Tomoya-Igarashis-MacBook-Air.local
> vi $SPARK_HOME/conf/slaves
Tomoya-Igarashis-MacBook-Air.local
> play new spark_cluster_sample
default name
type -> scala

* Run history:
> $SPARK_HOME/sbin/start-all.sh
> jps
> which play
/Users/tomoya/play/play
> git clone https://github.com/TomoyaIgarashi/spark_cluster_sample
> cd spark_cluster_sample
> play run

* Error trace:
Here is the error trace in a Gist.
https://gist.github.com/TomoyaIgarashi/4bd45ab3685a532f5511
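For context, a common cause of this ClassNotFoundException in standalone mode is that the application's classes are never shipped to the executors: the driver's classpath is not visible to the workers. A minimal sketch of registering the application jar up front (the master URL and jar path below are illustrative, not taken from the thread):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: in standalone mode, executors do not share the driver's
// classpath, so the jar containing the job's classes must be shipped
// explicitly. The jar path here is an assumed sbt output location.
val conf = new SparkConf()
  .setMaster("spark://Tomoya-Igarashis-MacBook-Air.local:7077")
  .setAppName("spark_cluster_sample")
  .setJars(Seq("target/scala-2.10/spark_cluster_sample_2.10-1.0.jar"))

val sc = new SparkContext(conf)
```

This is configuration only; whether it applies depends on how the Play application packages its classes.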

Regards

Re: Run Spark job on Playframework + Spark Master/Worker in one Mac

Posted by Tomoya Igarashi <to...@gmail.com>.
Thanks for the response.

Yes, I am using standalone mode.

I couldn't find any errors, but "WARN" messages appear in the Spark master
logs.
Here are the Spark master logs.
https://gist.github.com/TomoyaIgarashi/72145c11d3769c7d1ddb

FYI:
Here are the Spark worker logs.
https://gist.github.com/TomoyaIgarashi/0db77e93cacb4a93aa1f
Here are the Playframework logs.
https://gist.github.com/TomoyaIgarashi/9688bdd5663af95ddd4d

If you have any comments, please let me know.

Regards



Re: Run Spark job on Playframework + Spark Master/Worker in one Mac

Posted by Aniket Bhatnagar <an...@gmail.com>.
It seems you are using standalone mode. Can you check the Spark worker logs or
the application logs in the Spark work directory for any errors?
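The per-application logs mentioned above live under the worker's work directory. A sketch of where to look (paths assume the default layout of a standalone worker; the app directory name is illustrative):

```shell
# Each application submitted to a standalone worker gets a directory
# under $SPARK_HOME/work, with per-executor subdirectories holding
# stdout and stderr.
ls "$SPARK_HOME/work/"
# e.g. app-20141216090000-0000/

# Dump every executor's stderr for all applications:
cat "$SPARK_HOME"/work/app-*/*/stderr
```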


Fwd: Run Spark job on Playframework + Spark Master/Worker in one Mac

Posted by Tomoya Igarashi <to...@gmail.com>.
Hi Aniket,
Thanks for your reply.

I followed your advice and modified my code.
Here is the latest version.
https://github.com/TomoyaIgarashi/spark_cluster_sample/commit/ce7613c42d3adbe6ae44e264c11f3829460f3c35

As a result, it works correctly! Thank you very much.

But an "AssociationError" message appears at line 397 in the Playframework
logs, as follows.
https://gist.github.com/TomoyaIgarashi/9688bdd5663af95ddd4d

Is there any problem?



Re: Run Spark job on Playframework + Spark Master/Worker in one Mac

Posted by Aniket Bhatnagar <an...@gmail.com>.
Try the workaround (addClassPathJars(sparkContext,
this.getClass.getClassLoader)) discussed in
http://mail-archives.apache.org/mod_mbox/spark-user/201412.mbox/%3CCAJOb8buD1B6tUtOfG8_Ok7F95C3=r-ZqgFfOQsqBJDXd427Vvw@mail.gmail.com%3E

Thanks,
Aniket
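The thread names the helper addClassPathJars but does not show its body, and it is not a Spark API. A plausible reconstruction is a method that walks the classloader chain, collects every jar it can see, and registers each one with the SparkContext via sc.addJar (a real Spark method); the helper itself is illustrative:

```scala
import java.net.URLClassLoader

// Collect every jar URL visible from a classloader and its parents.
// The resulting paths can then be registered one by one with
// SparkContext#addJar so that executors can load the driver's classes.
def classPathJars(cl: ClassLoader): Seq[String] = cl match {
  case null => Seq.empty
  case u: URLClassLoader =>
    u.getURLs.map(_.toString).filter(_.endsWith(".jar")) ++
      classPathJars(u.getParent)
  case other => classPathJars(other.getParent)
}

// In the driver, the workaround would then amount to:
//   classPathJars(this.getClass.getClassLoader).foreach(sc.addJar)
```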
